
Tag Archives: math
The second and third best features of LyX you aren’t using.
LyX is a WYSIWYG editor for LaTeX files. It's a little clunky at first, and isn't perfect (thank you, open-source developers; I'm not ungrateful!), but after becoming familiar with it, it's probably the single piece of … Continue reading
Reducing Sigmoid computations by (at least) 88.0797077977882%
A classic implementation issue in machine learning is reducing the cost of computing the sigmoid function σ(x) = 1/(1 + exp(−x)). Specifically, it is common to profile your code and discover that 90% of the time is spent computing the exp in that function. This … Continue reading
Posted in Uncategorized
Tagged boltzmann machines, efficiency, machine learning, math, neural networks
9 Comments
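The excerpt above notes that the exponential dominates the cost of a sigmoid. As an illustration only (not necessarily the trick from the post), one standard workaround is a precomputed lookup table with linear interpolation; the table range and size below are my own choices.

```python
import numpy as np

# Sketch: approximate sigmoid with a precomputed table plus linear
# interpolation. Bounds [-8, 8] and table size 2048 are illustrative
# choices, not anything from the original post.
LO, HI, N = -8.0, 8.0, 2048
_xs = np.linspace(LO, HI, N)
_table = 1.0 / (1.0 + np.exp(-_xs))

def fast_sigmoid(x):
    """Table-based sigmoid; saturates outside [LO, HI]."""
    x = np.clip(x, LO, HI)
    # Fractional index into the table.
    t = (x - LO) / (HI - LO) * (N - 1)
    i = np.minimum(t.astype(int), N - 2)
    frac = t - i
    return _table[i] * (1 - frac) + _table[i + 1] * frac

x = np.linspace(-6.0, 6.0, 101)
max_err = np.max(np.abs(fast_sigmoid(x) - 1.0 / (1.0 + np.exp(-x))))
```

With this table resolution the interpolation error is tiny compared to typical learning-rate noise, and the per-call cost is an index computation instead of an exp.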
Quotients
It seems to me that thinking of quotients as a fundamental operator is usually painful and unnecessary when the objects are almost anything other than real (or rational) numbers. Instead, it is better to think of a quotient as a … Continue reading
Matrix Calculus
Based on a lot of requests from students, I did a lecture on matrix calculus in my machine learning class today. This was based on Minka’s Old and New Matrix Algebra Useful for Statistics and Magnus and Neudecker’s Matrix Differential … Continue reading
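As a flavor of what such a lecture covers, here is a numerical check of one standard matrix-calculus identity, d/dX tr(AX) = Aᵀ. The identity is textbook material; the specific finite-difference check is my own illustration, not material from the lecture itself.

```python
import numpy as np

# Sketch: verify the standard identity  d/dX tr(A X) = A^T
# by central finite differences, entry by entry.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 3))

def f(X):
    return np.trace(A @ X)

grad_analytic = A.T

eps = 1e-6
grad_fd = np.zeros_like(X)
for i in range(3):
    for j in range(3):
        E = np.zeros_like(X)
        E[i, j] = eps
        # Central difference in the (i, j) direction.
        grad_fd[i, j] = (f(X + E) - f(X - E)) / (2 * eps)

max_err = np.max(np.abs(grad_fd - grad_analytic))
```

Since tr(AX) is linear in X, the central difference is exact up to rounding, so the two gradients agree to near machine precision.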
Automatic Differentiation Without Compromises
Automatic differentiation is a classic numerical method that takes a program, and (with minimal programmer effort) computes the derivatives of that program. This is very useful because, when optimizing complex functions, a lot of time tends to get spent manually … Continue reading
What Gauss-Seidel is Really Doing
I’ve been reading Alan Sokal’s lecture notes “Monte Carlo Methods in Statistical Mechanics: Foundations and New Algorithms” today. Once I learned to take the word “Hamiltonian” and mentally substitute “function to be minimized”, they are very clearly written. Anyway, the … Continue reading
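For reference, the Gauss-Seidel iteration the excerpt refers to updates one coordinate at a time using the latest values of the others. A minimal sketch for solving Ax = b (the example system and sweep count are my own):

```python
import numpy as np

# Sketch: basic Gauss-Seidel for A x = b. Each sweep updates one coordinate
# at a time; when A is symmetric positive definite this is exact
# coordinate-wise minimization of the quadratic (1/2) x^T A x - b^T x.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])  # symmetric, diagonally dominant
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
for sweep in range(100):
    for i in range(3):
        # Solve the i-th equation for x[i], holding the rest fixed.
        x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]

residual = np.linalg.norm(A @ x - b)
```

Diagonal dominance guarantees convergence here; after a hundred sweeps the residual is at roundoff level.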
A simple explanation of reverse-mode automatic differentiation
My previous rant about automatic differentiation generated several requests for an explanation of how it works. This can be confusing because there are different types of automatic differentiation (forward-mode, reverse-mode, hybrids). This is my attempt to explain the basic idea … Continue reading
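The basic idea behind reverse-mode AD can be shown in a toy sketch (my own illustration, not code from the post): each intermediate value records how to push gradients back to its inputs, and a single reverse pass over the computation accumulates all derivatives.

```python
import math

# Sketch: minimal tape-based reverse-mode AD. Each Var stores
# (parent, local_gradient) pairs; backward() walks the graph in
# reverse topological order, accumulating gradients by the chain rule.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def sin(v):
    return Var(math.sin(v.value), [(v, math.cos(v.value))])

def backward(out):
    # Topological order via depth-first search, then one reverse pass.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, local in v.parents:
            p.grad += local * v.grad

# Usage: f(x, y) = x*y + sin(x), so df/dx = y + cos(x) and df/dy = x.
x, y = Var(0.5), Var(2.0)
f = x * y + sin(x)
backward(f)
```

The key property of reverse mode is that one backward pass yields the gradient with respect to every input at once, which is why it is the right tool when there are many inputs and one scalar output.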