Tag Archives: machine learning

You deserve better than two-sided finite differences

In calc 101, the derivative is defined as $f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$. So, if you want to estimate a derivative, an easy way to do so would be to just pick some small $h$ and estimate: $f'(x) \approx \frac{f(x+h) - f(x)}{h}$. This can work OK. Let’s look at an … Continue reading
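As a quick illustration (my own example, not code from the post), here is how that one-sided estimate compares to the two-sided estimate from the title; the test function and step size are arbitrary:

import numpy as np

def forward_diff(f, x, h=1e-5):
    # One-sided estimate from the definition above: truncation error is O(h).
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-5):
    # Two-sided ("central") estimate from the title: truncation error is O(h^2).
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: the derivative of sin at x = 1 should be cos(1) ≈ 0.5403.
print(forward_diff(np.sin, 1.0), central_diff(np.sin, 1.0), np.cos(1.0))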


Sneaking up on Bayesian Inference (A fable in four acts)

Act 1: Magical Monkeys Two monkeys, Alfred and Betty, live in a parallel universe with two kinds of blocks, green and yellow. Alfred likes green blocks, and Betty prefers the yellow blocks. One day, a Wizard … Continue reading


Favorite things NIPS

I always enjoy reading conference reports, so I thought I’d mention a few papers that caught my eye.  (I welcome any corrections to my summaries of any of these.) 1. Recent Progress in the Structure of Large-Treewidth Graphs and Some … Continue reading


Truncated Bi-Level Optimization

In 2012, I wrote a paper that I probably should have called “truncated bi-level optimization”. I vaguely remember telling the reviewers I would release some code, so I’m finally getting around to it. The idea of bilevel optimization is quite … Continue reading
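As a rough sketch of the setup (my notation here, not necessarily the paper’s): bilevel optimization nests one problem inside another, e.g. picking hyperparameters $\lambda$ to minimize a validation loss evaluated at the training-set minimizer,

$$\min_{\lambda} \; L_{\text{val}}\big(w^*(\lambda)\big) \qquad \text{where} \qquad w^*(\lambda) = \arg\min_{w} L_{\text{train}}(w, \lambda).$$

The “truncated” idea, loosely, is to replace $w^*(\lambda)$ with the result of a fixed, small number of optimization steps on $L_{\text{train}}$ and differentiate through those steps.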


Reducing Sigmoid computations by (at least) 88.0797077977882%

A classic implementation issue in machine learning is reducing the cost of computing the sigmoid function $\sigma(x) = \frac{1}{1 + e^{-x}}$. Specifically, it is common to profile your code and discover that 90% of the time is spent computing the $e^{-x}$ in that function.  This … Continue reading
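As a minimal sketch of one common way to cut down on those exp calls (my own illustration, not necessarily the trick in the post): skip the exp entirely once the sigmoid has effectively saturated.

import numpy as np

def sigmoid_clamped(x, cutoff=30.0):
    # Logistic sigmoid that only calls exp() where it matters.
    # For |x| > 30 the sigmoid is within about 1e-13 of 0 or 1
    # (exp(-30) ~ 9e-14), so those entries are filled in directly.
    # The savings depend entirely on how often your inputs land in
    # the middle region.
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    big, small = x > cutoff, x < -cutoff
    mid = ~(big | small)
    out[big], out[small] = 1.0, 0.0
    out[mid] = 1.0 / (1.0 + np.exp(-x[mid]))
    return out

print(sigmoid_clamped(np.array([-50.0, 0.0, 2.0, 50.0])))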


CRF Toolbox Updated

I updated the code for my Graphical Models / Conditional Random Fields toolbox. This is a Matlab toolbox, though almost all the real work is done in compiled C++ for efficiency. The main improvements are: Lots of bugfixes. Various small … Continue reading


Personal opinions about graphical models 1: The surrogate likelihood exists and you should use it.

When talking about graphical models with people (particularly computer vision folks), I find myself advancing a few opinions over and over again.  So, in an effort to stop bothering people at conferences, I thought I’d write a few entries here. … Continue reading
