-
Approximate log determinant of huge matrices
Log determinants frequently need to be computed as part of Bayesian inference in models with Gaussian priors or likelihoods. They can be quite troublesome to compute: done through the Cholesky decomposition, computing a log determinant is an O(n^3) operation. It’s pretty much infeasible to compute the exact log det for arbitrary matrices larger than 10,000 x 10,000. Models with that many
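For reference, the exact route goes through the Cholesky factor rather than det() directly, which also avoids overflow. A minimal sketch (A and logdetA are placeholder names, and A is assumed symmetric positive definite):

% Exact log determinant via the Cholesky factor: A = L*L', so
% log det(A) = 2*sum(log(diag(L))). The chol call is the O(n^3) step.
L = chol(A, 'lower');
logdetA = 2 * sum(log(diag(L)));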
-
Verifying analytical Hessians
When using optimization functions that require Hessians, it’s easy to mess up the math and end up with incorrect second derivatives. While Matlab’s optimization toolbox offers the DerivativeCheck option for checking gradients, it doesn’t work on Hessians. This nice package available on Matlab Central will compute Hessians numerically, so you can easily double-check your
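A rough check is also easy to do by hand with central differences. This is not the Matlab Central package, just a sketch; f, x0 and hessAnalytic are placeholder names for the objective, the evaluation point and your analytical Hessian:

% Central-difference approximation of the Hessian of f at x0,
% compared against the analytical Hessian.
h = 1e-4;
n = numel(x0);
hessNum = zeros(n);
for i = 1:n
    for j = 1:n
        ei = zeros(n, 1); ei(i) = h;
        ej = zeros(n, 1); ej(j) = h;
        hessNum(i, j) = (f(x0 + ei + ej) - f(x0 + ei - ej) ...
                       - f(x0 - ei + ej) + f(x0 - ei - ej)) / (4*h^2);
    end
end
max(abs(hessNum(:) - hessAnalytic(:)))  % should be close to zero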
-
Hamiltonian Monte Carlo
I’ve been getting more into MCMC methodology recently. There’s a paper published this year by Ahmadian, Pillow & Paninski on different efficient MCMC samplers in the context of decoding spike trains with GLMs. The same methods could potentially be used, of course, for other purposes, like tracking receptive fields. Of particular interest is a remarkably
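For the basic flavour of HMC itself (not the samplers from the paper), a bare-bones sketch looks like this; logp and gradLogp are placeholder handles for the log target density and its gradient, and the step size, number of leapfrog steps and 2-D target are arbitrary choices:

% Minimal Hamiltonian Monte Carlo sketch, for illustration only.
nSamples = 1000; nLeapfrog = 20; stepSize = 0.1;
x = zeros(2, 1);                      % initial state (2-D target assumed)
samples = zeros(2, nSamples);
for s = 1:nSamples
    p = randn(size(x));               % resample momentum
    xNew = x; pNew = p;
    % Leapfrog integration of the Hamiltonian dynamics
    pNew = pNew + 0.5*stepSize*gradLogp(xNew);
    for l = 1:nLeapfrog
        xNew = xNew + stepSize*pNew;
        if l < nLeapfrog
            pNew = pNew + stepSize*gradLogp(xNew);
        end
    end
    pNew = pNew + 0.5*stepSize*gradLogp(xNew);
    % Metropolis accept/reject on the joint (position, momentum) energy
    logAccept = (logp(xNew) - 0.5*(pNew'*pNew)) - (logp(x) - 0.5*(p'*p));
    if log(rand) < logAccept
        x = xNew;
    end
    samples(:, s) = x;
end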
-
Removing line noise from LFPs, wideband signals
…Trail of papers had a recent post on standard analyses in neuroscience, which reminded me that I’ve been meaning to post about signal preprocessing for a while now. If there’s too much line noise (electrical noise at multiples of 60 Hz in North America and 50 Hz in Europe) in a signal, it will render the signal unusable.
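One simple recipe (not necessarily the one in the post) is a narrow zero-phase band-stop filter around 60 Hz and its first few harmonics, using the Signal Processing Toolbox; fs is the sampling rate in Hz and x the raw signal, both assumed here:

% Notch out 60 Hz and its first two harmonics with a zero-phase
% Butterworth band-stop filter (assumes fs is well above 2*181 Hz).
y = x;
for f0 = 60:60:180
    [b, a] = butter(2, [f0-1, f0+1] / (fs/2), 'stop');
    y = filtfilt(b, a, y);            % zero-phase filtering, no lag
end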
-
Using the binomial GLM instead of the Poisson for spike data
Cortical spike trains roughly follow Poisson statistics. When it comes to modeling the spike rate as a function of a stimulus (in a receptive field estimation context, for example), it’s thus natural to use the Poisson GLM. In the Poisson GLM with canonical link, the rate is assumed to be generated by a weighted sum
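In Matlab, both models are one glmfit call away (Statistics Toolbox). Here X is the matrix of stimulus covariates, y the column vector of spike counts per bin, and n the maximum possible count per bin; all three are placeholders:

% Poisson GLM with the canonical log link.
wPoisson = glmfit(X, y, 'poisson');
% Binomial GLM: responses are given as a [counts, trials] matrix.
wBinomial = glmfit(X, [y, n*ones(size(y))], 'binomial');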
-
Don’t save anonymous Matlab functions
Bug of the day: if you use the save function inside a function, and one of the things you’re saving is an anonymous function, then all the variables in the scope of the (non-anonymous) function will be saved along with it in the .mat file. So for example: if you look at test.mat you will find that
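A minimal way to see it (the variable names here are made up; only the behaviour is as described above):

function demoSave()
bigMatrix = randn(1000);   % a large variable we have no intention of saving
f = @(x) x.^2;             % anonymous function defined in the same scope
save('test.mat', 'f');     % per the behaviour described above, the enclosing
                           % workspace ends up in test.mat along with f,
end                        % making the file far larger than expected

Checking the damage afterwards with d = dir('test.mat'); d.bytes makes the problem obvious.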