-
Best email ever
I got an email yesterday from a reader of this blog that I can’t help but share. This guy has figured out the secret to life, the universe, and everything. Spoiler alert: genetically engineered demons are out to eat our adrenal glands. “Hello! My name is [J—]. I live here in [city] and do landscaping, …”
-
CSHL projects: variance estimation and N-AFC psychophysics
The CSHL Computational Neuroscience: Vision course is over, and we presented projects using some of the new methods and ideas we explored in class. Earlier in the course, Geoff Boynton gave a presentation on psychophysics and illustrated how to estimate a threshold through maximum likelihood in an N-AFC task, where N >= 2. I was curious to …
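A minimal sketch of what that threshold estimation might look like, assuming a Weibull psychometric function with guess rate 1/N and a fixed slope and lapse rate; the function shape, parameter values, and toy data below are my own illustrative assumptions, not Geoff's:

```python
# Sketch: maximum-likelihood threshold estimation for an N-AFC task,
# assuming a Weibull psychometric function with guess rate 1/N.
import numpy as np
from scipy.optimize import minimize_scalar

def weibull(intensity, threshold, slope=3.0, guess=0.25, lapse=0.01):
    """Probability correct as a function of stimulus intensity."""
    return guess + (1 - guess - lapse) * (1 - np.exp(-(intensity / threshold) ** slope))

def neg_log_likelihood(threshold, intensity, n_correct, n_trials, guess):
    p = weibull(intensity, threshold, guess=guess)
    p = np.clip(p, 1e-6, 1 - 1e-6)
    # Binomial log-likelihood of the observed correct counts at each intensity
    return -np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))

# Toy data for a 4-AFC task (guess rate 1/4)
N = 4
intensity = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n_trials = np.full(5, 40)
n_correct = np.array([12, 16, 25, 36, 40])

res = minimize_scalar(neg_log_likelihood, bounds=(0.1, 10.0), method='bounded',
                      args=(intensity, n_correct, n_trials, 1.0 / N))
print("ML threshold estimate:", res.x)
```

In practice you'd fit slope and lapse jointly rather than fixing them, but a one-parameter fit keeps the idea visible: the chance floor 1/N is baked into the likelihood rather than into the data.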
-
Funes, or parallax
In a CSHL lecture on attention, Marisa Carrasco used the fictional Funes as an illustration of the idea that perception is about prioritizing and throwing data away. Funes the Memorious is a famous short story by Borges about a man with a photographic memory who couldn’t make sense of the world because he couldn’t throw …
-
Adam Kohn on population coding
Adam delivered a pretty intense lecture at CSHL on population coding, correlations, and phase-locking. Consider me mindfucked. Mainen & Sejnowski (1995) showed that single neurons respond very reliably to repeated current injections. Nevertheless, cortical neurons seem to have Poisson or supra-Poisson variability. It’s possible to find a bound on decodability using the Fisher information matrix (Sompolinsky …
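To make the Fisher-information bound concrete, here's a rough sketch that evaluates the linear Fisher information $J = f'(\theta)^\top \Sigma^{-1} f'(\theta)$ for a toy population, once with independent Poisson-like noise and once with limited-range correlations. The tuning curves and correlation structure are my own illustrative assumptions, not numbers from the lecture:

```python
# Sketch: linear Fisher information for a population of tuned neurons,
# with independent vs. limited-range correlated noise (illustrative assumptions).
import numpy as np

n_neurons = 50
theta = np.pi / 4                      # stimulus value at which to evaluate information
prefs = np.linspace(0, np.pi, n_neurons, endpoint=False)

def tuning(theta, prefs, amp=20.0, width=0.3, base=2.0):
    """Von Mises-like orientation tuning curves (mean firing rates)."""
    return base + amp * np.exp((np.cos(2 * (theta - prefs)) - 1) / width)

# Numerical derivative of the tuning curves with respect to the stimulus
d = 1e-4
f_prime = (tuning(theta + d, prefs) - tuning(theta - d, prefs)) / (2 * d)
rates = tuning(theta, prefs)

# Poisson-like variances with limited-range correlations between similarly tuned neurons
c = 0.2 * np.exp(-np.abs(prefs[:, None] - prefs[None, :]) / 0.5)
np.fill_diagonal(c, 1.0)
sigma = np.sqrt(rates)[:, None] * np.sqrt(rates)[None, :] * c

# Linear Fisher information: J = f'(theta)^T Sigma^{-1} f'(theta)
J_corr = f_prime @ np.linalg.solve(sigma, f_prime)
J_indep = np.sum(f_prime ** 2 / rates)   # Sigma diagonal for independent Poisson noise
print(f"Fisher information, correlated: {J_corr:.1f}, independent: {J_indep:.1f}")
```

The Cramér-Rao bound then says any unbiased decoder has variance at least 1/J, which is the sense in which the Fisher information bounds decodability.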
-
Geoff Boynton on fMRI
Geoff just delivered a lecture at the CSHL computational vision course on fMRI. He pointed out that it’s an incredibly convenient coincidence that oxygenated hemoglobin and deoxyhemoglobin have sufficiently different magnetic properties that the difference can be picked up using MRI. I made a comment (which I thought was mind-blowing but others thought was funny; it wasn’t a joke, …
-
Fitting a spline nonlinearity in a Poisson model
I was talking to Jeremy Freeman at CSHL and he asked about an easy way to fit a spline nonlinearity in a Poisson regression model. Recall that with the canonical exponential nonlinearity, we have the following setup: $y_i \sim \mathrm{Poisson}(\lambda_i)$, $\lambda_i = \exp(w^\top x_i)$. And the negative log-likelihood is given by: $L(w) = \sum_i \left[\exp(w^\top x_i) - y_i\, w^\top x_i\right]$, up to a constant. Start by fitting $w$ by maximum likelihood. Compute $u_i = w^\top x_i$. Then you …
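Here's a sketch of how the rest of that recipe might go, assuming the next step is to expand the fitted linear predictor $u$ in a cubic B-spline basis and refit the Poisson likelihood on that basis. The simulated data, knot placement, and use of SciPy's `BSpline.design_matrix` (SciPy >= 1.8) are my own choices, not the exact procedure from the conversation:

```python
# Sketch of a two-step fit: (1) Poisson GLM with the canonical exp nonlinearity,
# (2) cubic B-spline nonlinearity fit on the resulting linear predictor u = Xw.
import numpy as np
from scipy.optimize import minimize
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)

# Simulated data: spikes driven by a softplus (not exp) nonlinearity of a linear filter
n, p = 2000, 5
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = rng.poisson(np.log1p(np.exp(X @ w_true)))

# Step 1: maximum likelihood for w under the canonical exponential nonlinearity
def nll_exp(w):
    u = X @ w
    return np.sum(np.exp(u) - y * u)

w_hat = minimize(nll_exp, np.zeros(p), method='L-BFGS-B').x

# Step 2: compute u_i = w^T x_i and fit lambda = exp(B(u) @ beta) with a spline basis
u = X @ w_hat
lo, hi = u.min() - 1e-6, u.max() + 1e-6
knots = np.linspace(lo, hi, 8)
t = np.r_[[lo] * 3, knots, [hi] * 3]                 # clamped cubic knot vector
B = BSpline.design_matrix(u, t, k=3).toarray()       # (n, n_basis) spline basis matrix

def nll_spline(beta):
    eta = B @ beta
    return np.sum(np.exp(eta) - y * eta)

beta_hat = minimize(nll_spline, np.zeros(B.shape[1]), method='L-BFGS-B').x
print("fitted spline coefficients:", np.round(beta_hat, 2))
```

One could also alternate between refitting $w$ with the spline nonlinearity held fixed and refitting the spline, but even the single pass above recovers a smooth estimate of the effective nonlinearity along the fitted filter direction.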