A Computational Cognitive and Visual Neuroscience Laboratory
Through experience, we learn to interpret the sights and sounds around us and to make decisions that move us closer to achieving our goals. Our ability to learn from and adapt to our ever-changing environment is a foundation of complex behavior, allowing us to make sense of incoming sensory stimuli and to plan successful actions. To study these processes, our laboratory uses advanced neurophysiological and behavioral techniques, in parallel with machine learning approaches for studying cognitive computations in artificial neural networks. Together, this work is providing insights into the brain mechanisms of visual learning, recognition, and decision making.
Congrats to graduate student Ali Alamri on receiving a DOD National Defense Science and Engineering Graduate Fellowship!
Congrats to co-first authors Yang Zhou and Matthew Rosen on their paper in eLife examining neural mechanisms of categorical decisions in biological and artificial neural networks!
Congrats to graduate student Oliver Zhu on receiving a NIH NRSA Graduate Fellowship!
Congrats to FreedmanLab Ph.D. student Krithika Mohan on her paper in Neuron showing how task demands affect population dynamics in parietal cortex!
Congratulations to graduate student Barbara Peysakhovich on receiving a NIH NRSA Graduate Fellowship!
Our latest paper is in press at Science! Postdoc Yang Zhou shows that posterior parietal cortex plays a causal role in perceptual and categorical decisions.