Thanks to their genetic makeup, their ability to navigate mazes and their willingness to work for cheese, mice have long been a go-to model for behavioral and neurological studies.
In recent years, they have entered a new arena – virtual reality – and now Cornell researchers have built miniature VR headsets to immerse them more deeply in it.
The team’s MouseGoggles – yes, they look as cute as they sound – were created using low-cost, off-the-shelf components, such as smartwatch displays and tiny lenses, and offer visual stimulation over a wide field of view while tracking the mouse’s eye movements and changes in pupil size.
The technology has the potential to help reveal the neural activity that informs spatial navigation and memory function, giving researchers new insights into disorders such as Alzheimer’s disease and its potential treatments.
The research, published Dec. 12 in Nature Methods, was led by Chris Schaffer, professor of biomedical engineering in Cornell Engineering, and Ian Ellwood, assistant professor in neurobiology and behavior in the College of Arts and Sciences. The study’s lead authors are postdoctoral researcher Matthew Isaacson and doctoral student Hongyu Chang.
“It’s a rare opportunity, when building tools, that you can make something that is experimentally much more powerful than current technology, and that is also simpler and cheaper to build,” Isaacson said. “It’s bringing more experimental power to neuroscience, and it’s a much more accessible version of the technology, so it could be used by a lot more labs.”
Schaffer’s lab, which he runs with Nozomi Nishimura, associate professor of biomedical engineering, develops optics-based tools and techniques that can be used, along with other methodologies, to investigate the molecular and cellular mechanisms that contribute to loss of function in neurodegenerative diseases. One particular line of research has been studying the unexplained reductions in brain blood flow in mice with Alzheimer’s disease. By unblocking tiny capillaries and increasing that flow, the researchers have shown that memory function in mice improves within hours.
“That was very exciting from the perspective of, hey, maybe there is something you could do in Alzheimer’s disease that could recover some cognitive function,” Schaffer said. “The next steps are to uncover how blood flow improvements are improving the function of neurons in the brain. But to do those experiments, we needed new capabilities compared to what existed in the world before.”
About a decade ago, researchers began rigging up large – and quite costly – projector screens so mice could navigate virtual-reality environments, but the apparatuses are often clunky, and the resulting light pollution and noise can disrupt the experiments.
“The more immersive we can make that behavioral task, the more naturalistic of a brain function we’re going to be studying,” Schaffer said.
Isaacson, who previously designed display systems for fruit flies, set about assembling a stationary VR setup that would be simpler but even more immersive, so the mice could learn more quickly. It so happened that many of the components he needed – tiny displays, tiny lenses – were already commercially available.
“It definitely benefited from the hacker ethos of taking parts that are built for something else and then applying it to some new context,” Isaacson said. “The perfect size display, as it turns out, for a mouse VR headset is pretty much already made for smart watches. We were lucky that we didn’t need to build or design anything from scratch; we could easily source all the inexpensive parts we needed.”
The goggles aren’t wearable in the traditional sense. A mouse stands on a treadmill, with its head fixed in place, as it peers into a pair of eyepieces. The mouse’s neural activity patterns can then be fluorescently imaged.
Working with Ellwood’s lab, the team conducted a battery of tests on begoggled mice. On the neurological front, they examined two key regions of the mouse brain: the primary visual cortex, to ensure the goggles form sharp, high-contrast images on the retina; and the hippocampus, to confirm that the mouse brain is successfully mapping its virtual environment. Other tests were more tech-oriented, to see if the goggle displays updated quickly and were responsive to the mouse’s movements.
Most importantly, the researchers needed to observe how the mice behaved in their new eyewear. One of the most effective tests was tricking a mouse into believing that an expanding dark blotch – a looming predator – was approaching it.
“When we tried this kind of a test in the typical VR setup with big screens, the mice did not react at all,” Isaacson said. “But almost every single mouse, the first time they see it with the goggles, they jump. They have a huge startle reaction. They really did seem to think they were getting attacked by a looming predator.”
The researchers received an unexpected contribution when they submitted their findings to Nature Methods. An anonymous reviewer pushed them to add a camera to each eyepiece that could record the mouse’s pupils and verify the animal’s engagement and arousal.
The request was both a difficult task and a fortuitous blessing.
“They challenged us to do something really hard and make it all work,” Schaffer said. “In the last year, there’s been now three papers published with VR goggles for mice. You know, the field was ripe for this to happen. But we’re the only one with pupillometry and eye tracking, and that is a critical capability for much of neuroscience.”
The researchers are looking to further develop the goggles, with a lightweight, mobile version for larger rodents, such as tree shrews and rats, that can include a battery and onboard processing. Schaffer also sees the potential of incorporating more senses, such as taste and smell, into the VR experience.
“I think five-sense virtual reality for mice is a direction to go for experiments,” he said, “where we’re trying to understand these really complicated behaviors, where mice are integrating sensory information, comparing the opportunity with internal motivational states, like the need for rest and food, and then making decisions about how to behave.”
Co-authors include doctoral student Rick Zirkel; postdoctoral researcher Laura Berkowitz; and Yusol Park ’22 and Danyu Hu ’22.
The research was supported by the Cornell Neurotech Mong Family Fellowship program; the BrightFocus Foundation Alzheimer’s disease fellowship program; the Brain and Behavior Research Foundation; and the ³Ô¹ÏÍøÕ¾ Institutes of Health.