ACUMEN_2025

COLLEGE OF ARTS AND SCIENCES

Michael Burger (left) in his lab. Photo: Christine Kreschollek

Mapping sound frequency to physical place is a fundamental organizing principle of auditory processing, Burger says. This mapping of "frequency to place" is called tonotopy. This tonotopic organization is then remapped everywhere in the brain where sounds are processed.

The tonotopic organization of hearing is of particular interest to Burger. His lab has identified several properties of auditory neurons that appear to "tune" their own frequencies along the tonotopy within the brain. The key question is: how did this precise tuning arise in development? Burger thinks it might be explained by one of two theories. One suggests that tonotopic properties first arise in the ear, and that during development the ear drives the tuning of neurons in the brain. Alternatively, brain organization may develop independently of the ear, relying instead on mapping cues present in the developing brain itself to establish tonotopic patterns.

"How do the neurons in the brain develop their specializations, since they're spatially distributed?" says Burger, professor of neuroscience in the department of biological sciences. "Do the neurons become who they are because of where they live? Or alternatively, are they instructed to become that way by the ear?"

Pioneering Techniques in Auditory Research

Burger's research examines how the brain organizes and processes auditory information, focusing on how the brain adapts to changes in hearing. Using chicken embryos, his team alters the development of the inner ear to study how the brain responds to modified sound input. By injecting a gene that carries the instructions to produce bone morphogenetic protein (BMP7) into one of a chicken embryo's developing ears, they change its auditory structure so that it picks up only low-frequency sounds, while high-frequency processing is dramatically diminished. This creates a uniquely patterned ear for studying how the brain's neurons develop with this altered input pattern. Do they adjust to this new sound range, or do they stick to their original role?

Chickens are ideal for this research because their auditory system is simpler than that of mammals, and because the embryos grow in an accessible egg outside the mother. Like humans, chickens' ears and brains use tonotopic organization, but unlike our coiled cochleae, chickens' cochleae are straight, making it easier to study how hair cells are organized.

"In the birds, the hair cells are electrically tuned to resonate at different frequencies, the same way you turn the dial of an old-fashioned radio and change the electrical tuning of the capacitors in the radio to find your favorite station," Burger says.

The team injects these genetic constructs into embryos just two days after the eggs start incubating. They open a small hole in an egg, inject the gene into one of the chick's developing ears, and then seal the egg to let it grow normally. This creates a chicken with one ear that hears all frequencies normally and one ear that hears only low frequencies.

Why does this matter? The team wants to know whether neurons in the brain are programmed based on their location (like being "born" a high-frequency neuron) or whether they can change based on the sounds they receive. Their findings show it's the sounds, or the input, that determine what kind of neuron develops.

"One of the things that differentiates low- and high-frequency cells in the brain is that the low-frequency cells get many small inputs from the ear, about 10 to 13 tiny little synaptic inputs," Burger says. "The high-frequency cells, though, get one to three very large inputs, with synapses that are large enough to easily see in the microscope. So, what we're asking right now is, if we change the organization of the ear, do we change the organization of this synaptic pattern in the brain?"

Applications of Virtual Reality in Synaptic Analysis

Burger's lab isn't just innovating at the genetic level; it's also revolutionizing how data is analyzed. Once doctoral student Kwame Owusu-Nyantakyi creates slices of a chicken's brain, three-dimensional images are produced. The lab runs computations on this collected data to generate a virtual three-dimensional representation of the cell. Using virtual reality goggles with handheld controllers, undergraduate neuroscience major Audrey Snyder '26 interacts with representations of neural synapses, rotating and exploring them in virtual space.

This immersive approach allows for precise measurements of synaptic structures and facilitates collaboration across the lab. "Previously there was a system that did this, but in a more tedious way," Owusu-Nyantakyi says. "So, for all of the data we get, we use confocal imaging, and you get optical slices of each of the segments through the tissue. Back then you had to go and literally stitch all those images back together to give you the 3D volume, do some post-hoc analysis, make sure your image is good, and then try to pull out the data you want. Now, once you do the annotation, you've got your data, which saves …"
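The reconstruction step Owusu-Nyantakyi describes, stacking confocal optical slices into a 3D volume and then pulling out a measurement, can be illustrated with a minimal sketch. This is not the lab's actual pipeline; the array names, toy data, and intensity threshold are hypothetical, and the sketch assumes only NumPy:

```python
import numpy as np

def slices_to_volume(slices):
    """Stack a list of equally sized 2D optical slices into a (z, y, x) volume.

    This is the step that once required manually stitching confocal
    slices back together to recover the 3D structure.
    """
    return np.stack(slices, axis=0)

# Toy data: 5 optical slices of 64x64 pixels, dim background noise
# plus one bright 8x8 "labeled synapse" region in the middle.
rng = np.random.default_rng(0)
slices = [rng.random((64, 64)) * 0.2 for _ in range(5)]
for s in slices:
    s[28:36, 28:36] += 1.0  # the labeled structure

volume = slices_to_volume(slices)

# A simple post-hoc measurement: count voxels above an intensity
# threshold to estimate the labeled structure's size.
threshold = 0.8  # hypothetical cutoff, well above the noise floor
labeled_voxels = int((volume > threshold).sum())
print(volume.shape, labeled_voxels)
```

With the toy data above, the 8x8 blob across 5 slices yields 320 supra-threshold voxels; a real analysis would work on annotated confocal stacks rather than synthetic noise, but the stack-then-measure shape of the computation is the same.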
