Summary: Researchers developed an AI model of the fruit fly’s brain to understand how vision drives behavior. By genetically silencing specific visual neurons and observing behavioral changes, they trained the AI to accurately predict neural activity and behavior.
Their findings show that combinations of multiple neurons, rather than individual types, process visual data in a complex ‘population code’. This breakthrough paves the way for future research into the human visual system and related disorders.
Key Facts:
- CSHL scientists created an AI model of the fruit fly’s brain to study vision-driven behavior.
- The AI predicts neural activity by analyzing behavioral changes after specific visual neurons are turned off.
- The research revealed a complex ‘population code’ involving multiple neuron combinations processing visual data.
Source: CSHL
We are told, “The eyes are the window to the soul.” Well, windows work in two ways. Our eyes are also our windows to the world. What we see and how we see it partly determines how we move through the world. In other words, our vision helps guide our actions, including social behavior.
Now a young scientist at Cold Spring Harbor Laboratory (CSHL) has discovered an important clue about how this works. He did this by building a special AI model of the brain of the common fruit fly.
CSHL Assistant Professor Benjamin Cowley and his team honed their AI model using a technique they developed called “knockout training.” First, they recorded the courtship behavior of a male fruit fly: chasing and singing to a female.
They then genetically silenced specific types of visual neurons in the male fly and trained their AI to detect any behavioral changes. By repeating this process with many different types of visual neurons, they got the AI to accurately predict how a real fruit fly would react to any sighting of the female.
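To make the idea concrete, here is a minimal, hypothetical sketch of knockout training in Python (PyTorch). None of this is the authors’ code; the network sizes, variable names, and random placeholder data are invented for illustration. The key move is that each hidden unit stands in for one visual-neuron cell type, and the unit whose real counterpart was genetically silenced is zeroed out during training, forcing the network to reproduce the perturbed fly’s behavior:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of "knockout training" (not the authors' code).
# Each hidden unit is meant to correspond to one visual-neuron cell type.
N_TYPES = 12    # modeled visual-neuron cell types (assumed number)
STIM_DIM = 8    # toy visual-stimulus feature vector (assumed)
BEHAV_DIM = 3   # toy behavioral outputs, e.g., turning, speed, song (assumed)

class KnockoutModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(STIM_DIM, N_TYPES)   # stimulus -> unit activity
        self.readout = nn.Linear(N_TYPES, BEHAV_DIM)  # unit activity -> behavior

    def forward(self, stimulus, knockout_idx=None):
        units = torch.relu(self.encoder(stimulus))
        if knockout_idx is not None:
            mask = torch.ones(N_TYPES)
            mask[knockout_idx] = 0.0      # "silence" one cell type's unit
            units = units * mask
        return self.readout(units)

model = KnockoutModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One step per (stimulus, behavior, silenced-cell-type) experiment,
# with random placeholder data standing in for real recordings.
for step in range(200):
    stimulus = torch.randn(32, STIM_DIM)
    behavior = torch.randn(32, BEHAV_DIM)
    silenced = torch.randint(0, N_TYPES, (1,)).item()
    loss = nn.functional.mse_loss(model(stimulus, knockout_idx=silenced), behavior)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because the same perturbation is applied to the model and to the animal, the trained units can be compared one-to-one with real neuron types, which is what lets the model predict how specific neurons contribute to behavior.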
“We can predict neural activity computationally and ask how specific neurons contribute to behavior,” says Cowley. “This is something we couldn’t do before.”
With their new AI, Cowley’s team discovered that the fruit fly brain uses a ‘population code’ to process visual data. Instead of one neuron type linking each visual feature to one action, as previously thought, many combinations of neurons were needed to shape behavior.
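The difference between the old view and the new one can be sketched with a toy calculation (invented numbers, not the paper’s analysis): under a “labeled line” code, silencing any cell type but one leaves the behavior untouched, whereas under a population code, silencing almost any cell type measurably shifts the predicted behavior.

```python
import numpy as np

# Toy contrast (invented numbers, not the paper's analysis) between a
# "labeled line" code and a population code.
rng = np.random.default_rng(0)
units = rng.random(12) + 0.5                  # activities of 12 toy cell types

labeled_line = np.zeros(12)                   # behavior read from one unit only
labeled_line[3] = 1.0
population = rng.normal(size=12) / np.sqrt(12)  # behavior read from all units

for name, w in [("labeled line", labeled_line), ("population", population)]:
    base = w @ units                          # predicted behavior, no knockout
    effects = [abs(base - w @ (units * (np.arange(12) != i)))
               for i in range(12)]            # knock out one type at a time
    affected = sum(e > 1e-6 for e in effects)
    print(f"{name}: {affected} of 12 single-type knockouts change the behavior")
```

Under the labeled-line readout, only one knockout matters; under the distributed readout, every knockout does, which is the signature the study reports.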
A diagram of these neural pathways looks like an incredibly complex subway map and will take years to decipher. Yet it gets us where we need to go. It allows Cowley’s AI to predict how a real fruit fly will behave when presented with visual stimuli.
Does this mean AI will one day predict human behavior? Not so fast. The fruit fly brain contains approximately 100,000 neurons. The human brain has almost 100 billion.
“This is what it’s like for the fruit fly. You can imagine what our visual system looks like,” says Cowley, referring to the subway map.
Still, Cowley hopes his AI model will one day help us decipher the calculations underlying the human visual system.
“This will be decades of work. But if we can figure this out, we’ll be ahead of the curve,” Cowley says. “By learning [fly] computations, we can build a better artificial visual system. More importantly, we will understand disorders of the visual system much better.”
How much better? You’ll have to see it to believe it.
About this AI and neuroscience research news
Author: Sara Giarnieri
Source: CSHL
Contact: Sara Giarnieri – CSHL
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Mapping Model Units to Visual Neurons Reveals Population Code for Social Behavior” by Benjamin Cowley et al. Nature
Abstract
Mapping model units to visual neurons reveals population code for social behavior
The rich variety of behaviors observed in animals arises from the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to build models that predict not only neural responses to sensory input, but also how each neuron causally contributes to behavior.
Here we demonstrate a novel modeling approach to identify a one-to-one mapping between internal units in a deep neural network and real neurons by predicting the behavioral changes arising from systematic perturbations of more than a dozen neuronal cell types.
A key ingredient we introduce is “knockout training,” where the network is perturbed during training to match the perturbations of real neurons during behavioral experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during complex, visually guided social behavior.
The visual projection neurons at the interface between the optic lobe and the central brain form a series of distinct channels, and previous research indicates that each channel encodes a specific visual feature to drive a particular behavior.
Our model reaches a different conclusion: combinations of visual projection neurons, including neurons involved in nonsocial behavior, drive male interactions with females and form a rich population code for behavior.
Overall, our framework consolidates the behavioral effects arising from different neural perturbations into a single, unified model, providing a map from stimulus to neuronal cell type to behavior, and allowing future integration of brain wiring diagrams into the model.