AI decodes fruit fly vision, paving the way for human insight

Summary: Researchers developed an AI model of the fruit fly brain to understand how vision guides behavior. By genetically silencing specific visual neurons and observing changes in behavior, they trained an AI to accurately predict neural activity and behavior.

Their findings reveal that multiple combinations of neurons, rather than single types, process visual data in a complex “population code.” This breakthrough paves the way for future research into the human visual system and related disorders.

Key facts:

  • CSHL scientists created an AI model of the fruit fly brain to study vision-driven behavior.
  • AI predicts neural activity by analyzing changes in behavior after silencing specific visual neurons.
  • The research reveals a complex “population code” where multiple neural combinations process visual data.

Source: CSHL

We’ve been told, “The eyes are the windows to the soul.” Well, windows work in two ways. Our eyes are also our windows to the world. What we see and how we see it help determine how we navigate the world. In other words, our vision helps guide our actions, including social behavior.

Now, a young scientist at Cold Spring Harbor Laboratory (CSHL) has uncovered a key clue to how this works. He did this by building a special AI model of the brain of the common fruit fly.

Still, Cowley hopes his AI model will one day help us decode the computations that underlie the human visual system. Credit: Neuroscience News

CSHL Assistant Professor Benjamin Cowley and his team refined their AI model through a technique they developed called “knockout training.” First, they recorded the courtship behavior of a male fruit fly – chasing and singing to a female.

They then genetically silenced specific types of visual neurons in the male fly and trained their AI to detect any changes in behavior. By repeating this process with many different types of visual neurons, they were able to get the AI to predict exactly how a real fruit fly would act in response to any sight of the female.
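The knockout-training idea can be sketched in code. The sketch below is a toy illustration only, not the team's actual model: it assumes a tiny network with one unit per visual neuron type and entirely simulated data, and every size, variable name, and the linear-ReLU architecture are hypothetical. The point is the training trick itself: whenever the "fly" has a neuron type silenced, the matching model unit is zeroed out during training, so the model must explain each perturbed fly's behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical; the real model and fly data are far richer).
n_stim, n_types = 20, 5

# Stand-in "real fly": rectified neuron-type responses, linear readout.
W_true = rng.normal(size=(n_types, n_stim))
w_read_true = rng.normal(size=n_types)

def fly_behavior(x, ko=None):
    """Behavioral output of the simulated fly; ko silences one neuron type."""
    h = np.maximum(W_true @ x, 0.0)
    if ko is not None:
        h = h.copy()
        h[ko] = 0.0                       # genetic silencing of one type
    return w_read_true @ h

# Model parameters: same toy architecture, trained from scratch.
W = rng.normal(size=(n_types, n_stim)) * 0.01
w_read = rng.normal(size=n_types) * 0.01

X = rng.normal(size=(100, n_stim))        # simulated visual stimuli
conds = [None] + list(range(n_types))     # intact fly plus each knockout
targets = {ko: np.array([fly_behavior(x, ko) for x in X]) for ko in conds}

def mse():
    """Mean squared behavioral prediction error across all conditions."""
    errs = []
    for ko in conds:
        H = np.maximum(X @ W.T, 0.0)
        if ko is not None:
            H[:, ko] = 0.0
        errs.append(np.mean((H @ w_read - targets[ko]) ** 2))
    return float(np.mean(errs))

mse_before = mse()
lr = 0.01
for step in range(2000):
    for ko in conds:
        H = np.maximum(X @ W.T, 0.0)      # unit activity per stimulus
        if ko is not None:
            H[:, ko] = 0.0                # knock out the matching unit
        err = H @ w_read - targets[ko]    # behavioral prediction error
        # Full-batch gradient descent on squared error.
        w_read -= lr * (H.T @ err) / len(X)
        W -= lr * ((np.outer(err, w_read) * (H > 0)).T @ X) / len(X)
mse_after = mse()
```

Because each model unit is silenced exactly when its real counterpart is, training pushes the network toward a one-to-one correspondence between units and neuron types, which is what lets the fitted model be read out neuron by neuron.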

“We can actually predict neural activity computationally and ask how specific neurons contribute to behavior,” Cowley says. “It’s something we couldn’t do before.”

With their new artificial intelligence, Cowley’s team discovered that the fruit fly brain uses a “population code” to process visual data. Instead of a single type of neuron linking each visual feature to a single action, as previously assumed, many combinations of neurons were required to sculpt behavior.

A diagram of these neural pathways looks like an incredibly complex subway map and would take years to decipher. Still, it gets us where we need to go. This allows Cowley’s AI to predict how a fruit fly will behave in real life when presented with visual stimuli.

Does this mean AI can one day predict human behavior? Not so fast. The brain of a fruit fly contains about 100,000 neurons. The human brain has almost 100 billion.

“That’s the fruit fly situation. You can imagine what our visual system is like,” says Cowley, referring to the subway map.

Still, Cowley hopes his AI model will one day help us decode the computations that underlie the human visual system.

“It will be decades of work. But if we can figure that out, we’re ahead of the game,” says Cowley. “Through learning [fly] computations, we can build a better artificial visual system. More importantly, we will understand the disorders of the visual system in much more detail.”

How much better? You’ll have to see it to believe it.

About this AI and neuroscience research news

Author: Sara Giarnieri
Source: CSHL
Contact: Sara Giarnieri – CSHL
Image: The image is credited to Neuroscience News

Original Research: Free access.
“Mapping Model Units to Visual Neurons Reveals the Population Code for Social Behavior” by Benjamin Cowley et al. Nature


Summary

Mapping model units to visual neurons reveals the population code for social behavior

The rich variety of behaviors observed in animals arises through the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to build models that predict not only neuronal responses to sensory input, but also how each neuron causally contributes to behavior.

Here, we demonstrate a novel modeling approach to identify one-to-one mappings between internal units in a deep neural network and real neurons by predicting the behavioral changes that arise from systematic perturbations of more than a dozen neuronal cell types.

A key ingredient we introduce is “knockout training,” which involves perturbing the network during training to match the perturbations of real neurons during behavioral experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during complex, visually guided social behavior.

Visual projection neurons at the interface between the optic lobe and the central brain form a set of discrete channels, and previous work proposed that each channel encodes a specific visual feature to drive a particular behavior.

Our model reaches a different conclusion: combinations of visual projection neurons, including those involved in nonsocial behavior, drive male interactions with females, forming a rich population code for behavior.

Overall, our framework consolidates the behavioral effects elicited by different neural perturbations into a single, unified model, providing a map from stimulus to neuronal cell type to behavior and enabling the future incorporation of brain wiring diagrams into the model.
