Monday, April 13, 2026

How the brain retrieves previously viewed images: perceiving and recalling activate many of the same neurons

Amazing stuff!

"... A new study ... suggests ... a real neurological process, as the brain reactivates the same neural areas during recall that were used to see that image in the first place.

To measure this, researchers recorded the activity of over 700 neurons in the ventral temporal cortex (VTC), a region involved in recognizing complex features and categorizing images, across 16 patients with electrodes previously implanted for epilepsy monitoring. The team tracked how individual brain cells responded as participants viewed a series of images of plants, animals, objects, and faces, and then later reconstructed them from memory.

Many of the same neurons were activated in both cases, with recall engaging about 40% of the visually responsive neurons that were active when an object was initially perceived. Based on this preserved structure, researchers were able to reconstruct which images patients were recalling from patterns of neural activity. In some cases, the reconstruction captured enough detail to distinguish general categories, such as animals or faces. Still, the two processes were not identical: recall produced less precise representations.

Exactly how these signals are generated and coordinated remains unclear. The authors note that although the same brain regions are involved in both perception and recall, it is not yet clear how the brain reactivates these patterns; current methods cannot fully distinguish which neurons are involved in each process."

From the editor's summary and abstract:
"Editor’s summary
The ventral temporal cortex (VTC) is a brain area involved in identifying and categorizing visual stimuli. Wadia et al. performed single-neuron recording in the VTC of patients with epilepsy while the subjects were presented with real visual stimuli or were asked to imagine them. Deep network analysis showed that visually responsive neurons were tuned on specific axes. While imagining the objects, around 40% of the visually responsive VTC neurons were also robustly activated. Thus, mental imagery reactivates the same sensory codes used during visual stimuli, suggesting the existence of a generative model capable of synthesizing detailed sensory contents from an abstract, semantic representation.  ...

Structured Abstract
INTRODUCTION
Mental imagery refers to our brains’ capacity to generate percepts, emotions, and thoughts in the absence of external stimuli. This ability allows us to generate art, simulate actions and outcomes, remember previous experiences, and imagine new ones.
Uncontrolled mental imagery can contribute to psychological disorders, including anxiety, schizophrenia, and posttraumatic stress disorder.
Despite its importance in our lives, little is known about the single-neuron mechanisms of mental imagery. Neuroimaging results support a long-standing theory that imagery of a given sense ...

RATIONALE
We investigated the single-neuron mechanisms of visual imagery by recording single neurons in human patients implanted with electrodes to localize their focal epilepsy as they viewed and subsequently imagined objects. We focused our investigations on the ventral temporal cortex (VTC), a part of the temporal lobe dedicated to representing visual objects.
We first determined the code for visual objects. We found that as in macaques, neurons in human VTC represent objects by using a distributed axis code. This code emphasizes the geometric picture that neurons project incoming stimuli—formatted as points in feature space—onto specific preferred axes and respond proportionally to the projection value. We then examined whether this code is reactivated during imagery.

RESULTS
We recorded 714 neurons in the human VTC across 16 patients as they viewed visual objects.
A majority of neurons (456 of 714) were visually selective for one of the five object categories used (faces, plants, text, animals, and objects). To represent general objects with arbitrary features, we built a low-dimensional object space using the unit activations of deep networks trained to perform object classification. About 80% (367 of 456) of visually responsive single neurons were significantly axis tuned.
We used this axis code to reconstruct objects and generate maximally effective synthetic stimuli. Last, we recorded the responses of the same neurons in a subset of patients (6 of 16) as they imagined the same objects. Mean responses to perceived and imagined objects were comparable, with some neurons active only during perception, some only during imagery, and some during both.
In particular, ~40% (43 of 107) of axis-tuned VTC neurons recorded during the imagery task reactivated, and the responses during imagery of individual neurons were proportional to the projection value of those objects onto the neurons’ viewing axes. We used this observation to reconstruct imagined objects while still easily distinguishing whether those objects were viewed or imagined.

CONCLUSION
We leveraged the opportunity to record from the same population of VTC neurons in humans as they viewed and imagined objects. Neurons use an axis code to represent visual objects, and neural activity during imagination reactivates this code. These findings provide single-neuron evidence for a generative model in the human brain."
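The axis code and reconstruction described in the abstract can be sketched numerically. This is a minimal illustration under stated assumptions, not the paper's actual pipeline: the object feature space here is random rather than derived from deep-network activations, neuron responses are noiseless linear projections while viewing, and the "imagery" condition simply reactivates a random ~40% subset of neurons with added response noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n_dims, n_neurons, n_objects = 20, 100, 50

# Stand-in for the low-dimensional object space (in the study, built from
# unit activations of a deep object-classification network).
F = rng.standard_normal((n_objects, n_dims))

# Axis code: each neuron has a preferred axis in feature space and fires
# in proportion to the projection of the stimulus onto that axis.
W = rng.standard_normal((n_neurons, n_dims))
R_view = F @ W.T                       # population responses while viewing

# Reconstruction: linearly decode feature vectors from population activity.
F_hat_view = R_view @ np.linalg.pinv(W).T
corr_view = np.corrcoef(F.ravel(), F_hat_view.ravel())[0, 1]

# Imagery (hypothetical stand-in): only ~40% of axis-tuned neurons
# reactivate, with noisier responses; decode from that subset alone.
mask = rng.random(n_neurons) < 0.4
R_imag = R_view[:, mask] + 0.5 * rng.standard_normal((n_objects, mask.sum()))
F_hat_imag = R_imag @ np.linalg.pinv(W[mask]).T
corr_imag = np.corrcoef(F.ravel(), F_hat_imag.ravel())[0, 1]

print(f"viewing reconstruction r = {corr_view:.3f}")   # near-perfect
print(f"imagery reconstruction r = {corr_imag:.3f}")   # good but degraded
```

The qualitative pattern mirrors the reported result: decoding from the reactivated subset still recovers the imagined object's features, but with lower fidelity than decoding during perception.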

ScienceAdviser




Fig. 1 Selective visual responses in human VTC.


Fig. 2 Axis tuning in human VTC neurons.

