Watch the seconds tick by on a clock and, in visual regions of your brain, neighboring groups of angle-selective neurons will fire in sequence as the second hand sweeps around the clock face. These cells form beautiful "pinwheel" maps, with each segment representing a visual perception of a different angle.
The Stanford team developed a new kind of AI algorithm - a topographic deep artificial neural network (TDANN) - that successfully predicts both the sensory responses and spatial organization of multiple parts of the human brain's visual system.
The findings, the product of seven years of research, were published in a new paper - "A unifying framework for functional organization in early and higher ventral visual cortex" - on May 10 in the journal Neuron.