Topographic alignment of auditory inputs to the visual cortex
1/N What are the organizational principles underlying crossmodal cortical connections? We address this in this new preprint, led by @alexegeaweiss.bsky.social & @bturner-bridger.bsky.social in collab w/ @petrznam.bsky.social @crick.ac.uk www.biorxiv.org/content/10.1...
— Flor Iacaruso (@flor-iacaruso.bsky.social) Aug 1, 2025 at 11:10
2/ To integrate information from our different senses, cortical areas form extensive connections. The logic of these connections is not fully understood. Is auditory information uniformly distributed across visual cortical areas, or is there patterned information routing?
3/ High-throughput mapping of A1 single neuron projections showed that auditory cortical projections to the visual cortex are not uniform, but instead follow the division of the visual cortex into dorsal and ventral processing streams.
4/ These projections exhibit topographic organisation at both population and single-cell levels. At the population level, anterior AC neurons preferentially target anterior VC areas in the dorsal stream, while posterior AC neurons project to posterior ventral stream areas.
5/ At the single-neuron level, individual AC neurons co-innervate multiple higher visual areas within the same stream and exhibit stream-specific long-range projection patterns.
6/ What is the function of these projections?
7/ 2P imaging of auditory cortex axons shows that while sound frequency information is homogeneously distributed across visual cortical areas, sound location information is differentially broadcast…
8/ Specifically, sound azimuth is differentially encoded across visual cortical areas and streams, matching the retinotopic bias of the target area.
9/ And we find the same for elevation...
10/ Our results show topographically organised, stream-specific routing of crossmodal signals. Crossmodal information is selectively routed according to the organising principles of the target area. This structured connectivity enables the spatially coherent integration of multisensory information.