1/N What are the organizational principles underlying crossmodal cortical connections? We address this question in a new preprint, led by @alexegeaweiss.bsky.social & ‪@bturner-bridger.bsky.social‬ in collab w/ ‪@petrznam.bsky.social‬ @crick.ac.uk www.biorxiv.org/content/10.1...

— Flor Iacaruso (@flor-iacaruso.bsky.social) Aug 1, 2025 at 11:10

2/ To integrate information from our different senses, cortical areas form extensive connections. The logic of these connections is not fully understood. Is auditory information uniformly distributed across visual cortical areas, or is there patterned information routing?

3/ High-throughput mapping of A1 single-neuron projections showed that auditory cortical projections to the visual cortex are not uniform, but instead follow the division of the visual cortex into dorsal and ventral processing streams.

4/ These projections exhibit topographic organisation at both population and single-cell levels. At the population level, anterior AC neurons preferentially target anterior VC areas in the dorsal stream, while posterior AC neurons project to posterior ventral stream areas.

5/ At the single-neuron level, individual AC neurons co-innervate multiple higher visual areas within the same stream and exhibit stream-specific long-range projection patterns.

6/ What is the function of these projections?

7/ 2P imaging of auditory cortex axons shows that while sound frequency information is homogeneously distributed across visual cortical areas, sound location information is differentially broadcast…

8/ Specifically, sound azimuth is differentially encoded across visual cortical areas and streams, matching the retinotopic bias of the target area.

9/ And we find the same for elevation...

10/ Our results show topographically organised, stream-specific routing of crossmodal signals. Crossmodal information is selectively routed according to the organising principles of the target area. This structured connectivity enables the spatially coherent integration of multisensory information.