eScholarship
Open Access Publications from the University of California

UC San Francisco Electronic Theses and Dissertations

Crossmodal influences in mouse auditory cortex during passive stimulation and an audiovisual behavior

Abstract

To enable flexible behavior, the brain utilizes signals from across the sensory modalities, merging some streams while filtering out others in a context-dependent manner. These processes occur at many levels of the sensory hierarchy, but the cerebral cortex appears to play a prominent role in enabling dynamic use of crossmodal sensory information. This dissertation explores the interactions of auditory and visual sensory processing in the auditory cortex (ACtx) of the awake mouse using two experimental approaches. First, the influence of visual stimuli on neural firing in ACtx is investigated using multisite probes to sample activity across cortical layers. Visual stimuli elicit spiking responses in both primary and secondary ACtx. Through fluorescent dye tracing of electrode tracks and optogenetic identification using layer-specific markers, these responses are shown to be largely restricted to the infragranular layers and to be particularly prominent in layer 6. Presentation of drifting visual gratings shows that these responses are not orientation-tuned, unlike responses in visual cortex. The deepest cortical layers thus appear to be an important locus for crossmodal integration in ACtx. Second, to test the influence of modality-specific attention on stimulus processing in ACtx, a novel audiovisual (AV) go/no-go rule-switching task for mice is presented. Translaminar extracellular recordings in ACtx of mice performing the task show that attentional state modulates responses to AV stimuli. On average, single-unit firing rates (FRs) in the deep and middle cortical layers are reduced during auditory attention in response to task-relevant stimuli, although a smaller population of units increases FRs. Pre-stimulus activity also decreases when behavior is guided by the auditory rule and appears to account for much of the change in stimulus-evoked activity. This general reduction in activity does not impair decoding with a PSTH-based pattern classifier; instead, it increases the efficiency of mutual information encoding in deep, putatively excitatory neurons. Analysis of spectrotemporal receptive field (STRF) nonlinearities computed from stimuli delivered between behavioral trials suggests that attending to sound increases the selectivity of neurons for STRF-defined sound features. These results suggest that modality-specific attention can act on ACtx through rapid, context-dependent shifts in activity level as well as in information processing.
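
For readers unfamiliar with PSTH-based pattern classification, the sketch below illustrates the general idea: single-trial spike trains are binned into peristimulus time histogram (PSTH) vectors, and each held-out trial is assigned to the stimulus whose mean PSTH template it most resembles. This is a minimal, generic reconstruction of that family of decoders, not the dissertation's analysis code; the bin size, distance metric, function names, and toy data are assumptions.

```python
# Minimal illustrative sketch of a PSTH-based pattern classifier.
# Not the dissertation's analysis code; names and parameters are assumptions.
import numpy as np

def bin_spikes(spike_times, t_start, t_stop, bin_size):
    """Bin one trial's spike times (in seconds) into a PSTH vector of counts."""
    edges = np.arange(t_start, t_stop + bin_size, bin_size)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.astype(float)

def classify_trials(trials, labels):
    """Leave-one-out nearest-template classification.

    trials: (n_trials, n_bins) array of binned single-trial responses
    labels: (n_trials,) array of stimulus identities
    Returns the fraction of held-out trials assigned to the correct stimulus.
    """
    trials = np.asarray(trials, dtype=float)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    correct = 0
    for i in range(len(trials)):
        templates = []
        for c in classes:
            mask = (labels == c)
            mask[i] = False  # exclude the held-out trial from its own template
            templates.append(trials[mask].mean(axis=0))
        # Assign the held-out trial to the nearest template (Euclidean distance).
        dists = [np.linalg.norm(trials[i] - t) for t in templates]
        if classes[np.argmin(dists)] == labels[i]:
            correct += 1
    return correct / len(trials)

# Toy example: 3 stimuli x 20 trials, 50 bins of Poisson spike counts per trial.
rng = np.random.default_rng(0)
rates = rng.uniform(1, 5, size=(3, 50))
trials = np.vstack([rng.poisson(r, size=(20, 50)) for r in rates])
labels = np.repeat(np.arange(3), 20)
print(classify_trials(trials, labels))
```

In a decoder of this kind, overall firing-rate reductions need not hurt performance, since classification depends on the relative shapes of the stimulus-specific templates rather than on absolute spike counts; this is consistent with the abstract's observation that reduced activity during auditory attention does not impair decoding.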
