eScholarship
Open Access Publications from the University of California

UC Irvine

UC Irvine Previously Published Works

Rapid Eye Movements in Sleep Furnish a Unique Probe Into Consciousness.

Abstract

The neural correlates of rapid eye movements (REMs) in sleep are extraordinarily robust, including REM-locked multisensory-motor integration and accompanying activation in the retrosplenial cortex, the supplementary eye field, and areas encompassing the cholinergic basal nucleus (Hong et al., 2009). The phenomenology of REMs supports the notion that perceptual experience in both sleep and wakefulness is a constructive process, in which we generate predictions of sensory inputs and then test those predictions by actively sampling the sensorium with eye movements. On this view, REMs during sleep may index an internalized active sampling or scanning of self-generated visual constructs that are released from the constraints of visual input. If this view is correct, it renders REMs an ideal probe for studying consciousness as an exclusively internal affair (Metzinger, 2009). In other words, REMs offer a probe of active inference, in the sense of predictive coding, when the brain is isolated from the sensorium by virtue of the natural blockade of sensory afferents during REM sleep. Crucially, REMs are temporally precise events that enable powerful inferences based on time-series analyses. As a natural, task-free probe, REMs could be used in non-compliant subjects, including infants and animals. In short, REMs constitute a promising probe for studying the ontogenetic and phylogenetic development of consciousness, and perhaps the psychopathology of schizophrenia and autism, both of which have been considered in terms of aberrant predictive coding.

