fMRI/EEG combination used to decode dream images

A study recently published in the journal Science described work on dream image mapping carried out by neuroscientist Yukiyasu Kamitani and colleagues at the Advanced Telecommunications Research (ATR) Computational Neuroscience Laboratories in Kyoto, Japan.
Functional magnetic resonance imaging (fMRI) was used to scan the brains of three young men as they drifted off to sleep inside the scanner, while their brain activity was simultaneously recorded with electroencephalography (EEG).
When the men had entered a 'hypnagogic state' - when their brain wave patterns had begun to resemble those known to be associated with sleep - they were woken up and asked to describe their dreams, then allowed to go back to sleep. This procedure was carried out in three-hour blocks, repeated 7 to 10 times (on different days) for each volunteer. Approximately 200 dream reports were recorded from each participant, and the reported images were then grouped, using the lexical database WordNet, into categories tailored to each individual's recurring dream elements.

A video montage of images from the ImageNet database corresponding to the keywords generated by the dream reports was then presented to the wide-awake men while their brain activity was being monitored. An algorithm developed to recognise the brain activity 'signatures' associated with various dream images separated non-visual brain activity from vision-related excitation patterns, verifying that dreaming involves some of the same brain areas associated with visual imagery. This algorithm was combined with machine-learning techniques that used the waking brain activity patterns as 'training' examples. After training the program, the researchers input patterns of sleeping brain activity - the 'test' examples - and were able to predict which category of image had produced that pattern of brain activity.
In a second round of tests, the researchers woke the dreamers and were able to identify broad categories of their dream images with 60 per cent accuracy.
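The train/test procedure described above can be sketched with a toy decoder. This is an illustration only: the category labels, voxel counts, and simulated activity 'signatures' below are invented, and a simple nearest-centroid classifier stands in for the study's actual machine-learning method, which operated on real fMRI data.

```python
import math
import random

random.seed(0)

# Hypothetical category labels -- the study derived its own categories
# from each sleeper's dream reports via WordNet.
CATEGORIES = ["person", "building", "food"]
N_VOXELS = 20  # toy pattern size; real fMRI patterns span thousands of voxels

# Simulate a characteristic activity "signature" for each category.
signatures = {c: [random.gauss(0, 1) for _ in range(N_VOXELS)]
              for c in CATEGORIES}

def observe(category, noise=1.0):
    """A simulated brain-activity pattern: category signature plus noise."""
    return [v + random.gauss(0, noise) for v in signatures[category]]

# 'Training' examples: waking responses to pictures of known categories.
train = [(c, observe(c)) for c in CATEGORIES for _ in range(30)]

# Nearest-centroid decoder: average the training patterns per category.
centroids = {}
for c in CATEGORIES:
    patterns = [p for label, p in train if label == c]
    centroids[c] = [sum(col) / len(col) for col in zip(*patterns)]

def predict(pattern):
    """Assign the pattern to the category with the closest centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda c: dist(pattern, centroids[c]))

# 'Test' examples: patterns recorded just before waking (simulated here).
tests = [(c, observe(c)) for c in CATEGORIES for _ in range(20)]
correct = sum(predict(p) == label for label, p in tests)
accuracy = correct / len(tests)
print(f"decoding accuracy: {accuracy:.2f}")
```

Chance performance with three categories would be about 33 per cent, so any accuracy well above that, as in the reported 60 per cent result, indicates that the sleeping brain activity carries recoverable information about dream content.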
"Our findings provide evidence that specific contents of visual experience during sleep are represented by, and can be read out from, visual cortical activity patterns shared with stimulus representation."
"By analysing the brain activity during the nine seconds before we woke the subjects, we could predict whether a man is in the dream or not, for instance, with an accuracy of 75–80%."
In 2008, Kamitani and his team reported that they could decode and reconstruct visual images from brain activity in the primary visual cortical areas, where inputs from the retina are received. The present work looks at activity in higher-order brain regions that integrate visual input with concept-level processing and with information from other senses.
The 'hypnagogic state' is a state of consciousness between waking and sleeping. 'Embodied imagination' work on the hypnagogic state was pioneered by Robert Bosnak and based on principles first developed by Carl Jung. See International Journal of Dream Research, Volume 4, Supplement 1 (2011).
Dreams in the hypnagogic state were studied, rather than the dreams that occur during REM (rapid eye movement) sleep later in the night, because it normally takes hours to transition into REM sleep after falling asleep. Remaining immobilised inside an MRI tube until REM sleep was achieved, several times a day for 10 days, would have been an ordeal for the volunteers.
Jack Gallant, a neuroscientist at the University of California, Berkeley, commented:
"In this field of dream decoding, no one has managed to successfully do this before. So this is not the final step down this road, it's the first step."
"If you could build the perfect dream decoder it would create a movie on your television screen and it would just replay your dreams. It would replay all the actions that happened, the actors, the people involved and it would replay the sound."
Horikawa, T., et al. (2013). Neural Decoding of Visual Imagery During Sleep. Science. doi: 10.1126/science.1234330