Multisensory processing
Chapter 6 endnote 4, from How Emotions are Made: The Secret Life of the Brain by Lisa Feldman Barrett.
Some context is:
Sensations from the outside world have become concepts in the infant’s model of the world; what was outside is now inside. These sensory experiences, over time, create the opportunity for the infant brain to make coordinated predictions that span the senses. [...] Different senses play “supporting roles” for one another.
The sensory networks are connected to the brain’s multimodal integration network, which knits together the different perceptual features of objects and events.[1] For example, the neurons in this network will associate the auditory properties of a word, like the sound pattern "DAWG," with the visual properties of the objects corresponding to that word, like "four-legged," "furry," and "tail-wagging," and with the somatosensory features of touching those objects, like the feeling of a dog's fur against your fingers. These associations might also include a dog's odor, the taste of anything you happen to be eating or drinking when you are with a dog, and even the interoceptive sensations and affective feelings you have for the dog, creating a complete, multisensory representation of the dog in your brain.
For example, when I am in Oakland, California, visiting my friends Ann and Angie, I spend Sunday mornings lazing on the couch in their cozy living room reading the Sunday paper, generally in a state of gezellig (see chapters 2 and 5). Their golden retriever Biskit (sister to Rowdy, whom you meet in chapter 12) is usually sprawled across my lap like a warm, furry blanket. As I write this, I can simulate the entire experience, from the way the room looks to the feel of the newspaper ink on my fingers, the taste of the coffee I am usually drinking, the smell of Biskit's breath (from her occasional, unfortunate belch), and the pleasantly calm and relaxed feeling that comes from having my body budget regulated by good friends and a loving dog. The multimodal integration network contains a brain region, the anterior insula, that is also part of the olfactory (smell) and gustatory (taste) networks and is a body-budgeting region (one that also launches interoceptive predictions).
In fact, many of the hubs that make up the multimodal integration network are also known collectively as the salience network, which is part of the larger interoceptive network. The multimodal integration network allows the networks for sight, sound, touch, smell, and taste, as well as for interoception, to pass information back and forth. Sensory systems influence one another even at very low levels of processing.[2] You can think of these constraints as a Darwinesque “selection” pressure on sensory predictions.
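To make the idea of one representation knit together from many modality-specific features a bit more concrete, here is a minimal, purely illustrative sketch in Python. It is not from the book and not a model of brain circuitry; the names (MultimodalConcept, bind, predict_missing, the modality labels) are all hypothetical. It simply shows features from different "senses" being bound into one record, with a cue in a single modality retrieving the features stored for the others, loosely analogous to the cross-sensory predictions described above.

```python
# Toy illustration only: a "concept" that binds features from several senses,
# loosely mirroring the multisensory representation described in the text.
# All names here are hypothetical; this is not a model of actual brain circuitry.
from collections import defaultdict


class MultimodalConcept:
    def __init__(self, name):
        self.name = name
        # modality (e.g. "vision", "audition") -> set of associated features
        self.features = defaultdict(set)

    def bind(self, observation):
        """Add co-occurring features, keyed by modality, to the concept."""
        for modality, feats in observation.items():
            self.features[modality].update(feats)

    def predict_missing(self, cue_modality):
        """Given a cue in one modality, return the features stored for the
        other modalities, a rough stand-in for cross-sensory prediction."""
        return {m: sorted(f) for m, f in self.features.items() if m != cue_modality}


# Build a "dog" concept from a couple of multisensory episodes.
dog = MultimodalConcept("dog")
dog.bind({"audition": {"DAWG"},
          "vision": {"four-legged", "furry", "tail-wagging"}})
dog.bind({"somatosensation": {"soft fur"},
          "olfaction": {"dog-breath smell"},
          "interoception": {"calm, pleasant affect"}})

# Hearing "DAWG" alone now "predicts" the visual, tactile, olfactory, and
# interoceptive features that usually accompany it.
print(dog.predict_missing("audition"))
```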
Notes on the Notes
- [1] Sepulcre, Jorge, Mert R. Sabuncu, Thomas B. Yeo, Hesheng Liu, and Keith A. Johnson. 2012. "Stepwise connectivity of the modal cortex reveals the multimodal organization of the human brain." The Journal of Neuroscience 32 (31): 10649-10661.
- [2] For a recent review, see Murray, Micah M., David J. Lewkowicz, Amir Amedi, and Mark T. Wallace. 2016. "Multisensory Processes: A Balancing Act across the Lifespan." Trends in Neurosciences 39 (8): 567-579.