A new scientific understanding of perception has emerged in the past few decades, and it has overturned classical, centuries-old beliefs about how our brains work—though it has apparently not penetrated the medical world yet. The old understanding of perception is what neuroscientists call “the naïve view,” and it is the view that most people, in or out of medicine, still have. We’re inclined to think that people normally perceive things in the world directly. We believe that the hardness of a rock, the coldness of an ice cube, the itchiness of a sweater are picked up by our nerve endings, transmitted through the spinal cord like a message through a wire, and decoded by the brain.
[…]Our assumption had been that the sensory data we receive from our eyes, ears, nose, fingers, and so on contain all the information that we need for perception, and that perception must work something like a radio. It’s hard to conceive that a Boston Symphony Orchestra concert is in a radio wave. But it is. So you might think that it’s the same with the signals we receive—that if you hooked up someone’s nerves to a monitor you could watch what the person is experiencing as if it were a television show.
Yet, as scientists set about analyzing the signals, they found them to be radically impoverished. Suppose someone is viewing a tree in a clearing. Given simply the transmissions along the optic nerve from the light entering the eye, one would not be able to reconstruct the three-dimensionality, or the distance, or the detail of the bark—attributes that we perceive instantly.
[…]The images in our mind are extraordinarily rich. We can tell if something is liquid or solid, heavy or light, dead or alive. But the information we work from is poor—a distorted, two-dimensional transmission with entire spots missing. So the mind fills in most of the picture. You can get a sense of this from brain-anatomy studies. If visual sensations were primarily received rather than constructed by the brain, you’d expect that most of the fibres going to the brain’s primary visual cortex would come from the retina. Instead, scientists have found that only twenty per cent do; eighty per cent come downward from regions of the brain governing functions like memory. Richard Gregory, a prominent British neuropsychologist, estimates that visual perception is more than ninety per cent memory and less than ten per cent sensory nerve signals. When Oaklander theorized that M.’s itch was endogenous, rather than generated by peripheral nerve signals, she was onto something important.
[…]The account of perception that’s starting to emerge is what we might call the “brain’s best guess” theory of perception: perception is the brain’s best guess about what is happening in the outside world. The mind integrates scattered, weak, rudimentary signals from a variety of sensory channels, information from past experiences, and hard-wired processes, and produces a sensory experience full of brain-provided color, sound, texture, and meaning. We see a friendly yellow Labrador bounding behind a picket fence not because that is the transmission we receive but because this is the perception our weaver-brain assembles as its best hypothesis of what is out there from the slivers of information we get. Perception is inference.
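[The “best guess” idea described above is often formalized as Bayesian cue combination: weak sensory evidence is weighed against prior expectation, and the estimate leans toward whichever source is more reliable. The sketch below is not from the article; it is a minimal, hypothetical illustration of that weighing, with the function name and all numbers invented for the example. -egg]

```python
# Minimal, hypothetical sketch of "perception as inference":
# fuse a noisy sensory measurement with a prior expectation using
# precision-weighted (Bayesian) cue combination. All values are invented.

def fuse(prior_mean, prior_var, sense_mean, sense_var):
    """Combine a Gaussian prior with a Gaussian sensory likelihood.

    Returns the posterior mean and variance: the "best guess" leans
    toward whichever source is more reliable (lower variance).
    """
    w_prior = 1.0 / prior_var   # precision of the prior (memory / expectation)
    w_sense = 1.0 / sense_var   # precision of the sensory signal
    post_var = 1.0 / (w_prior + w_sense)
    post_mean = post_var * (w_prior * prior_mean + w_sense * sense_mean)
    return post_mean, post_var

if __name__ == "__main__":
    # Prior: past experience says the object is about 10 m away (fairly confident).
    # Sense: a degraded signal suggests 14 m, but it is very noisy.
    mean, var = fuse(prior_mean=10.0, prior_var=1.0, sense_mean=14.0, sense_var=4.0)
    print(f"best guess: {mean:.1f} m (variance {var:.2f})")
    # The estimate (~10.8 m) sits much closer to the prior, because the
    # sensory evidence is weak: the mind fills in most of the picture.
```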
[WARNING! Full article contains what may be the single ickiest medical anecdote I’ve ever read. -egg]