
The new science of our cross-wired senses

Yes, your ears can change what you taste. What discoveries about cross-sensory perception are revealing about the brain.


The senses have always been our portals into the outer world. We have the classic five that Aristotle talked about — sight, hearing, smell, taste, and touch — plus more recently recognized senses of balance, temperature, pain, and body position and movement. Each evolved to collect some distinct type of information about our environment, and to tell us our status within it.

That’s largely how we tend to think about the senses, anyway: separately, each one its own distinct way to understand the world around us.

But in recent years, various findings have emerged to challenge that assumption — strange illusions in which one sense seemed to change the perceptions of another. One study published in 2000 particularly grabbed people’s attention: When researchers at Caltech showed test subjects a brief flash of light accompanied by two quick tones, many people saw two flashes instead of one. The same effect occurred when the researchers tapped their subjects’ skin twice as the light flashed. Vision — considered our most reliable and dominant sense — could be altered by sound or touch.

And that wasn’t all. Other studies showed that what people saw affected what they heard; that certain types of music or background noise affected how food tasted; and that smells could influence how a texture felt to the touch.

What the researchers were uncovering, in other words, was that our senses are not so separate after all. Scientists have realized that interaction between the senses “is the rule rather than the exception,” says Ladan Shams, one of the researchers who conducted the light-flashing study and now a sensory scientist at the University of California at Los Angeles. From the earliest stages of perception, it appears, the senses are enhancing, competing with, and even altering one another in surprising ways.

Since then, a new field has emerged to study cross-sensory perception, with laboratories throughout the world devoted to understanding how the senses merge. Scientists are developing a new way of thinking about how our brains are organized and how we perceive the world. And what began as basic scientific research to understand the brain’s organization is spreading into other fields, such as marketing: Companies are starting to engineer foods that taste better by appealing to the eyes and ears, for instance. The work may even have implications for medicine — helping to explain, say, how the brain can compensate for a missing sense — and for education.

It might seem unsettling that the perceptual tools we rely on to navigate the world are so fluid — not just capable of being fooled, but capable of fooling one another. But the constant interaction and interference between our senses, in fact, is central to one of the brain’s most astonishing feats: its ability to take a sea of complex, conflicting sensory input and assemble it into a fairly reliable picture of the world.

Philosophers have long debated the primacy of the senses in knowing truth, but they have rarely questioned their separateness. The Epicurean poet and philosopher Lucretius, for example, argued that the senses couldn’t influence one another, “for each has powers discrete and apart, its separate force.” Because of these separate powers, he reasoned, “it must be, then, that one sense cannot prove another wrong.”

Yet we’ve always understood intuitively that senses do affect one another in certain ways. As anyone who’s ever eaten dinner while nursing a bad cold knows, nearly all of food’s flavor comes from our sense of smell, not taste. Since the dawn of the talkies, moviegoers have experienced this kind of sensory interaction, too. Their ears might hear sounds from a speaker behind them, but their eyes persuade them that the voices are coming from actors projected on the screen.

Now, science is showing that such connections among the senses are more widespread and deeply rooted than we ever imagined. What happens in the movie theater isn’t just an isolated illusion — the blending of sensory information is critical for the brain to create a seamless interpretation of its outside world.

Research into perception is reorganizing around this insight. Over the past decade, previously disparate studies of the senses have begun to merge. There is now a yearly conference devoted to multisensory research, and the topic is finding its way into neuroscience meetings. Some scientists focus specifically on sensory integration, while others have expanded research that once centered on a single sense to take in more. Shams, at UCLA, says that while some people initially doubted whether isolated illusions had any bearing on the everyday function of the senses, most now accept that the senses are intertwined in countless ways.


One researcher who has spearheaded this change is psychologist Charles Spence, head of the Crossmodal Research Laboratory at Oxford University. While neuroscientists have been piecing together how senses connect in the brain, his work has revealed how the crossing of sensory information affects perception and behavior. His recent work on the psychology of flavor perception, for instance, has shown that the flavor of your food is influenced by touch, vision, and even sound. A study from his lab a few years ago showed that people rate potato chips as crisper and better-tasting when a louder crunch is played back over headphones as they eat. A study published this year showed that people thought a strawberry mousse tasted sweeter, more intense, and better when they ate it off a white plate rather than a black plate. Other researchers have conducted similar studies showing that our impressions of experiences, and our emotional responses to them, derive from a blending of different kinds of sensory input — a process that is usually completely unconscious.

These findings are leading to a fuller picture of how we really perceive the world around us. Barry Stein, a multisensory scientist at Wake Forest University, says that what’s been surprising is how early in the process of perception the senses begin to overlap. Even before the brain makes higher-level judgments about the sensory information it is receiving, Stein says, special “multisensory neurons” that respond to more than one sense begin to synthesize it.

This process allows the brain to quickly blend different channels of information into one impression. In some cases, senses enhance one another: A distant image paired with a weak sound can be more noticeable than either would be alone. In other cases they compete and one wins out (as your eyes win over your ears in the movies). In still others, the information merges into something new: When people watch a video of a person saying “ga” while the audio is dubbed with a voice saying “ba,” they hear an intermediate “da.” Though the senses can fool us in certain cases, being able to integrate them helps us make a quick judgment and move on, rather than puzzling over conflicting information.

The ability to coordinate the different senses seems to be something the brain learns; we’re not born with it. “You’d think that the brain comes with all this hardware built into it,” says Stein. “But that’s not the case.” Instead, research shows that after we’re born, the brain quickly learns to put information from the senses together. This early wiring of the brain to coordinate sensory input helps explain why people born without a sense who then regain it — such as deaf people who receive cochlear implants later in life — have a difficult time learning to integrate the new sensory information.

This research sheds light on other fascinating phenomena that neuroscientists have observed in those with impaired sensory functions, too — and it may ultimately suggest possible therapies. In blind people, for example, research has shown that the sense of touch activates the visual cortex; in other words, areas of the brain normally designated for processing one sense can adjust to make use of information from another. Then there are people, like those with autism or other conditions, whose ability to integrate sensory information is impaired. Therapists influenced by the science of multisensory integration have worked with people with autism to create “sensory diets,” interventions that focus on using senses together.

And the new work may ultimately affect how the rest of us learn, as well. Shams’s group at UCLA has found that people learn a visual task better when it’s accompanied by sound, for instance — even when they are later tested using only vision.

In broader commercial applications, meanwhile, the science is already providing a new basis for what marketers have long surmised: They are selling customers more than just the core sensory experience. Restaurant owners, for instance, know that choosing decor, lighting, music, and table settings that complement their food can boost their bottom line, and companies have long market-tested food products for texture and packaging as well as taste. But we are now beginning to understand that these elements don’t just create atmosphere and associations — they can actually make food taste different. For example, several studies have found that adding red coloring can make drinks taste sweeter, allowing a company to reduce sugar content while turning the color up a notch.

Scott King, part of a UK company called Condiment Junkie that creates sounds to enhance products and events, says that recruiting multiple senses works best when “one sense is choreographed with another in a way that has an effect greater than the sum of its parts.” The company has worked with the Fat Duck restaurant in Bray, England, run by celebrity chef Heston Blumenthal, to develop soundtracks that bring out specific flavors in the food, based on the finding that certain sounds (high tones, tinkling pianos) make people perceive a bittersweet toffee as sweeter, while low-pitched tones and trombones make it taste more bitter.

Beyond the practical consequences of this new model of how we perceive the world, however, lie the philosophical implications. What does it mean that Lucretius was wrong — that our perceptions of the world are not just a product of five pure separate senses, but of a dynamic interaction between them?

Barry Smith, a philosopher at the University of London, says that philosophers have long puzzled over the relationship between the senses and the truth: Descartes, for instance, felt that we could never trust our senses as representing an outer reality. But Descartes felt we could at least rely on our own minds. By showing how much our minds are the sites of intersecting, conflicting sensory input, Smith says, neuroscience shakes up this trust. “Descartes seems to have not been going far enough,” Smith says. If senses can change one another, “we’re not so reliable about even our own experiences.”

On the other hand, seeing the senses as interdependent can be a boon to more than just marketers, educators, or those trying to overcome disabilities: In everyday life, the reminder to consider all our senses may change our experience. Smith says he hopes the research will encourage people to value senses they often overlook, like smell, and to look for ways to make our senses work better together to enhance our experiences, whether we’re cooking a holiday meal, decorating our houses, or creating art. Though it might seem strange or even superfluous to think about the color of the plates we will eat from, it stands to alter our experience. After all, no sight or sound exists in a vacuum; at the deepest neurological level, when we sit down to that meal, all our senses will be working together.


Courtney Humphries is a freelance writer in Boston and the author of “Superdove: How the Pigeon Took Manhattan...And the World.”