The science of regrettable decisions

A doctor explains how our brains can trick us into making bad choices — and how to fight back.

As Full House actress Lori Loughlin and her husband await their next court date, they stand accused of paying a $500,000 bribe to get their daughters into the University of Southern California as crew team recruits. Their defense is said to rest on the belief that they were making a perfectly legal donation to the university and its athletic teams (their children never rowed a competitive race in their lives).

Legal strategies and moral considerations aside, this strange behavior has left many observers wondering, “What were they thinking?” Surely, Loughlin and her family must have considered that someone at the university would audit the admissions records or realize the coach’s high-profile recruits had never rowed a boat.

We may never know exactly what Loughlin and her family were thinking. But as a physician who has studied how perception alters behavior, I believe that to understand what compelled them to do something so foolish, a more relevant question would be, “What were they perceiving?”

Understanding the science of regrettable decisions

Several years ago, I joined forces with my colleague George York, a respected neurologist affiliated with the University of California, Davis, to understand why smart people make foolish choices in politics, sports, relationships, and everyday life. Together, we combed through the latest brain-scanning studies and decades of psychological literature.

We compared the scientific findings with an endless array of news stories and firsthand accounts of real people doing remarkably irrational things: We examined the court testimony of a cop who, despite graduating in the top five of his academy class, mistook his gun for a Taser and killed an innocent man. We dug through the career wreckage of a once-rising politician who, despite knowing the risks, used his work phone to send sexually explicit messages. And we found dozens of studies confirming that doctors, the people we trust to keep us safe from disease, fail to wash their hands one out of every three times they enter a hospital room, a mistake that kills thousands of patients each year.

When we read about famous people ruining their lives or hear about normal people becoming famous for public follies, we shake our heads in wonder. We tell ourselves we’d never do anything like that.

But science tells us that we would, far more often than we’d like to believe.

What alters our perceptions

In the scientific literature, George and I noticed an interesting pattern: Under the right circumstances, a subconscious neurobiological sequence in our brains causes us to perceive the world around us in ways that contradict objective reality, distorting what we see and hear. This powerful shift in perception is unrelated to our intelligence, morals, or past behaviors. In fact, we don’t even know it’s happening, nor can we control it.

George and I named this phenomenon “brainshift” and found that it happens in two distinct situations: those involving high anxiety and those associated with major reward.

Under these conditions, contrary to what we tell ourselves, any of us could do something just as regrettable as the people in the headline-grabbing stories above. Phrased differently, we don’t consciously decide to act foolishly. Rather, once our perception is distorted, we act in ways that seem reasonable to us but foolish to observers.

How our fears and desires fool us

This neurobiological process is best observed in a study by the neuroeconomist Gregory Berns, published in 2005 in the journal Biological Psychiatry. Berns recruited volunteers for what he advertised as a vision experiment. Five participants at a time were asked to look at computerized 3D shapes and decide whether the figures would match or clash when rotated. The trick was this: Four of the five test subjects were part of the research team, intentionally giving wrong answers to specific questions, answers the one true subject in the room could see. Would those answers influence that person’s selections?

Berns found that 30 percent of the subjects answered correctly every time, despite the contradictory responses given by others. MRI scans revealed that this act of nonconformity caused the participants great discomfort. It also activated an almond-shaped structure in the temporal lobes of the brain called the amygdala, which is associated with negative emotions such as fear and apprehension.

By contrast, those participants whose answers aligned with the others activated a different part of the brain called the parietal lobes. This area, near the back of the head, is responsible for our perceptions: what we see, hear, taste, and feel. Knowing the answers from the others caused their brains to subconsciously alter what they saw. Based on this changed perception, they then concurred with the others, avoiding the amygdala stimulation and associated pain they otherwise would have experienced.

The data showed that when subjects were presented with the erroneous answers, they gave the wrong response 41 percent of the time, compared with only 13 percent of the time when deciding by themselves. In almost all cases, they felt their answers were correct. Only 3.4 percent of the subjects said they had known the right answer but went along with the majority anyway.

If peer pressure and conscious choice were the culprits, the participants would have been aware of what was happening. But the study suggests the cause was a subconscious shift in perception, one that operated without the participants ever noticing.

The case of the good seminarian

In 1973, the research duo of John Darley and Daniel Batson asked Princeton Theological Seminary students to visit a group of children across campus to deliver a sermon on the parable of the Good Samaritan.

The researchers told some of the future pastors, “It’ll be a few minutes before they’re ready for you, but you might as well head on over.” They told others, “You’re late. They were expecting you a few minutes ago. You’d better get moving.”

While proceeding across campus, each subject passed a man slumped in a doorway, moaning and coughing.

Imagine yourself in this situation: A classroom of children awaits you but, along the way, you encounter a man who’s clearly in distress. Is there any doubt about what you would do? Or about what religiously attuned students would do? No matter the circumstances, we’d expect everyone to help. Yet only 10 percent of the “hurried” students stopped to offer assistance.

The best explanation for this behavior is that, amid the anxiety of running late, most of the students experienced a perceptual shift that caused them not to see the man or recognize his distress. Otherwise, logically, all would have stopped to help.

So far, these examples have demonstrated how people behave in the context of controlled research studies. But George and I observed the same subconscious distortion of reality play out in dozens of real-life examples throughout history.

Observing the “brainshift” process in real life

One of the more notorious examples is the case of the Norden Bombsight, a story masterfully told in Malcolm Gladwell’s famous 2011 TED talk.

It was the early days of World War II, and with Nazi Germany on the march, the Allies needed to conduct massive airstrikes to achieve victory. But US generals and senior military officials faced a fear-inducing dilemma: How could they take out military targets without inadvertently killing civilians in nearby buildings? Carl Norden, a Swiss engineer, promised a solution. He claimed the Norden Bombsight could drop a bomb into a pickle barrel from 20,000 feet.

Convinced it would save civilian lives, American leaders bought 90,000 units in 1940, paying the modern-day equivalent of $30 billion. There was just one problem: Norden’s devices didn’t work. American flyers estimated that as many as 90 percent of bombs missed their targets.

Of course, MRI machines didn’t exist in the 1940s, but we can predict what they would have found. The immense value of a precision bombing tool would have stimulated the generals’ reward centers, activated their parietal lobes, and led them to perceive the technology as effective despite overwhelming evidence to the contrary.

Perhaps the generals would have made different decisions if they were on the battlefield themselves. This next study examines what people do when they’re directly in harm’s way.

When reward opportunities put us in life-threatening situations

To demonstrate the mind-altering effects of a dangerous situation, we turn to a 2010 episode of NBC’s Dateline called “What Were You Thinking?”

Host Chris Hansen sets the scene: “We rented this room on the fourth floor of an old building and hired these temp workers who were told they’d be doing clerical work for the day.”

The workers don’t know it, but everyone else in the room is a Dateline staffer who knows what’s about to happen. As smoke begins to fill the room, the staffers pretend nothing’s wrong. The smoke is harmless, of course, but the temp workers don’t know that. By all appearances, the building is on fire, and yet 90 percent of the workers remain seated, even after the room has completely filled with smoke. When asked why they ignored the threat, the subjects reported that they didn’t see the situation as dangerous.

We can’t chalk this illogical behavior up to “groupthink,” “peer pressure,” or any explanation other than altered perception. When our safety is in jeopardy, we don’t decide to die with others just to fit in. Parents like to ask children whether they’d jump off a bridge if their friends did. They know the answer is no.

Based on the available neurobiological data, the most logical conclusion is that these temp workers, seeking the reward of a full-time position, experienced a subconscious shift in perception that led them to behave in ways they probably regretted once the show aired. The same phenomenon was illustrated decades earlier in Stanley Milgram’s electric shock study, the kind of horrific experiment today’s scientific community would no longer permit.

Why we stick with bad decisions after we make them

The Dateline experiment showed us that situations involving fear and reward can lead to poor “snap judgments.” But what would cause someone to stand by a foolish decision?

The science of behavioral economics tells us that after we’ve made a decision, even an illogical one, we tend to cling to it. That is, we filter out dissenting information while seeking data that confirms our original viewpoints. Psychologists call this “anchoring.”

The combination of distorted perception and anchoring explains why a bevy of venture capitalists, high-ranking generals, and business tycoons all lined up to invest in Theranos, the now-disgraced blood-testing startup founded by Elizabeth Holmes.

It’s unclear whether Holmes studied or even knew about the neurobiological processes that distort our perceptions, but she used them to perfection. In her sales presentations, she played to a fear nearly all humans share: She spoke of large-bore needles drawing vial after vial of blood and promised that her technology could make the process painless. Simultaneously, her pitch triggered the reward center of the brain as she explained how just a few drops of blood could lead to the earlier detection of cancer and, in her words, create “a world in which no one ever has to say goodbye too soon.”

Just how powerful were these fear and reward triggers? By May 2015, investors had given Holmes $900 million without ever demanding to see an audited financial statement or published proof that her technology worked. Anchoring bias, brainshift’s partner in crime, explains why so many of Holmes’s board members and investors stood by her even after investigative reports began exposing the company as fraudulent.

Can we protect ourselves from this?

Based on our research, the first big step toward avoiding the dangerous consequences of brainshift is to be aware that we are all vulnerable, regardless of our ethics, social status, or IQ.

Next, we must be cognizant of situations that stoke our fears and desires: Those involving money, sex, and fame or recognition are good places to start. Before making decisions, we should ask a trusted friend, or even an outsider, for an opinion.

When situations allow, consult an independent expert. If an investment opportunity seems too good to be true, try talking yourself out of it. If your counterargument seems rational, listen.

Finally, and particularly in the context of reward, write down the answers to these two questions:

  1. What’s the worst thing that could happen?
  2. How would I feel if that outcome occurred?

Had Lori Loughlin and her husband asked these questions — with the reward of a USC acceptance letter on the line — they might not be facing potential jail time.

Dr. Robert Pearl is the former CEO of the Permanente Medical Group, Kaiser Permanente. He is currently a professor in the Stanford Graduate School of Business and the Stanford School of Medicine.
