Around 12:40 a.m. on February 5, 1999, four plainclothes New York City police officers saw a black man standing near a building in New York’s Bronx borough. He seemed to match the description of a serial rapist on the loose, according to later reports, so they pulled over to investigate. When the man reached into his pocket and took out an object that was hard to see in the darkness, the officers fired 41 shots, hitting the man 19 times and killing him. That decision to shoot would later be dissected in press accounts and court cases.
The victim, 23-year-old Amadou Diallo, was unarmed. More recently, other unarmed black men—including Michael Brown in Ferguson, Missouri, in August 2014—have been shot and killed by police, prompting a rancorous public debate. The police officer who shot and killed Walter L. Scott, a 50-year-old father of four, in South Carolina in April 2015—an incident caught on video—appeared to make a blatantly bad choice about whether and how to use his gun. He was charged with murder five days after Scott’s death.
But in many other cases, what happened is murkier. Did police officers shoot people because they were black? Many members of minority communities say racism is to blame, while police groups usually deny it. The question is emotionally charged, and simply raising it can be interpreted as an accusation. It’s also difficult to answer scientifically. Every shooting has its own circumstances, and many factors, from the time of day to a person’s behavior before a shooting, can interact to influence an officer’s decision to pull the trigger.
But a decade’s worth of research from experimental psychology, some of it inspired by Diallo’s death, may help. Joshua Correll at the University of Colorado, Boulder, and Bernd Wittenbrink, professor of behavioral science at Chicago Booth, lead research teams analyzing the role of stereotypes in shootings. The findings overwhelmingly suggest that race matters—even when all other factors are aligned to yield an accurate decision on the use of deadly force.
Overt racial discrimination has fallen in the United States in recent decades, but it still exists, tied to popular stereotypes. Patricia G. Devine at the University of Wisconsin demonstrated in a 1989 lab experiment that people—both those who exhibit high levels of prejudice and those who don’t—associate black people with aggressive behavior and criminality. When researchers led by Iowa State’s Stephanie Madon in 2001 asked psychology students to write down attributes characterizing African Americans, “aggressive” made the top 10.
Could such associations influence an officer’s decision about whether to fire a gun? Correll, Wittenbrink, and colleagues from the University of Colorado, Boulder, recruited people to participate in the “shooter task,” an experiment mimicking high-stakes, split-second decision-making. The researchers asked participants to look at a series of photos of outdoor scenes in places such as city streets and suburban malls. At some point, a man suddenly appeared in the scene, holding an object. In some cases he held a gun; in others, an innocuous object such as a phone or wallet. If participants perceived the man to be armed, they pressed a button corresponding to a “shoot” decision. If they thought he was unarmed, they pressed a “don’t shoot” button. Participants had to decide quickly—within a few hundred milliseconds—which button to press.
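For concreteness, here is a minimal sketch of the trial logic in code. It is an illustrative reconstruction, not the studies’ actual software; the deadline value and the outcome labels are hypothetical stand-ins.

```python
# Illustrative sketch of the shooter-task trial logic described above.
# The deadline is a hypothetical stand-in, not the studies' actual setting.
RESPONSE_DEADLINE_MS = 630

def score_trial(holds_gun, decision, rt_ms):
    """Score one trial of the shooter task.

    holds_gun : True if the man in the photo holds a gun,
                False if he holds an innocuous object (phone, wallet).
    decision  : "shoot" or "dont_shoot", the participant's button press.
    rt_ms     : reaction time in milliseconds.
    """
    if rt_ms > RESPONSE_DEADLINE_MS:
        return "too_slow"  # response came after the window closed
    if decision == "shoot":
        return "hit" if holds_gun else "false_alarm"  # shooting an unarmed man
    return "miss" if holds_gun else "correct_rejection"

# Example: a participant shoots an unarmed man within the window.
print(score_trial(holds_gun=False, decision="shoot", rt_ms=412))  # false_alarm
```

The two error types matter for what follows: shooting an unarmed man is a false alarm, and failing to shoot an armed one is a miss.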
Clear patterns emerged. “Though race is irrelevant to this task, participants are faster and more likely to shoot black targets,” the researchers write in a paper published in 2007. “They are also faster and more likely to indicate don’t shoot for whites.” Typically, participants responded quickly and accurately when the people they saw on the screen conformed to cultural stereotypes, such as that of the “aggressive” African American established by earlier research. When the presented scene violated those stereotypes—a black man holding a wallet, for example—study participants responded more slowly and less accurately.
And when these researchers presented participants with news reports that reinforced stereotypes linking black people with violence, participants were more likely to shoot a black person in error. “Reading newspaper articles about violent criminals yields significantly greater bias when those criminals are described as Black rather than White,” they write. After reading an article that described a string of armed robberies, and included a description and sketch of black male suspects, “participants set a more lenient criterion for the decision to shoot Black (rather than White) targets.” Participants who read similar news reports about white suspects showed no evidence of bias.
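That “more lenient criterion” is a concept from signal detection theory, which separates how well a person can tell guns from harmless objects (sensitivity, d′) from how much evidence the person demands before responding “shoot” (criterion, c). A worked sketch, with hit and false-alarm rates invented purely for illustration:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit (inverse normal) transform

def sdt(hit_rate, false_alarm_rate):
    """Return sensitivity d' and criterion c from hit and false-alarm rates."""
    d_prime = z(hit_rate) - z(false_alarm_rate)
    c = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, c

# Hypothetical rates, not taken from the studies:
print(sdt(hit_rate=0.90, false_alarm_rate=0.10))  # d' ~ 2.56, c ~ 0.00 (neutral)
print(sdt(hit_rate=0.93, false_alarm_rate=0.16))  # d' ~ 2.47, c ~ -0.24 (lenient)
```

A negative c signals a bias toward “shoot”: more hits, but also more false alarms against unarmed targets. Note that d′ barely moves in this example; the shift is in willingness to shoot, not in the ability to tell a gun from a phone.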
The lab studies were designed to recreate the circumstances and decision-making process of an officer in a high-stakes situation such as the Diallo shooting. The researchers rewarded participants for accurate decisions and imposed a time limit that prevented participants from pondering their decisions. Doing this allowed the scientists to isolate and focus on less deliberate factors, such as subconscious bias and stereotypes. In split-second decisions, the bias created by cultural associations isn’t deliberate but may occur as a result of how our brain rapidly processes information. “Stereotypes often operate implicitly, in the sense that we don’t necessarily deliberately use stereotypes, but they nevertheless shape our judgments and perceptions,” says Wittenbrink. “African-Americans are at risk of appearing more dangerous because of ingrained cultural stereotypes.” (For more on implicit bias, see “Think You’re Not Racist?” Summer 2014.) A police officer facing a decision to shoot a potential suspect is subject to exactly the same types of influences.
The findings on how exactly stereotypes can cause a fatal mistake are evolving. Some research indicates that, when making split-second decisions, people may know the correct thing to do, but bias can interfere. In a 2005 study, for example, B. Keith Payne of the University of North Carolina at Chapel Hill and Yujiro Shimizu and Larry L. Jacoby of Washington University in St. Louis presented participants with what psychologists call a “weapon identification task”: shown pictures of guns or innocuous tools, participants had a split second to decide what each object was. In some cases, the researchers primed participants by first showing them pictures of black faces. When primed, people more frequently—and wrongly—identified tools as guns.
Payne argues that the errors people make may be caused by time pressure: the brain is processing a lot of potentially competing information, such as perceptual input about the object and stereotypic associations tied to the race of the person. Under stress or time pressure, it becomes difficult to sort out which information is relevant and correct, leading the person to make the wrong decision. Without the time pressure, people make fewer mistakes. Payne, Shimizu, and Jacoby’s study indicates that when people are asked to perform the weapon identification task a second time, with no time limit, the race bias they had displayed disappears.
But new research by Correll, Wittenbrink, and their colleagues challenges that idea. It’s not simply that people know how to make correct decisions but are thrown off by stereotypes when executing them, the researchers say. Rather, stereotypes also shape the decision input, changing how people perceive the objects themselves. “Stereotypes really do affect how we see the object,” Wittenbrink says. “At first glance, a phone may look like a gun in the hand of a black man.”
This is something we know from decades of basic research on visual perception, he says: our expectations shape what we see. It’s how many popular visual illusions work—and why, for example, we see a man in the moon. Our brains integrate patterns of visual input with other information that we are familiar with, and stereotypes operate in the same way. They can provide information for how our brains interpret visual input.
To demonstrate this, the researchers—Correll, Wittenbrink, and colleagues from Victoria University of Wellington and San Diego State University—again turned to the shooter task, revisiting some of the data they had published in earlier papers. To probe their hypothesis, they analyzed participants’ performance so as to isolate specific components of the decision-making process, focusing on those relevant to perception, such as the rate at which information about the critical object is acquired.
If stereotypes shape how a participant perceives an object, stereotypes ought to affect the rate of information acquisition. When a stereotype matches the actual visual input, the researchers hypothesized, participants should be able to acquire the stimulus information faster. When the stereotype conflicts with the actual input, the rate of information acquisition should be lower.
This is indeed what the researchers find. Expectations, or stereotypes, play tricks on the brain. A phone in the hand of a black person more readily looks like a gun.
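The “rate of information acquisition” maps naturally onto what sequential-sampling models of speeded choice call the drift rate. The following simulation is a minimal sketch under that interpretation, with every parameter value invented for illustration; it shows why a lower drift rate produces responses that are both slower and less accurate, the pattern seen for stereotype-incongruent scenes:

```python
import random

def diffusion_trial(drift, threshold=1.0, noise=1.0, dt=0.001):
    """Accumulate noisy evidence until it crosses +threshold ("gun")
    or -threshold ("harmless object"). Returns (choice, seconds)."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold:
        evidence += drift * dt + noise * random.gauss(0.0, dt ** 0.5)
        t += dt
    return ("gun" if evidence > 0 else "harmless"), t

def summarize(drift, n=2000):
    trials = [diffusion_trial(drift) for _ in range(n)]
    accuracy = sum(choice == "gun" for choice, _ in trials) / n
    mean_time = sum(t for _, t in trials) / n
    return round(accuracy, 3), round(mean_time, 3)

random.seed(1)
# The target actually holds a gun (positive drift). A stereotype that
# matches the input speeds evidence acquisition; a mismatch slows it.
print(summarize(drift=2.0))  # congruent: ~98% correct, ~0.48 s
print(summarize(drift=0.8))  # incongruent: ~83% correct, ~0.83 s
```

With a lower drift, the noisy evidence wanders longer before hitting a boundary and crosses the wrong one more often, which is exactly the slower, more error-prone profile described above.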
In another study, the researchers gave participants a second chance to perform the shooter task. They again had only a few hundred milliseconds to decide whether to shoot the man on their screen. But then he disappeared, and participants were asked again for their decision, with no time restriction imposed.
This time, even when given unlimited time to make their decision, participants still exhibited undiminished racial bias. Participants who had seen a photo of a black man holding a phone genuinely believed they had seen a gun in his hand.
“Participants who mistakenly ‘see’ the cell phone as a gun when it is held by a Black target do not err by incorrectly executing their intended response. Rather, they err by correctly executing an intention that is, itself, incorrect,” the researchers write.
Data collected with an eye-gaze-tracking device further illustrate this point. Race affected how participants in the shooter task searched for the critical object, the gun or phone. When the scene involved an armed black man, participants’ gaze didn’t get as close to the object as it did when the scene showed an armed white man. Thus, participants made their decision with relatively less detailed visual information about the object itself—presumably because the race of the person holding the object was used to fill in the missing information about what he was holding.
Stereotypes can cause fatal errors, but there’s evidence that training can reduce these biases, or improve a potential shooter’s acuity. Police-academy students undergo extensive training in these kinds of decisions, practicing against suddenly appearing silhouettes, in video games, and in interactions with live actors. This training seems to help in some important ways.
Back in 2007, Correll, Wittenbrink, and several colleagues looked at how police officers differed from laypeople in how they decided whether to shoot black or white people armed with guns or innocuous objects. Police officers from Denver, Colorado, and 14 states participated.
Like community members, police officers exhibited racial bias in the shooter task. But officers did so to a lesser extent, and overall, their decisions were more accurate than those of laypeople.
The recent perception studies offer a possible explanation for these differences. Experience with similar situations and training improve officers’ ability to extract visual information effectively and to perceive objects more accurately.
The explanation also points to possible ways of further improving officers’ decision-making. “In particular, training exercises that focus on the officers’ ability to attend to and process critical information in the situation should be useful in limiting race bias from stereotypes,” says Wittenbrink. Stereotypes persist, but adequate police training might help limit their adverse influences.