In April, a Starbucks manager in Philadelphia called the police on two black men sitting peacefully inside a coffee shop. The culprit, according to Starbucks’ CEO and the city’s mayor, was implicit bias—subconscious thought that can influence behavior. Headlines like CNN’s “What the Starbucks incident tells us about implicit bias” soon followed. In response, Starbucks closed its doors for half a day in May to put 175,000 employees through a program to educate them about racial bias.
Implicit bias is a trendy explanation for everyday discrimination. In the wake of high-profile bias incidents, companies and organizations often prescribe implicit bias training without knowing what actually caused the incident in question. And that, say some experts in the field, is a big problem. Implicit bias is subconscious thought: although it might lead to discrimination, there’s no way to know without testing someone in a lab.
“Our whole discipline has no business explaining individual instances of behavior,” says Liz Redford, a consultant with the nonprofit Project Implicit, which does implicit bias research, education, and consulting.
Here’s the problem with the way the phrase gets bandied about: When a landlord uses Facebook’s ad targeting system to exclude Latinos and blacks from seeing apartment listings, or a lawyer harasses workers in a New York deli, that’s discrimination, which is behavior.
Jimmy Calanchini, a social psychologist at the University of California, Riverside, says, “Every time another headline blames something on implicit bias, the other cognitive scientists and behavioral scientists in my social media circle do a collective eye roll.”
Cognitive scientists study trends and examine big-picture data to determine what we can learn about implicit bias at a societal level; a handful of incidents holds little research value. And even if the experts could agree that implicit bias is indeed the sole cause of headline-grabbing discrimination, they don’t necessarily agree on what to do about it.
Take the Starbucks bias training, for example. Starbucks shut its doors to customers so employees could watch videos about profiling, discuss their reactions, and practice different scenarios. Despite the good intentions behind the training, research suggests this approach doesn’t work. A 2016 study examined various attempts to reduce bias and found that, although training reduced bias for a short period, the positive effects disappeared within days, and in some cases within hours. Implicit bias is simply too entrenched for this kind of training to be effective.
With Eric Hehman and Jessica Flake, Calanchini did a study that measured implicit bias in police officers by comparing regional results from Project Implicit’s black-white Implicit Association Test—which measures how quickly a person associates white with good and black with bad, and vice versa—with police killings in the U.S. in 2015, as mapped by the Guardian. The findings were chilling, if unsurprising: in more racially biased communities, police kill more black people.
But Calanchini is wary of the way such findings are applied. “I have more confidence in programs that focus on changing behavior without telling everybody they should feel bad for being secret racists,” he says.
He cites an effective measure the Las Vegas Metropolitan Police Department took to combat implicit bias. In 2011, the department worked with the Center for Policing Equity to develop a new policy: whenever possible, an officer in a foot pursuit would no longer put hands on the person fleeing. It was up to the officer’s backup, usually right behind, to make the arrest. Chasing a suspect floods the pursuing officer with adrenaline and stress, both of which make it easier for implicit bias to impair decision-making. The Las Vegas policy leaves the arrest to cooler-headed officers, whose biases are less likely to result in violence. The Center for Policing Equity reported that use-of-force incidents fell 23 percent in the year after the policy took effect.
Strict policing rules like this have been shown to significantly reduce police violence. Campaign Zero, the police reform campaign launched by Black Lives Matter, emphasizes policies that decrease police violence, among them requiring comprehensive reporting and requiring officers to exhaust all non-lethal means of subduing a suspect before shooting. Research by the Campaign-Zero-affiliated Police Use of Force Project shows that with each additional use-of-force restriction, a police department reduced its killings by 15 percent.
But Redford, who delivers workshops with Project Implicit, sees value in anti-bias thought-training—as long as it goes beyond just thinking happy thoughts about minorities and women. “If you just want people to feel satisfied with your training or feel good about themselves, then sitting around thinking those thoughts might be effective,” she says. A more thorough program to address bias would also include enacting effective policies in hiring practices.
Redford recounts the example of the Boston Symphony, which wanted to hire more women and began offering blind auditions in 1952, on the theory that a screen would help judges overcome their biases against female musicians. But it didn’t seem to work: the symphony still hired almost exclusively men. Finally, someone spotted the problem: the tapping of heels on the stage gave the female candidates away. A new policy asked candidates to remove their shoes before auditioning, and the gender imbalance began to correct itself.
Calanchini isn’t totally opposed to egalitarian thought training; it’s never a bad idea to improve your worldview and fight stereotypes. But he believes those techniques are no substitute for clearly defined policies that change behavior. Take Starbucks again: the company now allows anyone to sit in a store or use the bathroom without buying anything, a change likely to have a more lasting effect than a few hours of implicit bias training.
For employers, city officials and law enforcement, it’s helpful to anticipate situations where implicit bias might lead to discrimination—and be proactive about addressing it directly. Either check everyone’s receipts in a store, or check none; put up signs that say “Spanish-speakers welcome here.” Even simply requiring officers to warn a suspect before shooting reduced police killings by 5 percent, the Police Use of Force Project found.
“You can think of implicit bias as your gut reaction,” Calanchini says. “But if you slow down and engage in more deliberative thinking, implicit bias doesn’t drive judgment and behavior as much.”