When researchers at Stanford University analyzed 95 million traffic stop records from 2011 to 2018, they found that African Americans are pulled over more frequently than whites by day—but are much less likely to be stopped at night, when “a veil of darkness” masks their race and makes it harder to racially profile drivers. Despite the lower number of nighttime stops, the study found, African Americans and Latinos are still more likely to have their cars searched than their white counterparts.
These results suggest that at least some police are engaging in racial profiling—that is, they are making assumptions about individuals based on stereotypes and then using those assumptions to inform their actions. These traffic stops can turn deadly, as we’ve seen numerous times during the past few years.
This reality is on the minds of the hundreds of thousands of people who took to the streets to protest against police brutality and for reforms following the death of George Floyd, an African-American man in Minneapolis who was detained by police, one of whom knelt on his neck for over eight minutes. His killing raised the specter of Philando Castile, a black man who was killed by a police officer during a 2016 traffic stop in Minnesota.
The data suggest their deaths are part of a larger pattern. In an oft-cited 2004 study, Jennifer Eberhardt and her colleagues demonstrated that even a brief glimpse of a black face primes civilians and police officers alike to perceive weapons more readily. Several studies of police officers found that exposure to negative stereotypes (mainly ones equating black people with guns) made them more likely to shoot black suspects. That’s probably a factor in why, according to a recent study led by Frank Edwards of Rutgers University’s School of Criminal Justice, African Americans are 2.5 times more likely than white people to be killed by police.
Unfortunately, stereotyping is natural and automatic. To make sense of the world around us, we take mental shortcuts, which psychologists call heuristics. One of them is the availability heuristic: we learn to see the world based on generalizations drawn from the information most available to us. When something is fresh in your mind, it’s more likely to influence your thinking.
For instance, many Americans have been exposed to negative portrayals of Arabs or Muslims in the news media and entertainment, and so they become more likely to stereotype members of these groups as fanatics or terrorists. Something similar has long been true of black men. Because the American public is repeatedly exposed to images of black men as criminals, studies have found, we are more likely to see them that way.
That’s the bad news. There’s good news, too. We can change our stereotypical thinking by simply exposing people to information and images that counter stereotypes. As U.S. cities erupt in rebellions against police violence, it’s a good time to ask what we can do to see beyond our prejudices.
York University social psychologist Kerry Kawakami is one of North America’s leading academics conducting research into stereotyping and how we often create automatic associations between particular traits and groups of people. For years, she has done research into how being exposed to “counter-stereotypic information” can help change our automatic assumptions about certain groups.
“Some of my earlier work looked at trying to teach people non-stereotypical associations,” says Kawakami. “We would show [study participants] an image of a black or white person and we would tell them to choose traits not typically associated with that group in our culture.”
In a 2000 study, Kawakami and her fellow researchers asked participants to respond “NO” when they saw a picture of an African-American person paired with an associated stereotype (such as the word “lazy”). They were also asked to respond “YES” when they saw a picture of an African-American person with a non-stereotypical word (e.g., “hard-working”). This task—simply negating the negative stereotypes and affirming the non-stereotypical traits instead—successfully reduced participants’ automatic activation of prejudice.
Nilanjana Dasgupta and Shaki Asgari studied how to reduce the unconscious stereotypic beliefs that women hold about their own in-group—other women. In one experiment, they found that exposing women to women in counter-stereotypic leadership positions (real-life faculty and deans, as well as historical figures) led them to associate women with the qualities of leadership more than women who were not given the same exposure. As the researchers write, exposure “to women leaders did not simply reduce stereotypic beliefs among study participants, but rather activated more counter-stereotypic beliefs, such as positive beliefs about women as leaders.”
In another study, Dasgupta and Asgari followed participants who attended either a coeducational college or a women’s college. At the outset, the two groups had statistically equivalent scores on a test used to measure “implicit” bias—that is, reflexive and unconscious associations.
After a year in their respective environments, the students at the women’s college showed no implicit gender stereotypes, while the students at the coeducational college showed stronger stereotypical beliefs.
The struggle against police violence
Could these scientific insights and practical experience help reduce police violence toward the black community?
Many studies suggest that counter-stereotype training has at least the potential to save black lives. Some police departments have tried to address unconscious stereotypes with implicit-bias trainings, but it’s not clear that these trainings reduce real-life shootings. However, there is promising research suggesting that we can reduce people’s tendency to see certain races as a threat.
Social psychologist E. Ashby Plant and her colleagues randomly assigned college students to play computer games in which they pretended to be police officers. In the game, participants saw black and white faces paired either with a weapon or with a harmless object, such as a camera. The researchers designed the game so that participants saw an equal number of white and black faces paired with weapons, in an attempt to break the implicit association between weapons and black faces. A control group instead played a game that involved swatting insects on flowers.
The next day, all participants came back to the lab to play the shooting game. Players who had gone through the counter-stereotyping training—which put weapons in the hands of whites and blacks equally—were much less likely to show racial bias in deciding whom to shoot. In short, the game appears to help break the automatic association between black faces and threat.
Of course, the college students in that experiment were not trained police officers. In a series of three studies published in 2012, Jessica J. Sim and her colleagues examined how training may affect racial bias in the decision to pull the trigger, comparing police and civilians. In the first experiment, one group of participants read articles about black criminals. Afterward, they showed a “pronounced racial bias” in the decision to shoot. This wasn’t true of another group that read about white criminals.
However, the researchers gave this task to both civilians and police officers—and discovered that the civilians showed more bias than the police. Their next two studies looked at the content of police training. When training reinforced the association between African Americans and danger—for example, when it involved images of black men holding guns—bias was higher. Officers whose training didn’t make this association didn’t show the same level of bias.
In other words, according to this paper, police training seems to restrain the potential for violence, compared to civilians without training—but training can still be designed in a way that fuels stereotypes, creating an implicit bias that can increase the likelihood of violence against stereotyped populations.
The double-edged sword of stereotyping
Stereotypes might affect police violence in another, unexpected way. In a 2019 study published in Law and Human Behavior, researchers asked almost 800 police officers how stereotyped they felt, with questions such as, “How much do you worry that people may think of you as racist?” They were also asked about the use of force and how important it was to treat members of the community with respect.
The result? Officers who felt stereotyped as racist were much more likely to endorse violence—in part because they didn’t feel legitimate as authority figures, which might make them more sensitive to threats to their authority.
“Police are typically trained to use their moral authority as peace officers to resolve conflicts,” says lead author Rick Trinkner in an announcement about the paper. “But if that moral authority is called into question, they may feel they have limited tools to gain compliance, leading to more harmful actions with potentially disastrous results.”
In other words, stereotyping of anyone seems to increase the possibility of violence. Police officers are also influenced by the culture that surrounds them, and quite a bit of evidence suggests that a more racially biased society produces more biased police. One 2017 study by Eric Hehman and his colleagues examined the implicit and explicit racial biases of over two million Americans, as well as police use of lethal force against African Americans. Unsurprisingly, perhaps, police in places where white residents held stronger implicit racial prejudices and stereotypes were more likely to kill black people.
In the final analysis, suggest many researchers, the issue of stereotyping and use of force seems to boil down to people being able to see each other as complex, individual human beings. That requires effort—but it’s an effort that could save lives.