Sybil Dunlop, BridgeTower Media Newswires
There is a new trend in the legal field. As we have learned more about the real dangers of implicit bias (and how such biases affect judicial outcomes), we have started warning juries about the risk of implicit bias before jurors begin deliberations. The Iowa Supreme Court recently held that it would not be an abuse of discretion to use the American Bar Association's proposed instruction on bias. And the Western District of Washington is showing a video on unconscious bias to potential jurors. But while I want nothing more than to keep implicit bias from influencing jury decision making, I have a real concern that implicit-bias jury instructions, and even the well-meaning video, may actually produce worse outcomes (and by worse outcomes, I mean outcomes influenced by racial or gender bias).
As a refresher, let me start with the definition of implicit bias. Implicit bias is not explicit racism or sexism, which of course exists. Implicit bias, instead, is an unconscious association, belief, or attitude toward a social group. As a result of implicit biases, we (as a society) associate weapons with black faces faster than with white faces. We find it easier to pair women with family words than with career words. Due to our implicit biases, we can experience surprise when our Delta pilot is a young African American woman, because we weren't expecting that. Our expectations are, of course, informed by the messages we have received our whole lives about what a Delta pilot looks like. It looks like Captain Sully.
When we make decisions too quickly, we can rely on implicit biases rather than good, solid information. So, for example, when we decide we want to hire someone based on "a gut feeling" that they will be a good employee, we might not realize that we feel that way simply because they already look like everyone else we work with. Nobel laureate and psychologist Daniel Kahneman is a leading researcher in this field. In his book, Thinking, Fast and Slow, he explains that we make decisions in one of two ways. When we use "System 1," we go with our gut and make decisions quickly. But our "gut" is generally just a reliance on implicit biases. When we go with our gut, we are more likely to hire the pilot who looks like Tom Hanks than the one who looks like Halle Berry, because we aren't digging into the pilots' resumes.
When we use "System 2," we engage with a decision and aim to think through problems more carefully. We would examine Halle Berry's and Tom Hanks' resumes to see who has more experience. The problem, however, is that it takes real energy and concentration to engage our System 2 thinking. So we get tired and, toward the end of most days, end up relying on System 1.
There is some research out there suggesting that when people are primed to think of themselves as "objective," their implicit biases get worse. Specifically, researchers Uhlmann and Cohen (2007) demonstrated that being primed with a sense of one's own objectivity leads to greater reliance on gender bias in hiring scenarios. To reach this conclusion, they conducted a series of experiments, first dividing their subjects into two groups. Subjects in Group 1 were primed to think of themselves as "objective" by answering a questionnaire that asked them to rank how strongly they agreed with statements such as, "When forming an opinion, I try to objectively consider all facts I have access to," "My judgments are based on a logical analysis of the facts," and "My decision making is rational and objective." Subjects in Group 2 were not primed at all.
All the participants were then asked to evaluate a job candidate for a factory manager position. Each participant received a description of the candidate; the resumes were identical except for the candidate's name, which was either "Lisa" or "Gary." The candidate was described as technically proficient and organized, but interpersonally unskilled. Participants were then asked to rate the strength of the applicant with respect to a series of traits. Group 1 participants (who were primed to think of themselves as objective) rated the female candidate less favorably than the male candidate. Group 2, the unprimed group, gave approximately equal ratings to the male and female candidates. In other words, when people were primed to think of themselves as objective, there was a sizable gender-discrimination effect.
Why did this happen? I suspect it's because when you think of yourself as objective, you give yourself permission to engage in System 1 thinking. You can "go with your gut," because you are naturally an objective person. You don't need to think hard about things! When you aren't primed to think of yourself as objective, you are more likely to engage in System 2 thinking and thoughtfully explore the information provided.
What does this have to do with the implicit-bias jury instructions and the video? Based on the research regarding "Gary" and "Lisa," I worry that the instructions and the video might have the effect of priming people to think of themselves as objective. Indeed, social scientists have found (https://hbr.org/2016/07/why-diversity-programs-fail) that people often rebel against rules to demonstrate autonomy. "Try to coerce me to do X, Y or Z, and I'll do the opposite just to prove that I'm my own person." So if we warn people that their implicit biases could impact their decision making, they may resist. They may say to themselves, "I don't need this instruction. I'm objective." In which case, we may actually get worse outcomes.
To be clear, I have no research to back up my hypothesis. But I am craving research on this point (and sincerely hope someone reading this article decides to test this hypothesis in their next mock jury exercise or as part of a dissertation). Because wouldn't it be awful if, in our attempts to address our society's implicit biases, we ended up making things even worse?
Published: Mon, Nov 04, 2019