Understanding Implicit Bias

What Educators Should Know

As a profession, teaching is full of well-intentioned individuals deeply committed to seeing all children succeed. Touching innumerable lives in direct and indirect ways, educators uniquely recognize that our future rests on the shoulders of young people and that investing in their education, health, and overall well-being benefits society as a whole, both now and into the future.

This unwavering desire to ensure the best for children is precisely why educators should become aware of the concept of implicit bias: the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner. Operating outside of our conscious awareness, implicit biases are pervasive, and they can challenge even the most well-intentioned and egalitarian-minded individuals, resulting in actions and outcomes that do not necessarily align with explicit intentions.

In this article, I seek to shed light on the dynamics of implicit bias with an eye toward educators. After introducing the concept and the science undergirding it, I focus on its implications for educators and suggest ways they can mitigate its effects.

The Unconscious Mind

Psychologists estimate that our brains are capable of processing approximately 11 million bits of information every second.1 Given the tremendous amount of information that inundates this startlingly complex organ in any given moment, many researchers have sought to understand the nuances of our remarkable cognitive functioning. In his 2011 tome on cognition, Thinking, Fast and Slow, Daniel Kahneman articulates a widely accepted framework for understanding human cognitive functioning by delineating our mental processing into two parts: System 1 and System 2.2

System 1 handles cognition that occurs outside of conscious awareness. This system operates automatically and extremely fast. For example, let's say you stop your car at a red light. When the light turns green, you know to proceed through the intersection. Thanks to the speed and efficiency of System 1, experienced drivers automatically understand that green means go, and so this mental association requires no conscious or effortful thought.

In contrast, System 2 is conscious processing. It's what we use for mental tasks that require concentration, such as completing a tax form. Rather than being automatic and fast, this undertaking requires effortful, deliberate concentration.

Together, these two systems help us make sense of the world. What is fascinating, though, is how much our cognition relies on System 1. Most neuroscientists agree that of the millions of pieces of information we can process each second, the vast majority are handled outside of our conscious awareness.3 Beyond its sheer scale, System 1 processing is also notable because it reminds us that many of the mental associations that affect how we perceive and act operate implicitly (i.e., unconsciously). As such, System 1 is responsible for the associations known as implicit biases.

Because the implicit associations we hold arise outside of conscious awareness, implicit biases do not necessarily align with our explicit beliefs and stated intentions. This means that even individuals who profess egalitarian intentions and try to treat all individuals fairly can still unknowingly act in ways that reflect their implicit—rather than their explicit—biases. Thus, even well-intentioned individuals can act in ways that produce inequitable outcomes for different groups.

Moreover, because implicit biases are unconscious and involuntarily activated as part of System 1, we are not even aware that they exist, yet they can have a tremendous impact on decision making. A large body of social science evidence has shown that implicit biases can be activated by any of the various identities we perceive in others, such as race, ethnicity, gender, or age. Since these robust associations are a critical component of our System 1 processing, everyone has implicit biases, regardless of race, ethnicity, gender, or age. No one is immune. Consequently, the implications of implicit bias for individuals in a wide range of professions—not just education—are vast. For example, researchers have documented implicit biases in healthcare professionals,4 law enforcement officers,5 and even individuals whose careers require avowed commitments to impartiality, such as judges.6 Indeed, educators are also susceptible to the influence of these unconscious biases.

Implicit Bias in Education

Research on implicit bias has identified several conditions in which individuals are most likely to rely on their unconscious System 1 associations. These include situations that involve ambiguous or incomplete information; the presence of time constraints; and circumstances in which our cognitive control may be compromised, such as through fatigue or having a lot on our minds.7 Given that teachers encounter many, if not all, of these conditions through the course of a school day, it is unsurprising that implicit biases may be contributing to teachers' actions and decisions.

Let's consider a few examples in the context of school discipline.

First, classifying behavior as good or bad and then assigning a consequence is not a simple matter. All too often, behavior is in the eye of the beholder. Many of the infractions for which students are disciplined have a subjective component, meaning that the situation is a bit ambiguous. Thus, how an educator interprets a situation can affect whether the behavior merits discipline, and if so, to what extent.

Infractions such as "disruptive behavior," "disrespect," and "excessive noise," for example, are ambiguous and dependent on context, yet they are frequently provided as reasons for student discipline.8 That is not to say that some form of discipline is unwarranted in these situations, or that all disciplinary circumstances are subjective, as certainly many have objective components. However, these subjective infractions constitute a very large portion of disciplinary incidents.

There are no standardized ways of assessing many infractions, such as disobedient or disruptive behavior, though schools do attempt to delineate some parameters through codes of conduct and by outlining associated consequences. Yet subjectivity can still come into play. Teachers' experiences and automatic unconscious associations can shape their interpretation of situations that merit discipline, and can even contribute to discipline disparities based on a student's race.

One study of discipline disparities9 found that students of color were more likely to be sent to the office and face other disciplinary measures for offenses such as disrespect or excessive noise, which are subjective, while white students were more likely to be sent to the office for objective infractions, such as smoking or vandalism. (For more about discipline disparities, see "From Reaction to Prevention" by Russell J. Skiba and Daniel J. Losen.) Thus, in disciplinary situations that are a bit ambiguous (What qualifies as disrespect? How loud is too loud?), educators should be aware that their implicit associations may be contributing to their decisions without their conscious awareness or consent.

Second, implicit attitudes toward specific racial groups can unconsciously affect disciplinary decisions. For example, extensive research has documented pervasive implicit associations that link African Americans, particularly males, to stereotypes such as aggression, criminality, or danger, even when explicit beliefs contradict these views.10

In education, these implicit associations can taint perceptions of the discipline severity required to ensure that the misbehaving student understands what he or she did wrong. In short, these unconscious associations can mean the difference between one student receiving a warning for a confrontation and another student being sent to school security personnel. In the words of researcher Carla R. Monroe, "Many teachers may not explicitly connect their disciplinary reactions to negative perceptions of Black males, yet systematic trends in disproportionality suggest that teachers may be implicitly guided by stereotypical perceptions that African American boys require greater control than their peers and are unlikely to respond to nonpunitive measures."11

A recent study from Stanford University sheds further light on this dynamic by highlighting how racial disparities in discipline can occur even when black and white students behave similarly.12 In the experiment, researchers showed a racially diverse group of female K–12 teachers the school records of a fictitious middle school student who had misbehaved twice; both infractions were minor and unrelated. Requesting that the teachers imagine working at this school, researchers asked a range of questions related to how teachers perceived and would respond to the student's infractions. While the student discipline scenarios were identical, researchers manipulated the fictitious student's name; some teachers reviewed the record of a student given a stereotypically black name (e.g., Deshawn or Darnell) while others reviewed the record of a student with a stereotypically white name (e.g., Jake or Greg).

Results indicated that from the first infraction to the second, teachers were more likely to escalate the disciplinary response to the second infraction when the student was perceived to be black as opposed to white. Moreover, a second part of the study, with a larger, more diverse sample that included both male and female teachers, found that infractions by a black student were more likely to be viewed as connected, meaning that the black student's misbehavior was seen as more indicative of a pattern, than when the same two infractions were committed by a white student.13

Another way in which implicit bias can operate in education is through confirmation bias: the unconscious tendency to seek information that confirms our preexisting beliefs, even when evidence exists to the contrary. The following example, which explored this dynamic in the context of employee performance evaluations, has relevant parallels for K–12 teachers evaluating their students' work.

A 2014 study explored how confirmation bias can unconsciously taint the evaluation of work that employees produce. Researchers created a fictitious legal memo that contained 22 different, deliberately planted errors. These errors included minor spelling and grammatical errors, as well as factual, analytical, and technical writing errors. The exact same memo was distributed to law firm partners under the guise of a "writing analysis study,"14 and they were asked to edit and evaluate the memo.

Half of the memos listed the author as African American, while the other half listed the author as Caucasian. Findings indicated that evaluations of the memo hinged on the perceived race of the author. Evaluators who believed the author was African American found more of the embedded errors and rated the memo as lower quality than those who believed the author was Caucasian. Researchers concluded that these findings suggest unconscious confirmation bias; despite the intention to be unbiased, "we see more errors when we expect to see errors, and we see fewer errors when we do not expect to see errors."15

While this study focused on the evaluation of a legal memo, it is not a stretch of the imagination to consider the activation of this implicit dynamic in grading student essays or evaluating other forms of subjective student performance. Confirmation bias represents yet another way in which implicit biases can challenge the best of explicit intentions.

Finally, implicit biases can also shape teacher expectations of student achievement. For example, a 2010 study examined teachers' implicit and explicit ethnic biases, finding that their implicit—not explicit—biases were responsible for different expectations of achievement for students from different ethnic backgrounds.16

While these examples are a select few among many, together they provide a glimpse into how implicit biases can have detrimental effects for students, regardless of teachers' explicit goals. This raises the question: How can we better align our implicit biases with the explicit values we uphold?

Mitigating the Influence of Implicit Bias

Recognizing that implicit biases can yield inequitable outcomes even among well-intentioned individuals, a significant portion of implicit bias research has explored how individuals can change their implicit associations—in effect "reprogramming" their mental associations so that unconscious biases better align with explicit convictions. Thanks to the malleable nature of our brains, researchers have identified a few approaches that, often with time and repetition, can help inhibit preexisting implicit biases in favor of more egalitarian alternatives.

With implicit biases operating outside of our conscious awareness and inaccessible through introspection, at first glance it might seem difficult to identify any that we may hold. Fortunately, researchers have identified several approaches for assessing these unconscious associations, one of which is the Implicit Association Test (IAT). Debuting in 1998, this free online test measures the relative strength of associations between pairs of concepts. Designed to tap into unconscious System 1 associations, the IAT is a response latency (i.e., reaction time) measure that assesses implicit associations through this key idea: when two concepts are highly associated, test takers will be faster at pairing those concepts (and make fewer mistakes doing so) than they will when two concepts are not as highly associated.*

To illustrate, consider this example. Most people find the task of pairing flower types (e.g., orchid, daffodil, tulip) with positive words (e.g., pleasure, happy, cheer) easier than they do pairing flower types with negative words (e.g., rotten, ugly, filth). Because flowers typically have a positive connotation, people can quickly link flowers to positive terms and make few mistakes in doing so. In contrast, most people are likely to find it easier to pair insect types (e.g., ants, cockroaches, mosquitoes) with those negative terms than with positive ones.17

While this example is admittedly simplistic, these ideas laid the foundation for versions of the IAT that assess more complex social issues, such as race, gender, age, and sexual orientation, among others. Millions of people have taken the IAT, and extensive research has largely upheld the IAT as a valid and reliable measure of implicit associations.18 There are IATs that assess both attitudes (i.e., positive or negative emotions toward various groups) and stereotypes (i.e., how quickly someone can connect a group to relevant stereotypes about that group at an implicit level).
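
To make the reaction-time logic concrete, here is a minimal sketch in Python of how pairing speeds from two tasks could be compared. It is only an illustration of the key idea described above, not the scoring procedure the IAT actually uses, and the function name and sample numbers are hypothetical.

    # Illustrative only: compares reaction times from two pairing tasks.
    # The real IAT uses a more refined scoring procedure than this sketch.
    from statistics import mean, stdev

    def association_score(compatible_ms, incompatible_ms):
        """Larger positive values mean the 'compatible' pairing was easier,
        suggesting a stronger implicit association between those concepts."""
        pooled_sd = stdev(compatible_ms + incompatible_ms)
        return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

    # Hypothetical latencies (milliseconds) for one test taker.
    flowers_with_pleasant = [620, 650, 610, 640, 600]    # fast pairings
    flowers_with_unpleasant = [820, 870, 790, 850, 880]  # slower pairings

    print(round(association_score(flowers_with_pleasant, flowers_with_unpleasant), 2))

In this toy example, the test taker pairs flowers with pleasant words much faster than with unpleasant ones, so the score comes out strongly positive, mirroring the flower-and-insect illustration above.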

Educators can begin to address their implicit biases by taking the Implicit Association Test. Doing so will enable them to become consciously aware of some of the unconscious associations they may harbor. Research suggests that this conscious awareness of one's own implicit biases is a critical first step for counteracting their influence.19 This awareness is especially crucial for educators to help ensure that their explicit intentions to help students learn and reach their full potential are not unintentionally thwarted by implicit biases.

By identifying any discrepancies that may exist between conscious ideals and automatic implicit associations, individuals can take steps to bring those two into better alignment. One approach for changing implicit associations identified by researchers is intergroup contact: meaningfully engaging with individuals whose identities (e.g., race, ethnicity, religion) differ from your own. Certain conditions exist for optimal effects, such as equal status within the situation, a cooperative setting, and working toward common goals.20 By getting to know people who differ from you on a real, personal level, you can begin to build new associations about the groups those individuals represent and break down existing implicit associations.21

Another approach that research has determined may help change implicit associations is exposure to counter-stereotypical exemplars: individuals who contradict widely held stereotypes. Some studies have shown that exposure to these exemplars may help individuals begin to automatically override their preexisting biases.22 Examples of counter-stereotypical exemplars may include male nurses, female scientists, African American judges, and others who defy stereotypes.

This approach for challenging biases is valuable not just for educators but also for the students they teach, as some scholars suggest that photographs and décor that expose individuals to counter-stereotypical exemplars can activate new mental associations.23 While implicit associations may not change immediately, using counter-stereotypical images for classroom posters and other visuals may serve this purpose.

Beyond changing cognitive associations, another strategy for mitigating implicit biases that relates directly to school discipline is data collection. Because implicit biases function outside of conscious awareness, identifying their influence can be challenging. Gathering meaningful data can bring to light trends and patterns of disparate treatment, both for individuals and across an institution, that might otherwise go unnoticed.

In the context of school discipline, relevant data may include the student's grade, the perceived infraction, the time of day it occurred, the name(s) of referring staff, and other relevant details and objective information related to the resulting disciplinary consequence. Information like this can facilitate a large-scale review of discipline measures and patterns, and can reveal whether any connections to implicit biases emerge.24 Moreover, tracking discipline data over time with implicit bias in mind can help create a school- or districtwide culture of accountability.
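
As one concrete illustration of how such records might be reviewed, the brief sketch below (in Python) cross-tabulates a handful of hypothetical referral records by infraction type and student group. The field names, categories, and sample data are invented for illustration rather than drawn from any particular data system.

    # Hypothetical referral records; a real review would use the school's
    # own data and its own definitions of subjective vs. objective infractions.
    from collections import Counter

    referrals = [
        {"grade": 7, "infraction": "disrespect",      "referring_staff": "Staff A", "student_group": "Group 1"},
        {"grade": 8, "infraction": "vandalism",       "referring_staff": "Staff B", "student_group": "Group 2"},
        {"grade": 7, "infraction": "excessive noise", "referring_staff": "Staff A", "student_group": "Group 1"},
    ]

    SUBJECTIVE = {"disrespect", "disruptive behavior", "excessive noise"}

    def category(record):
        return "subjective" if record["infraction"] in SUBJECTIVE else "objective"

    # Count referrals by (student group, infraction type) to see whether
    # subjective infractions fall disproportionately on any one group.
    tally = Counter((r["student_group"], category(r)) for r in referrals)
    for (group, kind), count in sorted(tally.items()):
        print(group, kind, count)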

Finally, in the classroom, taking enough time to carefully process a situation before making a decision can help educators minimize the influence of implicit bias. Doing so, of course, is easier said than done, given that educators are constantly pressed for time, face myriad challenges, and need crucial support from administrators to effectively manage student behavior.

As noted earlier, System 1 unconscious associations operate extremely quickly. As a result, in circumstances where individuals face time constraints or have a lot on their minds, their brains tend to rely on those fast and automatic implicit associations. Research suggests that reducing cognitive load and allowing more time to process information can lead to less biased decision making.25 In terms of school discipline, this can mean allowing educators time to reflect on the disciplinary situation at hand rather than make a hasty decision.26

*  *  *

While implicit biases can affect any moment of decision making, these unconscious associations should not be regarded as character flaws or other indicators of whether someone is a "good person" or not. The ability to use our System 1 cognition to make effortless, lightning-fast associations, such as knowing that a green traffic light means go, is crucial to how we function.

Rather, when we identify and reflect on the implicit biases we hold, we recognize that our life experiences may unconsciously shape our perceptions of others in ways that we may or may not consciously desire, and if the latter, we can take action to mitigate the influence of those associations.

In light of the compelling body of implicit bias scholarship, teachers, administrators, and even policymakers are increasingly considering the role of unconscious bias in disciplinary situations. For example, the federal school discipline guidance jointly released by the U.S. departments of Education and Justice in January 2014 not only mentions implicit bias as a factor that may affect the administration of school discipline, it also encourages school personnel to receive implicit bias training. (For more information on that guidance, see "School Discipline and Federal Guidance.") Speaking not only to the importance of identifying implicit bias but also to mitigating its effects, the federal guidance asserts that this training can "enhance staff awareness of their implicit or unconscious biases and the harms associated with using or failing to counter racial and ethnic stereotypes."27 Of course, teachers who voluntarily choose to pursue this training and explore this issue on their own can also generate interest among their colleagues, leading to more conversations and awareness.

Accumulated research evidence indicates that implicit bias powerfully explains the persistence of many societal inequities, not just in education but also in other domains, such as criminal justice, healthcare, and employment.28 While the notion of being biased is one that few individuals are eager to embrace, extensive social science and neuroscience research has connected individuals' System 1 unconscious associations to disparate outcomes, even among individuals who staunchly profess egalitarian intentions.

In education, the real-life implications of implicit biases can create invisible barriers to opportunity and achievement for some students—a stark contrast to the values and intentions of educators and administrators who dedicate their professional lives to their students' success. Thus, it is critical for educators to identify any discrepancies that may exist between their conscious ideals and unconscious associations so that they can mitigate the effects of those implicit biases, thereby improving student outcomes and allowing students to reach their full potential.


Cheryl Staats is a senior researcher at the Kirwan Institute for the Study of Race and Ethnicity, housed at Ohio State University.

*Implicit Association Tests are publicly available through Project Implicit.

Endnotes

1. Tor Nørretranders, The User Illusion: Cutting Consciousness Down to Size (New York: Penguin, 1999).

2. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).

3. See, for example, George A. Miller, "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," Psychological Review 63, no. 2 (1956): 81–97.

4. See, for example, Janice A. Sabin, Brian A. Nosek, Anthony G. Greenwald, and Frederick P. Rivara, "Physicians' Implicit and Explicit Attitudes about Race by MD Race, Ethnicity, and Gender," Journal of Health Care for the Poor and Underserved 20 (2009): 896–913.

5. See, for example, Joshua Correll, Bernadette Park, Charles M. Judd, Bernd Wittenbrink, Melody S. Sadler, and Tracie Keesee, "Across the Thin Blue Line: Police Officers and Racial Bias in the Decision to Shoot," Journal of Personality and Social Psychology 92 (2007): 1006–1023.

6. See, for example, Jeffrey J. Rachlinski, Sheri Lynn Johnson, Andrew J. Wistrich, and Chris Guthrie, "Does Unconscious Racial Bias Affect Trial Judges?," Notre Dame Law Review 84 (2009): 1195–1246.

7. Marianne Bertrand, Dolly Chugh, and Sendhil Mullainathan, "Implicit Discrimination," American Economic Review 95, no. 2 (2005): 94–98.

8. See, for example, Cheryl Staats and Danya Contractor, Race and Discipline in Ohio Schools: What the Data Say (Columbus, OH: Kirwan Institute for the Study of Race and Ethnicity, 2014).

9. Russell J. Skiba, Robert S. Michael, Abra Carroll Nardo, and Reece L. Peterson, "The Color of Discipline: Sources of Racial and Gender Disproportionality in School Punishment," Urban Review 34 (2002): 317–342.

10. Jennifer L. Eberhardt, Phillip Atiba Goff, Valerie J. Purdie, and Paul G. Davies, "Seeing Black: Race, Crime, and Visual Processing," Journal of Personality and Social Psychology 87 (2004): 876–893.

11. Carla R. Monroe, "Why Are 'Bad Boys' Always Black? Causes of Disproportionality in School Discipline and Recommendations for Change," The Clearing House: A Journal of Educational Strategies, Issues and Ideas 79 (2005): 46.

12. Jason A. Okonofua and Jennifer L. Eberhardt, "Two Strikes: Race and the Disciplining of Young Students," Psychological Science 26 (2015): 617–624.

13. Okonofua and Eberhardt, "Two Strikes."

14. Arin N. Reeves, Written in Black & White: Exploring Confirmation Bias in Racialized Perceptions of Writing Skills (Chicago: Nextions, 2014).

15. Reeves, Written in Black & White, 6.

16. Linda van den Bergh, Eddie Denessen, Lisette Hornstra, Marinus Voeten, and Rob W. Holland, "The Implicit Prejudiced Attitudes of Teachers: Relations to Teacher Expectations and the Ethnic Achievement Gap," American Educational Research Journal 47 (2010): 497–527.

17. This example is from Anthony G. Greenwald, Debbie E. McGhee, and Jordan L. K. Schwartz, "Measuring Individual Differences in Implicit Cognition: The Implicit Association Test," Journal of Personality and Social Psychology 74 (1998): 1464–1480.

18. Brian A. Nosek, Anthony G. Greenwald, and Mahzarin R. Banaji, "The Implicit Association Test at Age 7: A Methodological and Conceptual Review," in Social Psychology and the Unconscious: The Automaticity of Higher Mental Processes, ed. John A. Bargh (New York: Psychology Press, 2007), 265–292.

19. Patricia G. Devine, Patrick S. Forscher, Anthony J. Austin, and William T. L. Cox, "Long-Term Reduction in Implicit Bias: A Prejudice Habit-Breaking Intervention," Journal of Experimental Social Psychology 48 (2012): 1267–1278; and John F. Dovidio, Kerry Kawakami, Craig Johnson, Brenda Johnson, and Adaiah Howard, "On the Nature of Prejudice: Automatic and Controlled Processes," Journal of Experimental Social Psychology 33 (1997): 510–540.

20. Gordon W. Allport, The Nature of Prejudice (Cambridge, MA: Addison-Wesley, 1954). Allport also recognizes a fourth condition for optimal intergroup contact, which is authority sanctioning the contact.

21. Thomas F. Pettigrew and Linda R. Tropp, "A Meta-Analytic Test of Intergroup Contact Theory," Journal of Personality and Social Psychology 90 (2006): 751–783.

22. Nilanjana Dasgupta and Anthony G. Greenwald, "On the Malleability of Automatic Attitudes: Combating Automatic Prejudice with Images of Admired and Disliked Individuals," Journal of Personality and Social Psychology 81 (2001): 800–814; and Nilanjana Dasgupta and Shaki Asgari, "Seeing Is Believing: Exposure to Counterstereotypic Women Leaders and Its Effect on the Malleability of Automatic Gender Stereotyping," Journal of Experimental Social Psychology 40 (2004): 642–658.

23. Jerry Kang, Mark Bennett, Devon Carbado, et al., "Implicit Bias in the Courtroom," UCLA Law Review 59 (2012): 1124–1186.

24. Kent McIntosh, Erik J. Girvan, Robert H. Horner, and Keith Smolkowski, "Education Not Incarceration: A Conceptual Model for Reducing Racial and Ethnic Disproportionality in School Discipline," Journal of Applied Research on Children: Informing Policy for Children at Risk 5, no. 2 (2014): art. 4.

25. Diana J. Burgess, "Are Providers More Likely to Contribute to Healthcare Disparities under High Levels of Cognitive Load? How Features of the Healthcare Setting May Lead to Biases in Medical Decision Making," Medical Decision Making 30 (2010): 246–257.

26. Prudence Carter, Russell Skiba, Mariella Arredondo, and Mica Pollock, You Can't Fix What You Don't Look At: Acknowledging Race in Addressing Racial Discipline Disparities, Disciplinary Disparities Briefing Paper Series (Bloomington, IN: Equity Project at Indiana University, 2014).

27. U.S. Department of Education, Guiding Principles: A Resource Guide for Improving School Climate and Discipline (Washington, DC: Department of Education, 2014), 17.

28. For more on implicit bias and its effects in various professions, see the Kirwan Institute's annual State of the Science: Implicit Bias Review publication.

[illustrations by Souther Salazar]

American Educator, Winter 2015-2016