Implicit stereotype

In social identity theory, an implicit bias or implicit stereotype is the pre-reflective attribution of particular qualities by an individual to a member of some social outgroup.[1]

Implicit stereotypes are thought to be shaped by experience and based on learned associations between particular qualities and social categories, including race and gender.[2] Individuals' perceptions and behaviors can be influenced by the implicit stereotypes they hold, even when they are unaware that they hold such stereotypes.[3] Implicit bias is an aspect of implicit social cognition: the phenomenon that perceptions, attitudes, and stereotypes can operate prior to conscious intention or endorsement.[4] The existence of implicit bias is supported by a variety of scientific articles in the psychological literature.[5] The term implicit stereotype was first defined by psychologists Mahzarin Banaji and Anthony Greenwald in 1995.

Explicit stereotypes, by contrast, are consciously endorsed, intentional, and sometimes controllable thoughts and beliefs.[6]

Implicit biases, however, are thought to be the product of associations learned through past experiences.[7] Implicit biases can be activated by the environment and operate prior to a person's intentional, conscious endorsement.[1] For example, a person may unwittingly form a bias that all pit bulls are dangerous animals. This bias may stem from a single unpleasant experience in the past, but the source of the association may be misidentified or even unknown. In this example, the implicit bias may manifest as the person declining an invitation to pet someone's pit bull on the street, without understanding the reason behind their own response.[8] Implicit bias can persist even when an individual rejects the bias explicitly.[1]

Bias, attitude, stereotype and prejudice

Attitudes, stereotypes, prejudices, and biases are all examples of psychological constructs: mental associations that can influence a person's behavior and feelings toward an individual or group. If the person is unaware of these mental associations, the attitude, stereotype, prejudice, or bias is said to be implicit.

Bias is defined as prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair. Bias can be seen as the overarching concept encompassing stereotype and prejudice, because it describes how we associate traits (usually negative) with a specific group of people. Our "implicit attitudes reflect constant exposure to stereotypical portrayals of members of, and items in, all kinds of different categories: racial groups, professions, women, nationalities, members of the LGBTQ community, moral and political values, etc."[9]

An attitude is an evaluative judgment of an object, a person, or a social group.[10] An attitude is held by or characterizes a person. Implicit attitudes are evaluations of an attitude object or the self that occur without conscious awareness.

A stereotype is the association of a person or a social group with a consistent set of traits. These traits may be positive or negative, such as the beliefs that African Americans are great at sports or that African Americans are more violent than any other race in the United States. Many types of stereotypes exist: racial, cultural, gender, and group-based (e.g., college students), all of which are highly salient in the lives of many people.

Prejudice is defined as an unfair negative attitude toward a social group or a member of that group.[11] Prejudices can stem from characteristics that people observe in a different social group, including but not limited to gender, sex, race/ethnicity, or religion. This is pertinent to stereotypes because a stereotype can influence the way people feel toward another group, resulting in prejudice.

Methods for investigation

Measuring the degree to which someone is biased poses a clear challenge. Bias takes two forms, implicit and explicit, and the two forms are connected. Explicit bias encompasses conscious attitudes that can be measured by self-report, although self-report carries the risk that individuals will falsely endorse more socially desirable attitudes. Implicit biases have traditionally been considered unconscious and involuntary attitudes lying below the surface of consciousness, yet some people appear to be aware of their influence on their behavior and cognitive processes.[12] The implicit-association test (IAT) is one validated tool used to measure implicit bias; it requires participants to rapidly pair two social groups with either positive or negative attributes.[13]

Implicit-association test

The implicit-association test (IAT) purports to measure the prejudice an individual holds toward different social groups. It claims to do this by capturing differences in the time respondents take to categorize stimuli under different pairings of concepts. Respondents are instructed to press one of two computer keys to sort stimuli into associated categories. When the category pairings appear consistent to the respondent, the time taken to categorize the stimuli is shorter than when the pairings seem inconsistent. An implicit association is said to exist when respondents take longer to respond to a category-inconsistent pairing than to a category-consistent pairing. The implicit-association test is used in psychology across a wide array of topics, including gender, race, science, career, weight, sexuality, and disability.[14] While acclaimed and highly influential, the implicit-association test lacks a strong scientific consensus. Critics cite studies that counterintuitively link biased test scores with less discriminatory behavior.[15] Studies have also asserted that the implicit-association test fails to measure unconscious thought.[16]
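
The latency-comparison logic can be illustrated with a short sketch. The example below uses hypothetical reaction times and a simplified difference measure; the published IAT scoring procedure adds error penalties and latency filtering that are omitted here.

    # Simplified, hypothetical illustration of the latency-difference logic
    # behind the IAT; the published scoring procedure adds error penalties
    # and latency filtering that are omitted here.
    from statistics import mean, stdev

    # Hypothetical reaction times in milliseconds
    congruent_rts = [612, 580, 655, 598, 630]    # category-consistent pairings
    incongruent_rts = [742, 710, 695, 760, 733]  # category-inconsistent pairings

    def simplified_iat_effect(congruent, incongruent):
        """Mean latency difference scaled by the pooled standard deviation."""
        difference = mean(incongruent) - mean(congruent)
        pooled_sd = stdev(congruent + incongruent)
        return difference / pooled_sd

    # A positive score means slower responses to category-inconsistent pairings,
    # which is taken as evidence of an implicit association.
    print(f"Simplified IAT effect: {simplified_iat_effect(congruent_rts, incongruent_rts):.2f}")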

Go/no-go association task (GNAT)

The GNAT is similar to the implicit-association test. Whereas the IAT reveals differential associations between two target concepts (e.g. male-female and weak-strong), the GNAT reveals associations within one concept (for example, whether female is associated more strongly with weak or strong).[17] Participants are presented with word pairs among distractors. Participants are instructed to indicate "go" if the words are target pairs, or "no-go" if they are not. For example, participants may be instructed to indicate "go" if the word pairs are female names and words related to strength. Then, participants are instructed to indicate "go" if the word pairs are female names and words related to weakness. This method relies on signal detection theory; participants' accuracy rates reveal endorsement of the implicit stereotype. For example, if participants are more accurate for female-weak pairs than for female-strong pairs, this suggests the participant associates females more strongly with weakness than with strength.[18]
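
Because the GNAT scores accuracy rather than latency, its results are commonly summarized with a sensitivity index from signal detection theory. The sketch below, using hypothetical trial counts, shows one way such a sensitivity index (d') could be computed for two blocks; it illustrates the general approach rather than the exact procedure of the cited studies.

    # Illustrative sketch: summarizing GNAT accuracy with a signal detection
    # sensitivity index (d'). All trial counts below are hypothetical.
    from statistics import NormalDist

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """d' = z(hit rate) - z(false-alarm rate), with a small correction
        so that rates of exactly 0 or 1 do not produce infinite z-scores."""
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(fa_rate)

    # Block 1: respond "go" to female names paired with strength words
    female_strong = d_prime(hits=34, misses=16, false_alarms=12, correct_rejections=38)
    # Block 2: respond "go" to female names paired with weakness words
    female_weak = d_prime(hits=44, misses=6, false_alarms=8, correct_rejections=42)

    # Higher sensitivity in the female-weak block than in the female-strong block
    # would suggest a stronger implicit female-weak association.
    print(f"female+strong d' = {female_strong:.2f}, female+weak d' = {female_weak:.2f}")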

Semantic priming and lexical decision task

Semantic priming measures the association between two concepts.[19] In a lexical decision task, subjects are presented with pairs of letter strings and asked to indicate whether each string is a word (for example, "butter") or a non-word (for example, "tubter"). The theory behind semantic priming is that subjects respond more quickly to a word if it is preceded by a word related to it in meaning (e.g. bread-butter vs. bread-dog).[19] In other words, the word "bread" primes other words related in meaning, including butter. Psychologists use semantic priming to reveal implicit associations between stereotype-congruent words. For instance, participants may be asked to indicate whether pronouns are male or female. These pronouns are preceded either by professions that are predominantly female ("secretary, nurse") or by professions that are predominantly male ("mechanic, doctor"). Reaction times reveal the strength of association between professions and gender.[20]
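
The priming effect itself is simply a difference in mean reaction times between unrelated-prime and related-prime trials. The sketch below uses made-up word pairs and latencies to show how such an effect could be computed; it is not the analysis pipeline of the cited studies.

    # Minimal sketch of computing a semantic priming effect from lexical
    # decision latencies; the trials and reaction times are hypothetical.
    from statistics import mean

    # (prime, target, reaction time in ms) for correct "word" responses
    trials = [
        ("bread", "butter", 520),   # related prime
        ("nurse", "doctor", 538),   # related prime
        ("bread", "dog",    601),   # unrelated prime
        ("nurse", "car",    615),   # unrelated prime
    ]
    related_pairs = {("bread", "butter"), ("nurse", "doctor")}

    related_rts = [rt for prime, target, rt in trials if (prime, target) in related_pairs]
    unrelated_rts = [rt for prime, target, rt in trials if (prime, target) not in related_pairs]

    # Faster responses after related primes yield a positive priming effect.
    priming_effect = mean(unrelated_rts) - mean(related_rts)
    print(f"Priming effect: {priming_effect:.0f} ms")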

Sentence completion

In a sentence completion task, subjects may be presented with sentences that contain stereotypic black and white names (Jerome, Adam), positive and negative stereotypic black behaviors (easily made the team, blasted loud music in his car) and counter-stereotypic behaviors (got a job at Microsoft, refused to dance). Subjects are asked to complete each sentence in any way that is grammatical, e.g. "Jerome got an A on his test..." could be completed with "because it was easy" (stereotype-congruent), "because he studied for months" (stereotype-incongruent), or "and then he went out to celebrate" (non-explanatory). This task is used to measure stereotypic explanatory bias (SEB): participants have a larger SEB if they give more explanations for stereotype-congruent sentences than for stereotype-incongruent sentences, and if they give more stereotype-congruent explanations.[21]
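
A simple way to picture the SEB measure is as a tally over coded sentence completions. The sketch below assumes each continuation has already been coded by raters as explanatory or not; the records and the simple difference score are hypothetical simplifications of the measure described above.

    # Rough sketch of a stereotypic explanatory bias (SEB) tally. It assumes
    # each continuation has already been coded by human raters; the records
    # and the simple difference score below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Completion:
        sentence_type: str    # "congruent" or "incongruent" with the stereotype
        is_explanation: bool  # did the continuation explain the behavior?

    completions = [
        Completion("congruent", True),
        Completion("congruent", True),
        Completion("congruent", False),
        Completion("incongruent", True),
        Completion("incongruent", False),
        Completion("incongruent", False),
    ]

    def seb_score(items):
        """Explanatory continuations for stereotype-congruent sentences minus
        those for stereotype-incongruent sentences."""
        congruent = sum(c.is_explanation for c in items if c.sentence_type == "congruent")
        incongruent = sum(c.is_explanation for c in items if c.sentence_type == "incongruent")
        return congruent - incongruent

    # A larger positive score indicates a larger stereotypic explanatory bias.
    print(f"SEB score: {seb_score(completions)}")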

Differences between measures

The Implicit Association Test (IAT), sequential priming, and other implicit bias tests are mechanisms for determining how susceptible we are to stereotypes. They are widely used in social psychology, although whether response time to a question is a good measure of implicit bias remains debated. "Some theorists do question the interpretation of the scores from tests such as the IAT, but the debate is still going on and responses to the criticisms are certainly widespread."[22]

Findings

Gender bias

Gender biases are the stereotypical attitudes or prejudices that we hold towards specific genders. "The concept of gender also refers to the constantly ongoing social construction of what is considered 'feminine' and 'masculine' and is based on power and sociocultural norms about women and men."[23] Gender biases are the ways in which we judge men and women based on traits hegemonically assigned as feminine or masculine.

The category of male has been found to be associated with traits of strength and achievement. Both male and female subjects associate male category members more strongly than female category members with words like bold, mighty, and power.[24] The strength of this association is not predicted by explicit beliefs, such as responses on a gender stereotype questionnaire (for example, one question asked whether subjects endorsed the word feminist).[1] In a test to reveal the false fame effect, non-famous male names are more likely to be falsely identified as famous than non-famous female names; this is evidence for an implicit stereotype of male achievement.[25] Females are more associated with weakness. This is true for both male and female subjects, but female subjects only show this association when the weak words are positive, such as fine, flower, and gentle; female subjects do not show this pattern when the weak words are negative, such as feeble, frail, and scrawny.[24]

Particular professions are implicitly associated with genders. Elementary school teachers are implicitly stereotyped to be female, and engineers are stereotyped to be male.[26]

Gender bias in science and engineering

Implicit-association tests reveal an implicit association of males with science and math, and of females with arts and language.[27] Girls as young as nine years old have been found to hold an implicit male-math stereotype and an implicit preference for language over math.[28] Women have stronger negative associations with math than men do, and the more strongly women identify with a female gender identity, the more implicit negativity they show towards math.[27] For both men and women, the strength of these implicit stereotypes predicts implicit and explicit math attitudes, belief in one's math ability, and SAT performance.[27] The strength of these implicit stereotypes in elementary-aged girls predicts academic self-concepts, academic achievement, and enrollment preferences, even more than explicit measures do.[28] Women with a stronger implicit gender-math stereotype were less likely to pursue a math-related career, regardless of their actual math ability or explicit gender-math stereotypes.[29] This may be because women with stronger implicit gender-math stereotypes are more at risk for stereotype threat: women with strong implicit stereotypes perform much worse on a math test when primed with gender than women with weak implicit stereotypes.[30]

Though the number of women pursuing and earning degrees in engineering has increased in the last 20 years, women remain outnumbered by men at all degree levels in all fields of engineering.[31] These implicit gender stereotypes are robust; in a study of more than 500,000 respondents from 34 nations, more than 70% of individuals held this implicit stereotype.[32] The national strength of the implicit stereotype is related to national sex differences among 8th graders on the TIMSS, an international standardized math and science achievement exam. This effect persists even after statistically controlling for gender inequality in general.[32] Additionally, for women across cultures, studies have shown that individual differences in the strength of this implicit stereotype are associated with interest, participation, and performance in the sciences.[32] Extending to the professional world, implicit biases and subsequent explicit attitudes toward women can "negatively affect the education, hiring, promotion, and retention of women in STEM".[33]

The effects of such implicit biases can be seen across multiple studies, including:

  • Parents rate the math abilities of their daughters lower than those of sons who perform identically well in school[34]
  • College faculty are less likely to respond to inquiries about research opportunities if the email appears to be from a woman as opposed to an identical email from a man[35]
  • Science faculty are less likely to hire or mentor students they believe are women as opposed to men[36]

An interagency report from the Office of Science and Technology Policy and the Office of Personnel Management investigated systemic barriers, including implicit biases, that have traditionally inhibited the participation of women and underrepresented minorities in science, technology, engineering, and mathematics (STEM), and made recommendations for reducing the impact of bias.[37] Research has shown that implicit bias training may improve attitudes towards women in STEM.[33]

Racial bias

Racial bias can be used synonymously with "stereotyping and prejudice" because "it allows for the inclusion of both positive and negative evaluations related to perceptions of race."[38] People begin to form racial biases toward other groups as early as age three, developing an ingroup and outgroup view of members of various races, usually based first on skin color.

In lexical decision tasks, after subjects are subliminally primed with the word BLACK, they are quicker to react to words consistent with black stereotypes, such as athletic, musical, poor and promiscuous. When subjects are subliminally primed with WHITE, they are quicker to react to white stereotypes, such as intelligent, ambitious, uptight and greedy.[39] These tendencies are sometimes, but not always, associated with explicit stereotypes.[39][40]

People may also hold an implicit stereotype that associates black category members with violence. People primed with words like ghetto, slavery, and jazz were more likely to interpret a character in a vignette as hostile.[41] However, this finding is controversial; because the character's race was not specified, it has been suggested that the procedure primed the race-unspecified concept of hostility and did not necessarily reflect stereotypes.[39] An implicit stereotype of violent black men may associate black men with weapons. In a video game in which subjects were supposed to shoot men holding weapons and not shoot men holding ordinary objects, subjects were more likely to shoot a black man holding an ordinary object than a white man holding an ordinary object. This tendency was related to subjects' implicit attitudes toward black people. Similar results were found in a priming task: subjects who saw a black face immediately before either a weapon or an ordinary object identified the image as a weapon more quickly and accurately than when it was preceded by a white face.

Implicit race stereotypes affect behaviors and perceptions. When choosing between pairs of questions to ask a black interviewee, one of which is congruent with a racial stereotype, people with a high stereotypic explanatory bias (SEB) are more likely to ask the stereotype-congruent question. In a related study, subjects with a high SEB rated a black individual more negatively in an unstructured laboratory interaction.[21]

In-group and out-group bias

Group prototypes define social groups through a collection of attributes that capture both what representative group members have in common and what distinguishes the ingroup from relevant outgroups.[42] In-group favoritism, sometimes known as in-group–out-group bias, in-group bias, or intergroup bias, is a pattern of favoring members of one's in-group over out-group members. This can be expressed in the evaluation of others, in the allocation of resources, and in many other ways.[43][44] Implicit in-group preferences emerge very early in life,[45] even in children as young as six years old. In-group bias occurs when people who are "one of us" (i.e., our ingroup) are favored over those in the outgroup, meaning those who differ from ourselves.[46] Ingroup favoritism is associated with feelings of trust and positive regard for ingroup members and often surfaces on measures of implicit bias. This categorization (ingroup vs. outgroup) is often automatic and pre-conscious.[47]

The reasons for in-group and out-group bias may be explained by ethnocentrism, social categorization, oxytocin, and other factors. A review by Carsten De Dreu found that oxytocin enables the development of trust, specifically towards individuals with similar characteristics (categorized as "in-group" members), promoting cooperation with and favoritism towards such individuals.[48] People who report strong needs for simplifying their environments also show more ingroup favoritism.[49] The tendency to categorize others into ingroups and outgroups, and the resulting ingroup favoritism, is likely universal among human beings.[50]

We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.[46][51] The most prominent example of negative affect towards an ingroup was recorded in 1939 by Kenneth and Mamie Clark using their now-famous "Dolls Test". In this test, African American children were asked to pick their favorite doll from a choice of otherwise identical black and white dolls. A high percentage of these African American children indicated a preference for the white dolls.[52] Social identity theory and Freudian theorists explain in-group derogation as the result of a negative self-image, which they believe is then extended to the group.[53]

Other stereotypes

Research on implicit stereotypes has primarily focused on gender and race. However, other topics, such as age, weight, and profession, have also been investigated. IATs have revealed implicit stereotypes that mirror explicit stereotypes about adolescents: adolescents are more likely than adults to be associated with words like trendy and defiant.[54] In addition, one IAT study found that older adults showed a stronger implicit preference for younger adults over older adults than younger adults did. The study also found that women and participants with more education had a lower implicit preference for younger adults.[55] IATs have also revealed implicit stereotypes linking obese individuals with low work performance: words like lazy and incompetent are more strongly associated with images of obese individuals than with images of thin ones.[56] This association is stronger for thin subjects than for overweight ones.[57] Like explicit stereotypes, implicit stereotypes may contain both positive and negative traits. This can be seen in occupational implicit stereotypes, where people perceive preschool teachers as both warm and incompetent, while lawyers are judged as both cold and competent.[58]

Activation of implicit stereotypes

Implicit stereotypes are activated by environmental and situational factors. These associations develop over the course of a lifetime, beginning at a very early age, through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.[59] In the laboratory, implicit stereotypes are activated by priming. When subjects are primed with the concept of dependence by unscrambling words such as dependent, cooperative, and passive, they judge a target female as more dependent. When subjects are primed with aggression using words like aggressive, confident, and argumentative, they judge a target male as more aggressive.[60] The fact that females are associated with words such as dependent, cooperative, and passive, while males are associated with words like aggressive, confident, and argumentative, suggests an implicit gender stereotype. Stereotypes can also be activated by subliminal primes. For example, white subjects exposed to subliminal words associated with a black stereotype (ghetto, slavery, jazz) interpret a target male as more hostile, consistent with the implicit stereotype of the hostile black man.[41] However, this finding is controversial because the character's race was not specified; it has been suggested that the procedure primed the race-unspecified concept of hostility and did not necessarily reflect stereotypes.[39] By getting to know people who differ from you on a real, personal level, you can begin to build new associations about the groups those individuals represent and break down existing implicit associations.[61]

Malleability of implicit stereotypes

Implicit stereotypes can, at least temporarily, be reduced or increased. Most methods have been found to reduce implicit bias only temporarily and are largely dependent on context.[62] Some evidence suggests that implicit bias can be reduced in the long term, but this may require education and consistent effort. Implicit bias training techniques designed to counteract such biases include stereotype replacement, counter-stereotypic imaging, individuation, perspective taking, and increasing opportunities for contact.[63]

Stereotype replacement involves replacing a stereotypical response with a non-stereotypical one. Counter-stereotypic imaging involves imagining others in a positive light and replacing stereotypes with positive examples. Individuation involves focusing on the specific details of an individual group member to avoid over-generalizing. Perspective taking involves adopting the perspective of a member of a marginalized group. Increasing opportunities for contact involves actively seeking out interactions with members of marginalized groups.[64]

Self and social motives

The activation of implicit stereotypes may be decreased when the individual is motivated to promote a positive self-image, either to oneself or to others in a social setting. This motivation has two components: internal and external. Internal motivation arises when an individual personally wants to be careful about what they say, while external motivation arises from a desire to respond in a politically correct way.[65]

Positive feedback from a black person decreases stereotypic sentence completion, while negative feedback from a black person increases it.[66] Subjects also reveal weaker race stereotypes when they believe others disagree with those stereotypes.[67] Motivated self-regulation does not immediately reduce implicit bias; rather, it raises awareness of discrepancies when biases conflict with personal beliefs.[68]

Promote counterstereotypes

Implicit stereotypes can be reduced by exposure to counterstereotypes. Reading biographies of women in leadership roles (such as Meg Whitman, then the CEO of eBay) increases women's associations between female names and words like leader, determined, and ambitious in a gender stereotype IAT.[69] Attending a women's college (where students are presumably more often exposed to women in leadership positions) reduces associations between leadership and males after one year of schooling.[69] Merely imagining a strong woman reduces the implicit association between females and weakness, and imagining storybook princesses increases the implicit association between females and weakness.[18]

Focus of attention

Diverting a participant's focus of attention can reduce implicit stereotypes. Generally, female primes facilitate reaction times to stereotypically female traits when participants are instructed to indicate whether the prime is animate. When participants are instead instructed to indicate whether a white dot is present on the prime, their focus of attention is diverted from the prime's feminine features. This weakens the strength of the prime and thus the activation of gender stereotypes.[70]

Configuration of stimulus cues

Whether stereotypes are activated depends on the context. When participants were presented with an image of a Chinese woman, Chinese stereotypes were stronger after they saw her use chopsticks, and female stereotypes were stronger after they saw her put on makeup.[71]

Characteristics of individual category members

Stereotype activation may be stronger for some category members than for others. People express weaker gender stereotypes with unfamiliar than familiar names.[72] Judgments and gut reactions that go along with implicit biases are based on how familiar something is.[73]

Criticism

Some social psychology research has indicated that individuating information (giving someone any information about an individual group member other than category information) may eliminate the effects of stereotype bias.[74]

Meta-analyses

Researchers from the University of Wisconsin–Madison, Harvard, and the University of Virginia examined 426 studies conducted over 20 years, involving 72,063 participants, that used the IAT and other similar tests. They reached two conclusions:

  1. The correlation between implicit bias and discriminatory behavior appears weaker than previously thought.
  2. There is little evidence that changes in implicit bias correlate with changes in a person’s behavior.[75]

In a 2013 meta-analysis, Hart Blanton and colleagues declared that, despite its frequent misrepresentation as a proxy for the unconscious, "the IAT provides little insight into who will discriminate against whom, and provides no more insight than explicit measures of bias."[76]

News outlets

Heather Mac Donald, writing in the Wall Street Journal, noted that:

Few academic ideas have been as eagerly absorbed into public discourse lately as “implicit bias.” Embraced by Barack Obama, Hillary Clinton and most of the press, implicit bias has spawned a multimillion-dollar consulting industry, along with a movement to remove the concept of individual agency from the law. Yet its scientific basis is crumbling.

Mac Donald suggests there is still a political and economic drive to use the implicit bias paradigm as a political lever and to profit from entities that want to avoid litigation.[77]

Statement by original authors

Greenwald and Banaji had previously asserted in their book Blindspot (2013):

Given the relatively small proportion of people who are overtly prejudiced and how clearly it is established that automatic race preference predicts discrimination, it is reasonable to conclude not only that implicit bias is a cause of Black disadvantage but also that it plausibly plays a greater role than does explicit bias.[77]

The evidence presented by their peer researchers led them to concede in correspondence that:

  1. The IAT does not predict biased behavior (in laboratory settings)
  2. It is "problematic to use [the IAT] to classify persons as likely to engage in discrimination".

However, they also stated, "Regardless of inclusion policy, both meta-analyses estimated aggregate correlational effect sizes that were large enough to explain discriminatory impacts that are societally significant either because they can affect many people simultaneously or because they can repeatedly affect single persons."[78]

Summary

Implicit bias is thought to be the product of positive or negative mental associations about persons, things, or groups that are formed and activated pre-consciously or subconsciously. In 1995, researchers Banaji and Greenwald noted that a person's social learning experiences, such as observing parents, friends, or others, could create this type of association and therefore trigger this type of bias. Many studies have found that culture can also stimulate biases, both negative and positive, regardless of someone's personal experience with other cultures.[79] Implicit bias knows no age restriction and can be held by anyone; implicit biases have been found in children as young as six years old.[79] Even though implicit bias may be more difficult to detect than explicit bias, it can be measured through a number of mechanisms, such as sequential priming, response competition, EDA, EMG, fMRI, ERP, and the IAT.[80] Once a person becomes aware of their own bias, they can take action to change it, if they wish.[81]

The existence of implicitly biased behavior is supported by several articles in the psychological literature. Adults, and even children, may hold implicit stereotypes of social categories, including categories to which they themselves belong. Without intention, or even awareness, implicit stereotypes affect human behavior and judgments. This has wide-ranging implications for society, from discrimination and personal career choices to understanding others in everyday social interactions.[1][28][25][41][60]

References

  1. Greenwald, A. G.; Banaji, M. R. (1995). "Implicit social cognition: Attitudes, self-esteem, and stereotypes". Psychological Review. 102 (1): 4–27. CiteSeerX 10.1.1.411.2919. doi:10.1037/0033-295x.102.1.4. PMID 7878162.
  2. Byrd, N. (2019). What we can (and can’t) infer about implicit bias from debiasing experiments. Synthese, 1–29. https://doi.org/10.1007/s11229-019-02128-6
  3. Hahn, A., Judd, C. M., Hirsh, H. K., & Blair, I. V. (2014). Awareness of implicit attitudes. Journal of Experimental Psychology: General, 143(3), 1369–1392. https://doi.org/10.1037/a0035028
  4. Gawronski, B. (2019). Six Lessons for a Cogent Science of Implicit Bias and Its Criticism. Perspectives on Psychological Science, 14(4), 574–595. https://doi.org/10.1177/1745691619826015
  5. Jost, J. T., Rudman, L. A., Blair, I. V., Carney, D. R., Dasgupta, N., Glaser, J., & Hardin, C. D. (2009). The existence of implicit bias is beyond reasonable doubt: A refutation of ideological and methodological objections and executive summary of ten studies that no manager should ignore. Research in Organizational Behavior, 29, 39–69. https://doi.org/10.1016/j.riob.2009.10.001
  6. Gaertner, Brown, Sam, Rupert (2008-04-15). Blackwell Handbook of Social Psychology: Intergroup Processes. ISBN 9780470692707. Retrieved 2013-08-11.
  7. Del Pinal, G. D., & Spaulding, S. (2018). Conceptual centrality and implicit bias. Mind & Language, 33(1), 95–111. https://doi.org/10.1111/mila.12166
  8. McReynolds, Tony (6 June 2019). "New study identifies most damaging dog bites by breed". aaha.org. American Animal Hospital Association. Retrieved 22 August 2020. Pit bulls were responsible for the highest percentage of reported bites across all the studies (22.5%), followed by mixed breeds (21.2%), and German shepherds (17.8%).
  9. "Implicit Bias: From Social Structure to Representational Format.: Discovery Service for Loyola Marymount Univ". eds.b.ebscohost.com. Retrieved 2018-04-13.
  10. Crano, W.D., & Prislin, R. (2008). Attitudes and attitude change. New York: CRC Press.
  11. Gaertner, Samuel L.; Dovidio, John F. (1999). "Reducing Prejudice: Combating Intergroup Biases". Current Directions in Psychological Science. 8 (4): 101–105. doi:10.1111/1467-8721.00024. JSTOR 20182575.
  12. Hahn, A., & Gawronski, B. (2019). Facing one’s implicit biases: From awareness to acknowledgment. Journal of Personality and Social Psychology, 116(5), 769–794. https://doi.org/10.1037/pspi0000155
  13. Maina, Ivy W., et al. “A Decade of Studying Implicit Racial/Ethnic Bias in Healthcare Providers Using the Implicit Association Test.” Social Science & Medicine, vol. 199, 2018, pp. 219–229., doi:10.1016/j.socscimed.2017.05.009.
  14. Harris, Matthew, et al. "Measuring the Bias against Low-Income Country Research: An Implicit Association Test." Globalization & Health, vol. 13, 06 Nov. 2017, pp. 1-9. EBSCOhost, doi:10.1186/s12992-017-0304-y.
  15. "Ironic Effects of Racial Bias During Interracial Interactions J. Nicole Shelton, 1 Jennifer A. Richeson, 2 Jessica Salvatore, 1 and Sophie Trawalter 2 - PDF Free Download". docplayer.net. Retrieved 2020-07-19.
  16. Hahn, Adam; Judd, Charles M.; Hirsh, Holen K.; Blair, Irene V. (June 2014). "Awareness of implicit attitudes". Journal of Experimental Psychology. General. 143 (3): 1369–1392. doi:10.1037/a0035028. ISSN 1939-2222. PMC 4038711. PMID 24294868.
  17. Nosek, B. A.; Banaji, M. R. (2001). "The Go/No-go Association Task". Social Cognition. 19 (6): 625–666. doi:10.1521/soco.19.6.625.20886.
  18. Blair, I. V.; Ma, J. E.; Lenton, A. P. (2001). "Imagining stereotypes away: The moderation of implicit stereotypes through mental imagery". Journal of Personality and Social Psychology. 81 (5): 828–841. CiteSeerX 10.1.1.555.2509. doi:10.1037/0022-3514.81.5.828.
  19. Meyer, D. E.; Schvaneveldt, R. W. (1971). "Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations". Journal of Experimental Psychology. 90 (2): 227–234. doi:10.1037/h0031564. PMID 5134329.
  20. Banaji, M. R.; Hardin, C. D. (1996). "Automatic stereotyping". Psychological Science. 7 (3): 136–141. doi:10.1111/j.1467-9280.1996.tb00346.x.
  21. Sekaquaptewa, D.; Espinoza, P.; Thompson, M.; Vargas, P.; von Hippel, W. (2003). "Stereotypic explanatory bias: Implicit stereotyping as a predictor of discrimination". Journal of Experimental Social Psychology. 39 (1): 75–82. doi:10.1016/S0022-1031(02)00512-7.
  22. Toribio, Josefa. "Implicit Bias: From Social Structure to Representational Format." Theoria, vol. 33, no. 1, Jan. 2018, pp. 41-60. EBSCOhost, doi:10.1387/theoria.17751.
  23. Hamberg, Katrina (May 2008). "Gender bias in medicine". Women's Health. 4 (3): 237–243. doi:10.2217/17455057.4.3.237. PMID 19072473.
  24. Rudman, L. A.; Greenwald, A. G.; McGhee, D. E. (2001). "Implicit self-concept and evaluative implicit gender stereotypes: Self and ingroup share desirable traits". Personality and Social Psychology Bulletin. 27 (9): 1164–1178. CiteSeerX 10.1.1.43.6589. doi:10.1177/0146167201279009.
  25. Banaji, M. R.; Greenwald, A. G. (1995). "Implicit gender stereotyping in judgments of fame". Journal of Personality and Social Psychology. 68 (2): 181–198. CiteSeerX 10.1.1.74.895. doi:10.1037/0022-3514.68.2.181.
  26. White, M. J.; White, G. B. (2006). "Implicit and explicit occupational gender stereotypes". Sex Roles. 55 (3–4): 259–266. doi:10.1007/s11199-006-9078-z.
  27. Nosek, B. A.; Banaji, M. R.; Greenwald, A. G. (2002). "Math = male, me = female, therefore math ≠ me". Journal of Personality and Social Psychology. 83 (1): 44–59. CiteSeerX 10.1.1.463.6120. doi:10.1037/0022-3514.83.1.44. PMID 12088131.
  28. Steffens, M. C.; Jelenec, P.; Noack, P. (2010). "On the leaky math pipeline: Comparing implicit math-gender stereotypes and math withdrawal in female and male children and adolescents". Journal of Educational Psychology. 102 (4): 947–963. doi:10.1037/a0019920.
  29. Kiefer, A. K.; Sekaquaptewa, D. (2007). "Implicit Stereotypes, Gender Identification, and Math-Related Outcomes: A Prospective Study of Female College Students". Psychological Science. 18 (1): 13–18. doi:10.1111/j.1467-9280.2007.01841.x. PMID 17362371.
  30. Kiefer, A. K.; Sekaquaptewa, D. (2007). "Implicit stereotypes and women's math performance: How implicit gender-math stereotypes influence women's susceptibility to stereotype threat". Journal of Experimental Social Psychology. 43 (5): 825–832. doi:10.1016/j.jesp.2006.08.004.
  31. "Women, Minorities, and Persons with Disabilities in Science and Engineering". National Science Foundation. Retrieved 31 March 2017.
  32. Nosek, B. A.; Smyth, F. L.; Sriram, N. N.; Lindner, N. M.; Devos, T.; Ayala, A.; Greenwald, A. G. (2009). "National differences in gender–science stereotypes predict national sex differences in science and math achievement". Proceedings of the National Academy of Sciences of the United States of America. 106 (26): 10593–10597. Bibcode:2009PNAS..10610593N. doi:10.1073/pnas.0809921106. PMC 2705538. PMID 19549876.
  33. Jackson, Sarah M.; Hillard, Amy; Schneider, Tamara R. (May 11, 2011). "Using implicit bias training to improve attitudes toward women in STEM". Social Psychology of Education. 17 (1): 419–438. doi:10.1007/s11218-014-9259-5. Retrieved March 31, 2017.
  34. Yee, D.K.; Eccles, J.S. (1988). "Parent perceptions and attributions for children's math achievement". Sex Roles. 19 (5–6): 317–333. doi:10.1007/bf00289840. hdl:2027.42/45585.
  35. Milkman, KL; Akinola, M; Chugh, D (2015). "What happens before? A field experiment exploring how pay and representation differentially shape bias on the pathway into organizations". Journal of Applied Psychology. 100 (6): 1678–712. doi:10.1037/apl0000022. PMID 25867167.
  36. Moss-Racusin, C.A.; Dovidio, J.F. (2012). "Science faculty's subtle gender biases favor male students". Proceedings of the National Academy of Sciences. 109 (41): 16474–17479. Bibcode:2012PNAS..10916474M. doi:10.1073/pnas.1211286109. PMC 3478626. PMID 22988126.
  37. Handelsman, Jo; Ward, Wanda (2016-12-12). "Increasing Diversity in the STEM Workforce by Reducing the Impact of Bias". The White House. Retrieved 31 March 2017.
  38. Noles, Erica (2014). "What's age got to do with it? examining how the age of stimulus faces affects children's implicit racial bias". ProQuest Dissertations Publishing. ProQuest 1566943023.
  39. Wittenbrink, B.; Judd, C. M.; Park, B. (1997). "Evidence for racial prejudice at the implicit level and its relationship with questionnaire measures". Journal of Personality and Social Psychology. 72 (2): 262–274. CiteSeerX 10.1.1.462.7827. doi:10.1037/0022-3514.72.2.262.
  40. Gaertner, S. L.; McLaughlin, J. P. (1983). "Racial stereotypes: Associations and ascriptions of positive and negative characteristics". Social Psychology Quarterly. 46 (1): 23–30. doi:10.2307/3033657. JSTOR 3033657.
  41. Devine, P. G. (1989). "Stereotypes and prejudice: Their automatic and controlled components". Journal of Personality and Social Psychology. 56: 5–18. doi:10.1037/0022-3514.56.1.5.
  42. Hohman, Zachary P.; Gaffney, Amber M.; Hogg, Michael A. (September 2017). "Who Am I If I Am Not like My Group? Self-Uncertainty and Feeling Peripheral in a Group". Journal of Experimental Social Psychology. 72: 125–132. doi:10.1016/j.jesp.2017.05.002.
  43. Aronson, E., Wilson, T. D., & Akert, R. (2010). Social psychology. 7th ed. Upper Saddle River: Prentice Hall.
  44. Taylor, Donald M.; Doria, Janet R. (April 1981). "Self-serving and group-serving bias in attribution". Journal of Social Psychology. 113 (2): 201–211. doi:10.1080/00224545.1981.9924371. ISSN 0022-4545.
  45. Dunham, Y.; Baron, A. S.; Banaji, M. R. (2008). "The Development of Implicit Intergroup Cognition". Trends in Cognitive Sciences. 12 (7): 248–253. doi:10.1016/j.tics.2008.04.006. PMID 18555736.
  46. Greenwald, A. G.; Krieger, L. H. (2006). "Implicit Bias: Scientific Foundations". California Law Review. 94 (4): 945–967. doi:10.2307/20439056. hdl:10125/66105. JSTOR 20439056.
  47. Reskin, B (2000). "The Proximate Causes of Employment Discrimination". Contemporary Sociology. 29 (2): 319–328. doi:10.2307/2654387. JSTOR 2654387.
  48. De Dreu, Carsten K.W. (2012). "Oxytocin modulates cooperation within and competition between groups: An integrative review and research agenda". Hormones and Behavior. 61 (3): 419–428. doi:10.1016/j.yhbeh.2011.12.009. PMID 22227278.
  49. Strangor, Charles; Leary, Scott P. (2006). Intergroup beliefs: Investigations from the social side. Advances in Experimental Social Psychology. 38. pp. 243–281. doi:10.1016/S0065-2601(06)38005-7. ISBN 9780120152384.
  50. Brewer, Marilynn B. (2002). "The psychology of prejudice: Ingroup love or outgroup hate?". Journal of Social Issues. 55 (3): 429–444. doi:10.1111/0022-4537.00126.
  51. Reskin, B. (2005). Unconsciousness Raising. Regional Review, 14(3), 32–37.
  52. Clark, Kenneth B.; Clark, Mamie P. (1950). "Emotional Factors in Racial Identification and Preference in Negro Children". The Journal of Negro Education. 19 (3): 341–350. doi:10.2307/2966491. JSTOR 2966491.
  53. Ma-Kellams, Christine; Spencer-Rodgers, Julie; Peng, Kaiping (2011). "I Am Against Us? Unpacking Cultural Differences in Ingroup Favoritism via Dialecticism". Personality and Social Psychology Bulletin. 37 (1): 15–27. doi:10.1177/0146167210388193. PMID 21084525.
  54. Gross, E. F.; Hardin, C. D. (2007). "Implicit and explicit stereotyping of adolescents". Social Justice Research. 20 (2): 140–160. CiteSeerX 10.1.1.514.3597. doi:10.1007/s11211-007-0037-9.
  55. Chopik, William J.; Giasson, Hannah L. (2017-08-01). "Age Differences in Explicit and Implicit Age Attitudes Across the Life Span". The Gerontologist. 57 (suppl_2): S169–S177. doi:10.1093/geront/gnx058. ISSN 1758-5341. PMC 5881761. PMID 28854609.
  56. Agerström, J.; Rooth, D. (2011). "The role of automatic obesity stereotypes in real hiring discrimination". Journal of Applied Psychology. 96 (4): 790–805. doi:10.1037/a0021594. PMID 21280934.
  57. Schwartz, M. B.; Vartanian, L. R.; Nosek, B. A.; Brownell, K. D. (2006). "The Influence of One's Own Body Weight on Implicit and Explicit Anti-fat Bias". Obesity. 14 (3): 440–447. doi:10.1038/oby.2006.58. PMID 16648615.
  58. Carlsson, Rickard; Björklund, Fredrick (2010). "Implicit stereotype content: mixed stereotypes can be measured with the implicit association test". Social Psychology. 41 (4): 213–222. doi:10.1027/1864-9335/a000029.
  59. "Understanding Implicit Bias". kirwaninstitute.osu.edu. Retrieved 2018-04-06.
  60. Banaji, M. R.; Hardin, C.; Rothman, A. J. (1993). "Implicit stereotyping in person judgment". Journal of Personality and Social Psychology. 65 (2): 272–281. doi:10.1037/0022-3514.65.2.272.
  61. "Understanding Implicit Bias". American Federation of Teachers. 2015-12-16. Retrieved 2018-04-06.
  62. Calanchini, J., Lai, C. K., & Klauer, K. C. (2020). Reducing implicit racial preferences: III. A process-level examination of changes in implicit preferences. Journal of Personality and Social Psychology. https://doi.org/10.1037/pspi0000339
  63. Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. L. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of Experimental Social Psychology, 48(6), 1267–1278. https://doi.org/10.1016/j.jesp.2012.06.003
  64. "Long-term reduction in implicit race bias: A prejudice habit-breaking inter...: Discovery Service for Loyola Marymount Univ". eds.a.ebscohost.com. Retrieved 2018-04-10.
  65. Burns, Mason D.; Monteith, Margo J.; Parker, Laura R. (2017). "Training away bias: The differential effects of counterstereotype training and self-regulation on stereotype activation and application". Journal of Experimental Social Psychology. 73: 97–110. doi:10.1016/j.jesp.2017.06.003.
  66. Sinclair, L.; Kunda, Z. (1999). "Reactions to a Black professional: Motivated inhibition and activation of conflicting stereotypes". Journal of Personality and Social Psychology. 77 (5): 885–904. doi:10.1037/0022-3514.77.5.885. PMID 10573871.
  67. Stangor, C.; Sechrist, G. B.; Jost, J. T. (2001). "Changing racial beliefs by providing consensus information". Personality and Social Psychology Bulletin. 27 (4): 486–496. CiteSeerX 10.1.1.297.266. doi:10.1177/0146167201274009.
  68. "Training away bias: The differential effects of counterstereotype training ...: Discovery Service for Loyola Marymount Univ". eds.a.ebscohost.com. Retrieved 2018-04-10.
  69. Dasgupta, N.; Asgari, S. (2004). "Seeing is believing: Exposure to counterstereotypic women leaders and its effect on the malleability of automatic gender stereotyping". Journal of Experimental Social Psychology. 40 (5): 642–658. doi:10.1016/j.jesp.2004.02.003.
  70. Macrae, C.; Bodenhausen, G. V.; Milne, A. B.; Thorn, T. J.; Castelli, L. (1997). "On the activation of social stereotypes: The moderating role of processing objectives". Journal of Experimental Social Psychology. 33 (5): 471–489. doi:10.1006/jesp.1997.1328.
  71. Macrae, C.; Bodenhausen, G. V.; Milne, A. B. (1995). "The dissection of selection in person perception: Inhibitory processes in social stereotyping". Journal of Personality and Social Psychology. 69 (3): 397–407. doi:10.1037/0022-3514.69.3.397. PMID 7562387.
  72. Macrae, C.; Mitchell, J. P.; Pendry, L. F. (2002). "What's in a forename? Cue familiarity and stereotypical thinking". Journal of Experimental Social Psychology. 38 (2): 186–193. doi:10.1006/jesp.2001.1496.
  73. "Explicit Reasons, Implicit Stereotypes and the Effortful Control of the Min...: Discovery Service for Loyola Marymount Univ". eds.a.ebscohost.com. Retrieved 2018-04-10.
  74. Rubinstein, Rachel; Jussim, Lee (2018-03-01). "Reliance on individuating information and stereotypes in implicit and explicit person perception". Journal of Experimental Social Psychology. 75: 54–70. doi:10.1016/j.jesp.2017.11.009.
  75. Forscher, Patrick; Lai, Calvin; Axt, Jordan R.; Ebersole, Charles R.; Herman, Michelle; Devine, Patricia; Nosek, Brian (5 May 2016). "A Meta-Analysis of Change in Implicit Bias". Department of Psychology, University of Wisconsin. Retrieved 2 September 2018.
  76. Oswald, Frederick; Mitchell, Gregory; Blanton, Hart; Jaccard, James; Tetlock, Philip (17 June 2013). "Predicting Ethnic and Racial Discrimination: A Meta-Analysis of IAT Criterion Studies". Journal of Personality and Social Psychology. 105 (2): 171–192. doi:10.1037/a0032734. PMID 23773046. Retrieved 2 September 2018.
  77. Mac Donald, Heather (9 October 2017). "The False 'Science' of Implicit Bias". Wall Street Journal. Retrieved 2 September 2018.
  78. Greenwald, Anthony G.; Banaji, Mahzarin R.; Nosek, Brian A. (2015). "Statistically small effects of the Implicit Association Test can have societally large effects". Journal of Personality and Social Psychology. 108 (4): 553–561. doi:10.1037/pspa0000016. ISSN 1939-1315. PMID 25402677.
  79. Fazio; Jackson; Dunton; Williams. "Implicit Bias". Archived from the original on 2013-10-23.
  80. Gawronski, B., Morrison, M., Phills, C. E., & Galdi, S. (2017). Temporal Stability of Implicit and Explicit Measures: A Longitudinal Analysis. Personality and Social Psychology Bulletin, 43(3), 300–312. https://doi.org/10.1177/0146167216684131
  81. Gawronski, B., Ledgerwood, A., & Eastwick, P. (2020). Implicit Bias and Anti-Discrimination Policy. Policy Insights from the Behavioral and Brain Sciences.