Prejudice Can Actually Change How You View Faces

The way we see faces can sometimes be clouded by deeply ingrained prejudices. Ollyy/Shutterstock

What you see when you look at someone may not always be a fair representation of that person, according to a new study published in the journal Nature Neuroscience. The research examines how our deepest prejudices – including those we wish we didn't have – can cloud the way our brains process visual stimuli when we observe faces.

After subjecting participants to a number of tests designed to reveal their instinctive interpretations of other people's faces – as well as the neurological activity underlying these interpretations – the study authors found that many people automatically classify black faces as "angry" and female faces as "happy", even when this is not the case. Asian faces, meanwhile, tend to be viewed as female – and therefore happy – when glimpsed for just a few milliseconds, regardless of their actual gender.

Based on these findings, the researchers from New York University concluded that the "stereotypes we have learned can change how we visually process another person, [and] this kind of visual stereotyping may only serve to reinforce and possibly exacerbate the biases that exist in the first place."

In the multicultural, cosmopolitan Western world, we like to think of ourselves as progressive free-thinkers, unconstrained by irrational bigotry or prejudice. Yet it is hard to deny that underlying stereotypes still linger throughout society, influencing the ways in which many people assess and interact with others.

These stereotypes apply to a wide spectrum of ethnic, gender, and economic demographics, and while most of us are broadminded enough to engage our rationality and reject these as mere clichés, little is known about how they hijack our unconscious cognitive processes.

The fusiform gyrus is thought to be responsible for assigning social categories to faces. Gray, vectorized by Mysid, colored by was_a_bee via Wikimedia Commons

To investigate this, the researchers showed volunteers a sequence of faces representing a range of different races and genders, while also depicting several emotional states such as angry or happy.

Using a computer mouse, participants were asked to immediately click on the correct description of each face, without taking any time to think about it. The researchers used a mouse-tracking technique to measure the hand movements of each participant in the first few hundred milliseconds after viewing each picture.

In doing so, they were able to note which descriptions each person instinctively moved towards, before their rational judgment kicked in and they selected the correct label.

At the same time, the team used functional magnetic resonance imaging (fMRI) to monitor the brain activity of each participant, focusing particularly on a brain region called the fusiform gyrus (FG). Previous research has shown that this part of the brain plays a role in distinguishing the social categories of faces, although some scholars have suggested that it may be influenced by deep-seated prejudices and stereotypes, causing it to often misrepresent these faces.

The study authors explain how the mouse-tracking tests revealed a number of common stereotype-driven errors. These mistakes were mirrored by the fMRI scans, which showed that when people viewed black faces, for instance, the activity patterns in their FG were similar to those seen when the same people viewed angry faces.

Commenting on this finding, study co-author Jonathan Freeman explained that “many individuals have ingrained stereotypes that associate men as being more aggressive, women as being more appeasing, or Black individuals as being more hostile – though they may not endorse these stereotypes personally.” Intriguingly, the results of this study would appear to suggest that “these sorts of stereotypical associations can shape the basic visual processing of other people, predictably warping how the brain 'sees' a person's face.”
