
AI Shows US Stereotypes About Women And Ethnic Minorities Have Changed For The Better

Madison Dapcevich

Freelance Writer and Fact-Checker

Madison is a freelance science reporter and full-time fact-checker based in the wild Rocky Mountains of western Montana.

A Stanford team used special algorithms to detect the evolution of gender and ethnic biases among Americans from 1900 to the present. whiteMocca/Shutterstock

It’s no secret that the words we use can change the way people perceive us. Certain slang can signal someone’s age or demographic, while our vocabulary might hint at a level of intelligence. Following that same train of thought, researchers at Stanford University have used artificial intelligence (AI) to track how stereotypes in the US have changed over the last century.

For the most part, the study, published in PNAS, found that society has shifted toward more positive associations with women and some ethnic groups.

The technique is known as word embeddings. By analyzing databases of American books, newspapers, and other large text collections, the algorithm assigns each word a geometric vector, effectively a point in space, and then maps the relationships and associations between words to measure changes in the way we talk about each other.

For example, previous research indicated that the word “honorable” sat closer to the word “man” than to the word “woman”. This kind of machine learning lets computers analyze large quantities of text and find patterns, in this case revealing which stereotypes still apply today and which have faded into the past.
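As a rough illustration of how an association like “honorable is closer to man than to woman” can be measured, here is a minimal Python sketch using NumPy. The four-dimensional vectors below are made up purely for illustration (real embeddings are learned from a corpus and have hundreds of dimensions), and the cosine-similarity comparison is an assumption about the general approach, not the Stanford team’s actual code or data.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two word vectors (1.0 = same direction)."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical toy embeddings; real models learn these vectors from decades of text.
vectors = {
    "honorable": np.array([0.9, 0.1, 0.3, 0.2]),
    "man":       np.array([0.8, 0.2, 0.4, 0.1]),
    "woman":     np.array([0.3, 0.9, 0.1, 0.5]),
}

# Which gender word does the adjective sit closer to in the embedding space?
sim_man = cosine_similarity(vectors["honorable"], vectors["man"])
sim_woman = cosine_similarity(vectors["honorable"], vectors["woman"])

print(f"honorable vs man:   {sim_man:.3f}")
print(f"honorable vs woman: {sim_woman:.3f}")
print(f"association gap (man - woman): {sim_man - sim_woman:.3f}")
```

Repeating a comparison like this across embeddings trained on text from different decades is, in spirit, how shifts in association can be tracked over time.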

Over the last century, changes in the way gender and ethnicity are written about line up with major social movements and with demographic changes recorded in the US Census. The researchers found “quantifiable shifts” in the portrayal of women and of ethnic groups.

Take women, for example. The researchers say perspectives have changed mostly for the better over time. Adjectives like “intelligent,” “logical,” and “thoughtful” were more associated with men in the first half of the 20th century, but since the 1960s women’s movement they have become increasingly associated with women. Although it’s worth noting that a gap still remains.

A noticeable shift has also occurred in how Asians and Asian Americans are described. In 1910, words like “barbaric,” “monstrous,” and “cruel” were most closely associated with Asian last names. Following increases in immigration in the 1960s and 1980s, by the 1990s these had been replaced by words like “inhibited,” “passive,” and “sensitive”.

“The starkness of the change in stereotypes stood out to me,” said author Nikhil Garg in a statement. “When you study history, you learn about propaganda campaigns and these outdated views of foreign groups. But how much the literature produced at the time reflected those stereotypes was hard to appreciate.”


ARTICLE POSTED IN

Technology
  • Stanford
  • AI
  • machine learning
  • word embeddings
