LGBTQ Groups Condemn "Dangerous And Flawed" Facial Recognition Intended To Predict Your Sexuality


James Felton 12 Sep 2017, 15:51

LGBTQ groups have condemned as "dangerous" an algorithm developed by Stanford University to predict whether you are gay or straight based on your face.

The Stanford researchers claim the technology, which uses facial recognition, can distinguish between gay and straight men 81 percent of the time, and between gay and straight women 74 percent of the time. Several prominent LGBTQ groups have issued a joint statement calling the research "dangerous and flawed" as well as "junk science".

The main concern is that the technology could be used to cause "harm to LGBTQ people around the world"; the groups also raised problems with the quality of the research itself.


The Stanford researchers gathered 35,000 photos that had been publicly posted on a US dating website and analyzed them using a "deep neural network", an AI system that extracts visual features from each image. The algorithm was fed the self-reported orientation of the people in the photographs and was asked to predict sexuality based on the photos alone.
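The general approach described above is common in machine learning: a deep network converts each photo into a numeric feature vector (an "embedding"), and a simple classifier is then trained on those vectors against self-reported labels. A minimal sketch of that pipeline, using randomly generated stand-in embeddings rather than any real face data (the variable names and the injected class signal are illustrative assumptions, not the authors' code):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for face embeddings: 2,000 samples of 128
# features each, with a weak class signal injected along a few dimensions.
n_samples, n_features = 2000, 128
X = rng.normal(size=(n_samples, n_features))
y = (rng.random(n_samples) < 0.5).astype(int)
X[y == 1, :5] += 1.0  # make the two classes slightly separable

# Train a simple linear classifier on the embeddings.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# AUC measures how often a random positive example is ranked above a
# random negative one (0.5 = chance, 1.0 = perfect separation).
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC on held-out embeddings: {auc:.2f}")
```

Note that a headline accuracy figure for a classifier like this depends heavily on how the evaluation is set up, which is one reason critics questioned how the study's percentages would translate outside its dataset.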

The research, published in the Journal of Personality and Social Psychology, found that the AI was better than humans at recognizing whether someone was heterosexual or not. Humans, the researchers found, could identify orientation around 54 percent of the time for women and 61 percent of the time for men. 

The authors say that when the algorithm is given five photos of people to review, the accuracy goes up to 83 percent for women and 91 percent for men. 

However, LGBTQ groups and other commentators have said that this research is not only flawed but has the potential to be horribly misused.

“At a time where minority groups are being targeted, these reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous,” Jim Halloran, GLAAD’s Chief Digital Officer, said in a statement.

The groups are concerned that such technology, whether it's accurate or not, could be used by brutal regimes to persecute gay people or people they suspect of being gay.

The researchers responded with a statement of their own: "GLAAD and HRC representatives’ knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organizations strive to advocate."
