AI technology can now identify signs of sexuality - but do we want it to?
Controversial new study deduces the sexuality of people on a dating site with up to 91% accuracy
Researchers from Stanford University have unveiled a computer program that can infer sexual orientation by analysing people’s faces.
The Journal of Personality and Social Psychology study – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% of the time for women – “has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes,” says The Guardian.
The researchers, Michal Kosinski and Yilun Wang, suggest the software does this by picking up on subtle differences in facial structure. The programme relied on 130,741 images of 36,630 men and 170,360 images of 38,593 women downloaded from an American dating website, which makes its profiles public.
The images were then fed into facial-recognition software called VGG-Face, which turned each image into a long string of numbers representing that person: their “faceprint”.
The programme then used a predictive model to find correlations between the “faceprints” and their owners’ sexuality. When the resulting model was run on data which the AI had not seen before, “it far outperformed humans at distinguishing between gay and straight faces,” says The Economist.
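The two-stage pipeline described above can be sketched in a few lines. This is a hedged illustration, not the study’s code: random vectors stand in for the VGG-Face “faceprints”, the labels are synthetic, and the predictive model is a hand-written logistic regression (a simple classifier of the kind the researchers describe), kept dependency-light with NumPy only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "faceprints": 200 people, 16-dimensional embeddings.
# (Real VGG-Face descriptors are far longer; small dims keep this quick.)
X = rng.normal(size=(200, 16))
true_w = rng.normal(size=16)          # hidden structure the model must find
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

# Stage two: fit a logistic-regression model by gradient descent --
# the "predictive model" that maps a faceprint to a probability.
w = np.zeros(16)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w)))    # predicted probabilities
    w -= 0.1 * (X.T @ (p - y)) / len(y)

# Score embeddings the model has not seen before; accuracy beats
# chance because the synthetic labels genuinely depend on the vectors.
X_test = rng.normal(size=(100, 16))
y_test = (X_test @ true_w > 0).astype(float)
pred = (1 / (1 + np.exp(-(X_test @ w))) > 0.5).astype(float)
accuracy = float((pred == y_test).mean())
print(round(accuracy, 2))
```

The key design point the sketch preserves is that the classifier never sees pixels, only the fixed-length numeric summary of each face.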
Human judges accurately identified orientation only 61% of the time for men and 54% for women.
“When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women,” says The Economist.
Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the researchers wrote.
But the study has its limitations, say the researchers. Outside the lab the accuracy rate would be much lower, because the headline 91% figure applies only to a forced-choice test in which exactly one of two men is known to be gay. When the ratio was closer to that of the real world, in which roughly seven out of 100 men are gay, the programme struggled to predict with the same level of accuracy.
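The caveat above is a classic base-rate problem, and the arithmetic is easy to check. Assuming, purely for illustration, a screening classifier that is right 91% of the time on both gay and straight faces (hypothetical round numbers echoing the pairwise figure; the paper’s own screening results differ), a prevalence of seven in 100 means most positive flags would still be false:

```python
# Illustrative base-rate arithmetic for the caveat above.
sensitivity = 0.91   # assumed true-positive rate (hypothetical)
specificity = 0.91   # assumed true-negative rate (hypothetical)
prevalence = 0.07    # roughly 7 out of 100 men, per the article

true_pos = sensitivity * prevalence            # correctly flagged
false_pos = (1 - specificity) * (1 - prevalence)  # wrongly flagged
precision = true_pos / (true_pos + false_pos)

print(round(precision, 2))  # → 0.43: under half of flags are correct
```

This is why a system can look impressive on balanced lab pairs yet misidentify most of the people it flags in a realistic population.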
Despite this, “if the goal is to pick a small number of people who are very likely to be gay out of a large group, the system appears able to do so,” says The Economist.
“The point is not that Kosinski and Wang have created software which can reliably determine gay from straight. That was not their goal. Rather, they have demonstrated that such software is possible,” adds the magazine.
Alarmingly, with billions of facial images of people stored on social media sites and in government databases, Kosinski and Wang also believe that “public data could be used to detect people’s sexual orientation without their consent”.
‘Do we want to know?’
The research has unsurprisingly prompted much controversy, with some saying it seems to enforce a binary concept of sexuality.
Others have argued that building and publicising this kind of software is itself controversial, given concerns that it could encourage harmful applications.
“Governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations,” says The Guardian.
“With the right data sets,” Kosinski says, “similar AI systems might be trained to spot other intimate traits, such as IQ or political views.”
Brian Brackeen, CEO of the facial recognition company Kairos, agrees, saying: “AI can tell you anything about anyone with enough data. The question is, as a society, do we want to know?”
Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, described the development as “certainly unsettling”.
“Like any new tool, if it gets into the wrong hands, it can be used for ill purposes,” he said.
But Rule argued it was still important to develop and test this technology: “What the authors have done here is to make a very bold statement about how powerful this can be… Now we know that we need protections.”