
A new algorithm can identify your sexuality by photo analysis


A recent breakthrough in Artificial Intelligence (AI) suggests that algorithms can analyse photographs of faces and infer the sexuality of both men and women.

The research, published in the Journal of Personality and Social Psychology, was carried out by researchers at Stanford University.

The study found that when the algorithm was shown photographs of one heterosexual and one homosexual person, it correctly identified which was which 83% of the time for women and a striking 91% of the time for men.

It's not the first time AI has been used to analyse images and draw conclusions autonomously. And although the algorithm is not accurate on every occasion, the study found that human judges' conclusions were not 'much more accurate than random guesses'.
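The "83% of the time" figure describes a pairwise test: the algorithm is shown one photograph of a heterosexual person and one of a homosexual person and must say which is which. As a purely illustrative sketch (with made-up scores, not the researchers' data or code), that kind of metric can be computed like this:

```python
# Illustrative only: a toy computation of "pairwise accuracy" -- the chance
# that a classifier's score ranks a randomly chosen example from one group
# above a randomly chosen example from the other. All numbers are invented.
from itertools import product

# Hypothetical classifier scores (higher = more confident in the target label)
positive_scores = [0.9, 0.7, 0.65, 0.4]   # examples the classifier should rank higher
negative_scores = [0.8, 0.3, 0.2, 0.1]    # examples it should rank lower

pairs = list(product(positive_scores, negative_scores))
correct = sum(1 for p, n in pairs if p > n) + 0.5 * sum(1 for p, n in pairs if p == n)
pairwise_accuracy = correct / len(pairs)

print(f"Pairwise accuracy: {pairwise_accuracy:.2%}")  # prints "Pairwise accuracy: 81.25%"
```

On this toy data the classifier ranks the pairs correctly 81.25% of the time; the Stanford figures of 83% and 91% are the same kind of measurement, but on real photographs.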

Using a sample of 35,000 portrait images of men and women, the study's authors, Yilun Wang and Michal Kosinski, also applied their method to a similar sample of Facebook profile pictures.

The findings were similar to those of the original research. The authors reported that the accuracy was "comparable with the accuracy of spectroscope at detecting breast cancer (88%) or modern diagnostic tools for Parkinson's disease (90%)".

The study has faced major criticism, however, over its privacy implications, raising further social and moral concerns.

In an interview with Motherboard (VICE), a cyber-security researcher stated that "even if we accept the paper's premise that someone can appear visually queer, then the paper still has major ethical issues around participant consent and the overall aim of the research".

In response to the concerns about privacy and the use of the photos, the authors wrote that they "used widely available off-the-shelf tools, publicly available data, and standard methods well known to computer vision practitioners".

They further commented that they “did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats”.



