GUARDIAN: "Face-Reading AI Will be Able to Detect Your Politics and IQ, Professor Says"
09/14/2017

From The Guardian:

Face-reading AI will be able to detect your politics and IQ, professor says

Professor whose study suggested technology can detect whether a person is gay or straight says programs will soon reveal traits such as criminal predisposition

Your photo could soon reveal your political views, says a Stanford professor.

Sam Levin in San Francisco

Tuesday 12 September 2017 03.00 EDT

Voters have a right to keep their political beliefs private. But according to some researchers, it won’t be long before a computer program can accurately guess whether people are liberal or conservative in an instant. All that will be needed are photos of their faces.

Michal Kosinski – the Stanford University professor who went viral last week for research suggesting that artificial intelligence (AI) can detect whether people are gay or straight based on photos – said sexual orientation was just one of many characteristics that algorithms would be able to predict through facial recognition.

Using photos, AI will be able to identify people’s political views, whether they have high IQs, whether they are predisposed to criminal behavior, whether they have specific personality traits and many other private, personal details that could carry huge social consequences, he said.

Someday computers will be so terrifyingly advanced that they will be able to stereotype this guy just from his photo!

Kosinski outlined the extraordinary and sometimes disturbing applications of facial detection technology that he expects to see in the near future, raising complex ethical questions about the erosion of privacy and the possible misuse of AI to target vulnerable people.

… With Kosinski’s “gaydar” AI, an algorithm used online dating photos to create a program that could correctly identify sexual orientation 91% of the time with men and 83% with women, just by reviewing a handful of photos.
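To make the mechanics concrete, here is a minimal sketch of how a photo-based classifier of this general kind is usually built: a pretrained network turns each face photo into a numeric embedding vector, and a simple logistic-regression model is then fit on those vectors to predict a binary label. Everything in the snippet is illustrative — the embeddings and labels are random stand-ins and the dataset size is hypothetical — so this is not Kosinski's actual pipeline, only a sketch of the standard approach.

```python
# Minimal sketch (not the study's code): fit a logistic-regression
# classifier on precomputed face-embedding vectors and score it.
# The embeddings below are random stand-ins; a real attempt would
# extract them from photos with a pretrained face-recognition network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_people, n_features = 2000, 128               # hypothetical dataset size
X = rng.normal(size=(n_people, n_features))    # stand-in face embeddings
y = rng.integers(0, 2, size=n_people)          # stand-in binary labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]
print("Test AUC:", roc_auc_score(y_test, scores))  # ~0.5 on random data
```

On real embeddings the classifier's score would reflect whatever signal (grooming, glasses, pose, photo style) happens to correlate with the label, which is exactly the point of contention in the backlash described below.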

Kosinski’s research is highly controversial, and faced a huge backlash from LGBT rights groups, which argued that the AI was flawed and that anti-LGBT governments could use this type of software to out gay people and persecute them.

In other words, stereotyping well-groomed men with glittering eyes as more likely to be gay is both empirically absurd and dangerous to the many well-groomed men with glittering eyes who are gay.

Someday, artificial intelligence will have progressed to the dystopian extreme where its assessments of individuals are almost as accurate as those of an Italian grandmother.

By the way, a friend of mine likes to tell waitresses at expensive restaurants, after two minutes of chatting about today’s specials, that he thinks he can guess their SAT scores.

He’s pretty accurate at it.

Of course, he personally has a high four-digit SAT score himself and has earned an eight- or nine-figure net worth in the tech business. Could he program a computer to guess as well as he can? Probably not quite, but he’s also extremely good at systematizing his intuitions, so he’d likely come a lot closer than you or I would.

[Comment at Unz.com]