Is AI judging your personality?

From TheUSBreakingNews:

AirBnB wants to know if you have a “Machiavellian” personality before renting you a beach house.

The company may be using software to judge whether you’re trustworthy enough to rent a house based on what you post on Facebook, Twitter and Instagram.

They’ll turn the systems loose on social, run the algorithms and get results. For the people on the other end of this process, there will be no transparency into the process — no knowledge whatsoever — and no appeals process.

The company owns a patent on technology designed to rate the “personalities” of prospective guests by analyzing their social media activity to decide if they’re a risky guest who might damage a host’s home.

. . . .

The end product of their technology is to assign every AirBnB guest a “trustworthiness score.” This will reportedly be based not only on social media activity, but on other data found online, including blog posts and legal records.

The technology was developed by Trooly, which AirBnB acquired three years ago. Trooly created an AI-based tool designed to “predict trustworthy relationships and interactions,” and which uses social media as one data source.

The software builds the score from perceived “personality traits” it identifies, including some you could predict – awareness, openness, extraversion, kindness – and some stranger ones – “narcissism” and “Machiavellianism,” for example. (Interestingly, the software also looks for involvement in civil litigation, suggesting that now or in the future the company could ban people based on the prediction that they are more likely to sue.)
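The patented scoring method itself is not public, but the description above implies some aggregation of per-trait scores, with “dark triad” traits pulling the score down. A minimal sketch of what such an aggregation could look like, with entirely hypothetical trait names and weights:

```python
# Hypothetical sketch only: the actual AirBnB/Trooly scoring method is not
# public. Trait names and weights here are illustrative assumptions.
TRAIT_WEIGHTS = {
    "openness": 0.15,
    "conscientiousness": 0.30,
    "extraversion": 0.05,
    "agreeableness": 0.25,
    "neuroticism": -0.15,
    "narcissism": -0.35,        # "dark triad" traits pull the score down
    "machiavellianism": -0.40,
}

def trustworthiness(traits: dict) -> float:
    """Weighted sum of per-trait scores (each 0..1), rescaled to 0..1."""
    raw = sum(w * traits.get(t, 0.0) for t, w in TRAIT_WEIGHTS.items())
    worst = sum(w for w in TRAIT_WEIGHTS.values() if w < 0)
    best = sum(w for w in TRAIT_WEIGHTS.values() if w > 0)
    return (raw - worst) / (best - worst)

print(trustworthiness({"conscientiousness": 1.0, "agreeableness": 1.0}))
```

Even in this toy form, the article’s concern is visible: the guest never sees the weights, the trait estimates, or the resulting score, and has no way to contest any of them.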

AirBnB has not said whether or not it uses the software.

Link to the rest at TheUSBreakingNews

From Stanford Engineering:

Computers can judge personality traits far more precisely than ever believed, according to newly published research.

In fact, they might do so better than one’s friends and colleagues. The study, published Jan. 12 and conducted jointly by researchers at Stanford University and the University of Cambridge, compares the ability of computers and people to make accurate judgments about our personalities. People’s judgments were based on their familiarity with the judged individual, while the computer used digital signals – Facebook “likes.”

. . . .

According to Kosinski, the findings reveal that by mining a person’s Facebook “likes,” a computer was able to predict a person’s personality more accurately than most of the person’s friends and family. Only a person’s spouse came close to matching the computer’s results.

The computer predictions were based on which articles, videos, artists and other items the person had liked on Facebook. The idea was to see how closely a computer prediction could match the subject’s own scores on the five most basic personality dimensions: openness, conscientiousness, extraversion, agreeableness and neuroticism.
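The approach described above is essentially supervised regression: each user is represented as a vector of likes, and a linear model per trait is fit against self-reported questionnaire scores. A minimal sketch on synthetic data (the real study used tens of thousands of volunteers; the like matrix, weights, and noise here are fabricated for illustration):

```python
# Kosinski-style sketch: predict a Big Five trait from a binary "likes"
# matrix with ridge regression. All data below is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_likes = 200, 50

# Each row is one user; 1 means the user liked that page/item.
likes = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)

# Synthetic "ground truth": openness depends noisily on a few likes.
true_w = np.zeros(n_likes)
true_w[:5] = [0.8, -0.6, 0.5, 0.7, -0.4]
openness = likes @ true_w + rng.normal(0, 0.3, n_users)

# Ridge regression, closed form: w = (X'X + lam*I)^-1 X'y
lam = 1.0
w = np.linalg.solve(likes.T @ likes + lam * np.eye(n_likes),
                    likes.T @ openness)

# The study measured accuracy as the correlation between predicted
# and self-reported trait scores.
pred = likes @ w
corr = np.corrcoef(pred, openness)[0, 1]
print(f"prediction/self-report correlation: {corr:.2f}")
```

The striking finding was not the modeling technique, which is standard, but that a passively collected signal like page likes carries enough information to rival judgments by friends and family.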

The researchers noted, “This is an emphatic demonstration of the ability of a person’s psychological traits to be discovered by an analysis of data, not requiring any person-to-person interaction. It shows that machines can get to know us better than we’d previously thought, a crucial step in interactions between people and computers.”

. . . .

“A future with our habits being an open book may seem dystopian to those who worry about privacy,” they wrote.

Kosinski said, “We hope that consumers, technology developers and policymakers will tackle those challenges by supporting privacy-protecting laws and technologies, and giving the users full control over their digital footprints.”

Link to the rest at Stanford Engineering

PG notes that if something can be done, eventually it will be done by someone.

1 thought on “Is AI judging your personality?”

  1. Eh, I don’t see this lasting very long. Just until someone gets hold of the software, runs it over politicians, and publishes the results.
