
You are what you ‘like’, according to new Facebook personality test

It’s listening, but can it really hear me? Warner Bros

Pretty much all of us work with computers these days. The problem with computers is that while they can complete complex calculations and recall distant details, they’re terrible at sensing how you’re feeling. And that can make them very frustrating.

Machines’ lack of a capacity for empathy is one thing that separates them from humans. It’s also one of the barriers to human-machine communication. For a machine to respond correctly to someone’s emotional state, it must understand the personality of the person involved. Personality pervades our communication, to the extent that the same smile can mean different things.

Sherlock Holmes, that great fictional investigator, was an expert at inferring someone’s history and personality from minor scraps of evidence. It’s perhaps reassuring that Holmes’ insight, and the likes of the television psychologists in Cracker or Lie To Me, exist only in fiction.

However, a recent paper by researchers from the University of Cambridge and Stanford University, published in the Proceedings of the National Academy of Sciences in the US, describes a computer program capable of making accurate guesses about someone’s personality type. Its predictions turned out to be more accurate than those of the subject’s employer, friends, and even family. This isn’t the first achievement of its kind, but it is the first to have compared people’s self-reported personality types against the judgements of external judges, both human and artificial.

The raw material in this case was the Facebook profiles of 86,220 volunteers. Their “likes” were examined to form a view of each subject’s personality, and the results were compared with the volunteers’ own responses to a questionnaire measuring the “big five” personality traits: extroversion, openness, agreeableness, conscientiousness and neuroticism. Friends of 17,622 of the volunteers also judged their personalities as a comparison.

With a baseline of about 60 likes, the algorithm was able to put forward a reasonable guess; with 300 likes or more, it tended to make more accurate guesses than most people. In many cases the machine was better at identifying likely depression or impulsivity than the people who actually knew the subjects in real life.
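The general approach – representing each person as a row of likes and fitting a model against their questionnaire scores – can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration using made-up data and off-the-shelf scikit-learn components; it conveys the idea of predicting a trait score from like patterns, not the authors’ actual pipeline or results.

```python
# Hypothetical sketch: predicting a "big five" trait score from a user x like matrix.
# All data here is randomly generated; shapes and model choices are illustrative only.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_users, n_likes = 1000, 5000
likes = (rng.random((n_users, n_likes)) < 0.01).astype(float)  # 1 = user liked this page
openness = rng.normal(size=n_users)                            # stand-in self-reported trait score

model = make_pipeline(
    TruncatedSVD(n_components=100, random_state=0),  # compress sparse like patterns
    Ridge(alpha=1.0),                                # linear fit against the trait
)
model.fit(likes[:800], openness[:800])

predicted = model.predict(likes[800:])
# Accuracy in this kind of study is typically reported as the correlation between
# predicted and self-reported scores; more likes per user generally means a better fit.
print(np.corrcoef(predicted, openness[800:])[0, 1])
```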

While this is a significant improvement in terms of computers comprehending the unspoken and unwritten aspects of human communication, it does raise some interesting ethical questions.

Naturally speaking

Mattersight is a company that provides behavioural analytics software. What does that mean? When a customer phones a call centre, the software listens to them during the call; the next time that customer phones, it uses their previous tone of voice and other verbal cues to route the call to an operator whose style is similar to the caller’s. Talking to someone like you has been shown to increase sales and reduce complaints. This isn’t just true of conversations between people, but of conversations with machines too.

As emotion is naturally carried in our voices, computerised voice technology is one of the areas where machines are being adapted to demonstrate it. With a plotline in the television series The Big Bang Theory about dating Apple’s talking assistant Siri, and the film Her, in which a man falls in love with the artificial intelligence of his computer, people seem ready to accept that speech technology will be emotionally responsive.

Microsoft’s Cortana (named after the AI from the Halo video game) is an attempt to leapfrog Apple’s Siri and other competing speaking personal assistants such as Jibo and Amazon’s Echo. By mining information about you from the phone and linked services such as social media websites it seeks to become more compatible with you.

A survey revealed 65% of owners use Siri regularly, so the technical difficulties of voice recognition seem to have been mastered. People with emotional attachments to their devices seem to be more forgiving of imperfections – something the new generation of voice-based hardware will require.

Love me, love my robot

Emotional attachment to robots is not just for films. A study of the Roomba, the automatic floor-vacuuming robot famous from YouTube cat videos, shows that, unlike with other vacuum cleaners, people become very emotionally attached to their Roombas: giving them names, making custom clothes for them, and helping them do their job.

Providing your device with personality is a way of creating customer loyalty. While this was never the intention with the Roomba, people’s willingness to anthropomorphise their machines also means owners were willing to forgive the imperfections of early models. Adding the ability to scan Facebook and other social network profiles to build up a picture from “likes” that identifies your personality type will allow a machine to become a more personable device.

Alternatively, it might just be that by ignoring what people say and looking at what they do, the study’s personality algorithm was able to bypass the careful curation Facebook users usually apply to their profiles. People tend to share only positive statements about themselves, which makes it easy to feel that everyone else’s life is better than your own. It’s known that most users of Facebook experience negative emotions after using the site, and that the degree to which you think others are happier than you, or that life is unfair, is proportional to the amount of time you spend there.

By studying what people like and what they actually do on Facebook, it’s possible to arrive at a personality assessment based on data that has been less micro-managed, so the real person stands out from the noise. In essence, the study shows that machines can establish that you really are what you “like”.

Given that your personality has been shown to predict things like substance use or depression, should you be worried about a digital Sherlock watching you? Well, while the machine was more accurate than friends and even family at predicting depression, everyone’s scores were pretty low (wrong on three out of four occasions) – a step up perhaps, but from a low base.
