Should Facebook have experimented on 689,000 users and tried to make them sad?


In a move that has as many people puzzled as outraged, Facebook has published research that involved a deliberate attempt to manipulate the emotional state of 689,000 of its users. Experimenters from Facebook, Cornell University and the University of California, San Francisco conducted experiments over a one-week period in January 2012 in which they manipulated the contents of users' News Feeds, screening out posts that had emotional content. The results of the study have recently been published in the Proceedings of the National Academy of Sciences of the USA.

In the experiment, users were split into three groups and posts containing either positive or negative words were screened from the users' News Feeds. One of the groups acted as a control and had random posts screened from their feeds. The researchers then counted the percentage of emotion words that the test subjects used in their own posts.
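
To make that methodology concrete, the outcome measure amounts to a word count. The sketch below is a minimal illustration of such a measure; the word lists are tiny invented stand-ins (the study itself reportedly used the much larger LIWC word categories), so treat it as an illustration rather than the researchers' actual code.

```python
# Sketch of the study's outcome measure: the percentage of emotion words
# appearing in a user's own posts. POSITIVE and NEGATIVE are invented
# stand-ins; the real study reportedly used the LIWC word categories.

POSITIVE = {"happy", "great", "love", "awesome", "glad"}
NEGATIVE = {"sad", "angry", "hate", "awful", "terrible"}

def emotion_word_rates(posts):
    """Return (positive %, negative %) across all words in a user's posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# Two short status updates: 12 words in total, one positive ("great"),
# one negative ("sad").
posts = ["Had a great day at the beach!", "Feeling sad about the news."]
print(emotion_word_rates(posts))  # -> (8.33..., 8.33...)
```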

The results showed a very small but statistically significant effect. People who had fewer positive posts shown to them reduced their own use of positive words by 0.1% and increased their use of negative words by 0.04%. Conversely, people who had fewer negative posts shown to them increased their use of positive words by 0.06% and decreased their use of negative words by 0.07%.

The emotional responses shown by the unwitting participants are nothing compared to the reaction the study has provoked: a sense that Facebook, as a private company, has taken another step too far in the use of its network, creating mistrust and resentment in its user community.

Although the experiment may not have breached any of Facebook's user agreements, it is clear that informed consent was not obtained from the participants of the research. The study itself allegedly received approval from the Institutional Review Boards at the researchers' universities. According to the article's editor, Susan Fiske, this was given on the basis that "Facebook apparently manipulates people's News Feeds all of the time".

Professor Fiske, a psychologist at Princeton University who reviewed the paper, said that she was "creeped out" by the nature of the research. Despite this, she believed that the regulations had been followed and that there wasn't any reason the paper should not be published.

The ethics of good research

We don’t know the full nature of the ethical clearance that was given to the researchers from their respective universities and so it is hard to comment fully on the nature of the approval they were given for the research to go ahead. If this was indeed on the basis of Facebook’s agreement with its users, then it would be fair to say that this was a very liberal interpretation of informed consent.

Facebook's Data Use Policy says only that it has the right to use information it receives for research. It does not make explicit that this could involve actually carrying out experiments designed to manipulate emotions in its customers, especially not negative ones.

Federal US guidelines on human research, set out in the "Common Rule", are quite clear about what is and isn't acceptable in this type of research. They detail how informed consent must be obtained and what information participants must be given, including the risks and benefits of being involved. Participants must also be allowed to opt out of the research. Although Institutional Review Boards are required only for organisations conducting research funded by or on behalf of the US Government, private companies are also signatories to the regulations.

The fact that the researchers and Facebook did not ask for consent suggests that they knew that there would be a backlash when it became public and that it would be easier to deal with this after the fact.

Right now, the researchers involved are not allowed to answer questions about the research; queries are being handled by Facebook itself.

What did the research itself prove?

It is not at all clear that the research actually says very much about the transfer of emotional states via "emotional contagion", as it claimed. The frequency of emotion words in very short status updates is clearly not a measure of the overall emotional state of the writer.

Even if it were, the experiment found differences of one word in a thousand in the number of emotional words used between the experimental and control groups. Remember that this is the number of positive or negative words used, not the total number of words written. At the level of the individual, these differences are meaningless and hardly a demonstration of "emotional contagion".
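
As a back-of-the-envelope check on what that means for a single user (the 22-word average status length below is an assumption for illustration, not a figure from the paper):

```python
# How much a 0.1 percentage-point drop in positive-word use amounts to
# for one person. The average status length is an assumed, illustrative figure.
words_per_update = 22        # assumption: a typical short status update
effect = 0.001               # the reported 0.1% shift in emotion-word share

words_shifted_per_update = words_per_update * effect   # ~0.022 words
updates_per_changed_word = 1 / words_shifted_per_update
print(f"~{updates_per_changed_word:.0f} updates before a single word changes")  # ~45
```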

Big Data brings with it the naive assumption that more data is better when it comes to statistical analysis. The problem, however, is that scale introduces all sorts of anomalies, especially when dealing with extremely small differences in a single measure.
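
A minimal simulation shows why this happens, using made-up word counts of roughly the study's order of magnitude: with tens of millions of words in each group, even a 0.1 percentage-point gap produces an astronomically small p-value, despite the effect being trivial.

```python
# With enough data, a negligible difference becomes "statistically significant".
# Two-proportion z-test on invented counts at roughly Facebook scale.
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic and two-sided p-value for a difference of proportions."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p

# Illustrative only: 5.0% vs 4.9% positive words, 50 million words per arm.
z, p = two_prop_z(0.050, 50_000_000, 0.049, 50_000_000)
print(f"z = {z:.1f}, p = {p:.1e}")  # a huge z and a vanishingly small p
```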

There may yet be another twist to this story. Given that it would be particularly strange that a prestigious journal would publish what seems to be quite weak research, perhaps this is all part of a bigger experiment to see how society reacts, especially on Facebook, to the idea that Facebook believes that its customers are actually just test subjects to be examined at will.

Join the conversation


  1. Dave Bradley


    The odd rock and roll group has been sued as being responsible for the suicide of vulnerable people, but I guess Facebook has a clause in the terms and conditions where you accept responsibility for your own doom. I mean, you give away your whole life to them anyway, so what's a little bit extra?
    Julian Assange and Edward Snowden just used the wrong business model. But then I guess they both kind of have an aversion to hiding stuff in the fine print. Bugger, moral principles catch you every time.

  2. Diogo Marzo


    So, it looks like they're honing their skills in mass manipulation. Now they can attach those results to purchasing tendencies: emotional eater, compulsive gambler, impulse buyer? Ah, the "free" market at its best. Ethics? What's that?

  3. Paul Hampton-Smith


    It seems to me to be an entirely unsurprising piece of research. First, that Facebook did the research in the first place: their whole business model is based on manipulating user behaviour. Second, that it achieved the result it did. When faced with sad or happy information, people would naturally respond in kind.

  4. catherine mcdonald


    Truly. An outrageous abuse of research ethics. AND it tells us nothing more than we already know. It is possible (easy in fact) to manipulate an audience's emotional states.

  5. Chris Booker

    Research scientist

    Having just had a look over the PNAS paper, this is pretty weak. They're stating that people exposed to less 'positive' content use fewer 'positive' words in their comments and posts. Well duh! This is hardly emotional manipulation. For example, a friend of mine posted the other day that her mother was in a hospice dying of cancer. What would my response be? Awesome? Great news!? What they seem to be observing is simply behaviour appropriate to the context. And as you also point out, just measuring the relative quantities of 'positive' vs 'negative' words in someone's posts can't be used as a measure of their emotional state.

    This is one of those papers that really irks me as a scientist... poorly done research, but yet it's already getting big news coverage. This is sadly a well-worn path...

    1. Peter Anderson-Stewart

      Medical scientist

      In reply to Chris Booker

      Good old PNAS (Practically Nearly Almost Science ... as we often referred to it in one of my institutes) often lets stuff through based not necessarily on good science but on who knows whom on the editorial board ...

      The system really needs to get rid of the journal impact factor method and spend more time on the assessment of the individual papers regardless of which journal they are published in ...

  6. Susan Nolan

    retired

    For my money, facebook doesn't have to try to make me sad.

    It does that already by splattering my timeline with unwanted ads whilst simultaneously not putting some posts (by some strange decision-making process of their own) from my "liked" pages onto my timeline.

    And I'm none too happy about their thinking that they can make me a subject in their so-called research - or is that experiment?

    And what's this mucking about with my friends' posts so that I don't see what my friends wanted me to see? They didn't tell me I had to go to my friend's wall to see their post appearing as they intended it to.

    I just wish I could transfer some of this emotion of mine - preferably contagiously - to facebook. Let them suffer every time they make me suffer!

    1. David Glance

      Director of Innovation, Faculty of Arts, Director of Centre for Software Practice at University of Western Australia

      In reply to Susan Nolan

      I am going to follow up with an article that shows ways in which you can remove ads and control your news feed on Facebook - stay tuned

    2. Graham Gower

      ex engineer, evol biology student

      In reply to David Glance

      Everyone knows how to use Firefox and install AdBlock. How about some ideas for blocking ads on mobile devices?

  7. Robert Tony Brklje

    retired

    “Facebook apparently manipulates people’s News Feeds all of the time” I have got to say that was a real shocker. It will be interesting to find out what the basis is, what Facebook's policies around manipulating people's News Feeds are, and why those researchers didn't publicly mention that when they commenced their research. I am sure a lot of people would be interested in that.
