If you think that Facebook is just a great way to keep in touch with your friends and family, and that anybody who moans about privacy violations should simply log off, then think again. Facebook has revealed that back in January 2012 it carried out a massive secret psychology experiment on over 600,000 users over the course of a week, all without their knowledge.
The experiment was designed to find out how Facebook users responded to positive or negative messages, but how could the social media group possibly control what friends and family posted to people's news feeds?
Well, it couldn't control the actual postings, but it could filter each person's feed to emphasize either a positive or negative mood, and this altered version is what then appeared in their news feed. The user's response was then monitored and data collected to see whether the positive or negative tone had any impact on the person reading it.
The results were published in a paper in the Proceedings of the National Academy of Sciences of the United States of America (PNAS) by the team of Facebook scientists, who concluded that:
“The results show emotional contagion. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
So the experiment showed that we are more likely to feel positive if we view a positive status and negative if we see a negative one. But the study did not take into account feelings of jealousy, low self-esteem, or high self-confidence and arrogance. How would a jealous person feel about a status that positively reinforced another's good fortune? Would a user with low self-esteem really feel sorry about someone else's bad news?
More important for many people, however, is not the interpretation of the results but the underhanded way in which Facebook obtained them. How was the company allowed to conduct such a massive experiment, in which over 600,000 people took part without their knowledge? Well, when you sign up and tick the terms and conditions box at the start, you give Facebook the right to conduct 'data analysis, testing, research and service improvement.'
So we know Facebook had a legal right, but was it ethical? As with all experiments, don't the participants have a right to know why they were involved, and shouldn't they receive a proper debriefing at the end? Online magazine The Atlantic wanted answers and emailed the Facebook team. Here is their response:
“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
The team at Facebook insists that no personal information was disclosed and that the positive and negative posts were selected automatically by software, so no human ever viewed personal data. Even so, millions of people use Facebook daily, and Facebook and other social media sites now know that people can be manipulated through their very own news feeds. Is it time to really log off and live in the real world?