Facebook Violates User Privacy with Psychology Experiments to Increase Revenues


We already know that Facebook tracks our activity on the social network, as stated in its data use policy. This Big Brother tendency has generated considerable controversy in recent years, with critics claiming that the social networking site overstepped its boundaries. Now, however, Facebook appears to have taken the next step in its efforts to become a human research lab: experimenting with manipulating its users' emotions.

For a week in January 2012, Facebook data scientists manipulated the newsfeeds of over 689,000 users, removing either positive posts or negative posts to see how the change affected their moods. So if there was a week when you were only seeing videos of cute kittens or articles about disasters around the world, you may have been a guinea pig in the study. Now that the experiment has been revealed, the public's mood is unanimously "disturbed."

Adam Kramer, the lead researcher on the experiment, found that users' emotions were affected by emotionally charged content. "When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred," the paper states. "These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

According to the paper, the experiment ran from January 11th to January 18th, 2012. Hundreds of thousands of Facebook users participated unknowingly, since they had already agreed to the site's data use policy.

The researchers also observed that when they removed all emotionally charged posts from a user's newsfeed, that person posted less and became "less expressive." One can only imagine what Facebook might do with this information – perhaps flood your newsfeed with your friends' most emotional posts if it decides you aren't posting enough.

A statement released by Facebook on June 28th shows the company's indifference to the public's reaction to the experiment. The statement does not address the ethics of emotional manipulation, or whether the site's Terms of Service agreement constitutes legitimate informed consent for a study like this. A spokesperson said, "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

A better response from Facebook would be a separate consent process for users willing to participate in such studies – say, a checkbox indicating that you agree to be subjected to the random psychological experiments Facebook's data researchers conduct in the name of science.
