
Is Facebook toying with your emotions?

In an experiment whose results it published this year, Facebook manipulated the news feeds of nearly 700,000 of its (unknowing) users, reducing the amount of positive content they saw to find out whether the posts they wrote would become more negative.

The researchers believe that it did: the mood of the posts shown in subjects’ news feeds spread like a “contagion” into the posts those subjects wrote.

Goes both ways

The inverse was also true, the researchers say: when negative content was reduced, users’ posts became more positive.

It’s surprising that Facebook would mess with the moods of its users, who, allow me to remind you, are its bread and butter. Exposing users to the advertisements of Facebook’s partners, on and off Facebook, is the social media giant’s only real business.

It’s even more surprising that the company would publish the results. The paper appeared in the Proceedings of the National Academy of Sciences (PNAS).

From the paper: “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

Is it valid?

But PNAS has doubts about the validity of the findings. It’s unclear, the journal says, whether the negative and positive posts from research subjects were caused solely by the manipulation of the news feed and not by negative interactions with other users.

Facebook data scientist Adam Kramer led the research. Here’s what it says on Kramer’s American Psychological Association page: “D.I. Kramer, PhD, has an enviable subject pool: the world’s roughly 500 million [now well more than a billion] Facebook users.”

The other researchers were Cornell University professor Jeff Hancock and UCSF post-doctoral fellow Jamie Guillory.

The paper says that users provided tacit consent to be used in research studies when they signed up for Facebook and agreed to Facebook’s Data Use Policy.

Does Facebook care about you?

So no legal exposure for Facebook, but definitely some more bad vibes from a company that has demonstrated over and over that the needs of its advertising business always trump the needs of its users.

This experiment takes Facebook’s disregard to another level: the company actively set out to affect the emotional well-being of its users.

Facebook already has plenty of ways to make people unhappy, from its humblebrags to its envy-inducing profiles. Last year a University of Michigan study told us that Facebook makes many young people depressed. Another study, published by Berlin’s Humboldt University, reported that Facebook often fills users with feelings of envy.

And what is the point of this research? Why is it being conducted? Is it purely an academic exercise, or could it be used by some unscrupulous party to mess with people’s feeds and moods on a regular basis?

Facebook should immediately disclose the names of the users whose feeds it manipulated.
