When did you last read Facebook’s terms of service?
Despite depositing huge amounts of personal information on the social network, most of us have never given the user agreement more than a cursory glance.
That may explain why so many people feel particularly alarmed by the revelation that researchers from Cornell, the University of California and Facebook experimented with the news feeds of almost 700,000 people back in January 2012 to explore the idea of “emotional contagion”: whether being exposed to specific emotions invokes the same mood in readers.
If you see more negative updates, do your contributions become more mopey in turn?
The paper, published in the March issue of the Proceedings of the National Academy of Sciences, a respected US journal, explains that the experiment hid a “small percentage” of emotional words from the news feeds of 689,003 English-speaking Facebook users over the course of a single week. The results suggested that people’s emotions are reinforced by what they read, providing evidence for “emotional contagion” through the written word.
Although Facebook’s algorithms constantly prune what we see in our news feeds, the idea that researchers poked around in which posts were presented to users feels very different. While we know that Facebook owns the roads, we kid ourselves that, despite the user agreement, we are the ones who control the traffic.
Despite Facebook’s high-flown rhetoric about connecting the world, it’s ultimately in the business of selling advertising, and advertising hinges on manipulating our emotions.
A study of “emotional contagion” on the social network was always going to be seen as more than intellectual enquiry.
While academics wrestle with the question of whether exploiting Facebook’s data use policy can really be deemed informed consent, users will just add another tick to the column marked “creepy”.
Defining Facebook as a platform gives the impression that it is a neutral conduit for information, but it’s far from it. Along with Google and Twitter, Facebook relies almost entirely on data-driven advertising dollars.
As long as the data is anonymised, Facebook’s policy allows it to do practically anything with the information you share through its service.
With the coming wave of wearables – Facebook’s already snapped up the iOS motion tracking app Moves – even more data is going to flow onto its servers, allowing a fuller picture of our physical and mental states to be sold to advertisers.
Giving something back
In an ideal world, Facebook would share some of its wealth with the users whose data makes it compelling to advertisers. There would be a social networking equivalent of Google AdSense, sharing a small percentage of the cash with those willing to fork over more of their information to be picked through.
But Facebook would never do that as its future growth rests on getting more data and users remaining sanguine about sharing it. If advertising is the elephant in the room, it has to remain covered with a great big sheet.
That the social networking giants don’t allow individual users to earn anything from their own data does, though, leave an opportunity for new platforms.
Take Line, the fast-growing messaging app, which recently added the ability for users to create and sell the virtual stickers that are massively popular on its platform. Giving customers an extra stake in a service could be a very smart way of making them more comfortable with how their data is used.