A Contagion of Negativity: Why Facebook Made Us Sad

Image: "Facebook's Infection" [CC] Katie Sayer

So it turns out that, yes, we are emotionally affected by the streams of social information we consume. Reading a steady stream of happy stories makes us happy, and — in a very small way — nudges us to share and comment on posts more positively. And streams of sad stories? Spoiler alert: they make us sad. But it’s Facebook’s manipulation of our news feeds that makes us very angry, indeed.

That’s the not-exactly-shocking revelation of a study carried out by Facebook researchers and their academic partners, published last week. But you probably already knew that because, as it turns out, when people on the Internet discover that they’ve been unwitting participants in a massive A/B test, they collectively lose their minds.
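
For anyone unfamiliar with the jargon: an A/B test quietly splits users into groups and serves each group a slightly different experience. A minimal sketch of how that assignment is commonly done (hashing a user ID so the split stays stable without storing any state) might look like this; the function name and arm labels are my own illustration, not anything Facebook has published.

```python
# Hypothetical sketch of deterministic A/B bucketing. Hashing the user ID
# together with the experiment name yields a stable assignment across
# sessions, with no per-user state to store.
import hashlib

def assign_bucket(user_id: str, experiment: str, arms: list[str]) -> str:
    """Map a user to one experiment arm, deterministically."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# Example arms loosely echoing the study's design (labels are mine).
print(assign_bucket("user-12345", "emotion-contagion",
                    ["control", "fewer-positive-posts", "fewer-negative-posts"]))
```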

Creepy? Yes. Oh, absolutely. It already chafes that Facebook chooses which of our friends’ posts we do or don’t see based on some black-box algorithm, right? Though, of course, without squelching *something*, most users’ feeds would soon be overwhelmed by the sheer volume of sharing, liking, commenting, and sponsored posts for Keurig coffee brewers. (Maybe that last one is just me — an occupational hazard.)
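
To make concrete what manipulating a news feed can mean in practice, here’s a hypothetical sketch of the kind of filtering the study describes: probabilistically dropping posts of one emotional polarity before the feed is rendered. The Post class and the toy word-list classifier are my own stand-ins; the published study counted words with the LIWC software, and whatever Facebook actually runs is vastly more elaborate.

```python
# Hypothetical sketch of sentiment-based feed suppression, for illustration
# only. The word lists and class names are invented stand-ins.
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def polarity(post: Post) -> str:
    """Toy classifier: label a post by which word list it matches."""
    words = set(post.text.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(posts: list[Post], suppress: str, drop_prob: float = 0.5) -> list[Post]:
    """Omit a random fraction of posts matching the suppressed polarity."""
    return [p for p in posts
            if polarity(p) != suppress or random.random() >= drop_prob]

feed = [Post("ann", "what a wonderful day"),
        Post("bob", "this traffic is awful"),
        Post("cam", "lunch was fine")]
print([p.author for p in filter_feed(feed, suppress="positive")])
```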

Unethical? Well, there’s certainly been a vigorous debate (just about everywhere, it seems, except Facebook), and for those keeping score it’s probably dead even, save that many of those who argue Facebook’s experiment is unethical do so rather more loudly and stridently than their counterparts. (For a thoughtful and informed point of view on the ethics of such experiments, try this post from Science Based Medicine.)

Me, I’m not an ethicist, and talk of Institutional Review Boards and approval committees and informed consent makes me itch. Instead, for me it boils down to one simple thing: Facebook deliberately caused people — their users — emotional harm. That wasn’t the objective of the study, but it was nonetheless the method. And, let’s be honest… some folks are already on a pretty short emotional tether. Deliberately messing with people for the express purpose of learning whether a negative stream of stories will bum them out enough to walk away and see fewer of your ads is… well, it’s pretty awful.

I won’t be surprised to learn this is merely a drop in the bucket of Facebook’s test suite. In fact, I’d suggest they’re *constantly* testing their news streams, their content-driven ads, their personalized feeds. It’s what content marketers *do*. But the objective — always — is to make the experience better: more relevant, more rewarding, more useful and usable, more timely, more effective. Sell stuff? Sure. Make an emotional connection with a brand? You betcha. Make you unhappy? Deliberately mess with your emotional well-being? Hell, no.

Facebook acted badly. And yet another non-apology apology isn’t enough.