Monday, July 14, 2014

Contagion

I've said it before and I'll say it again: I am so very glad I deleted my Facebook account last August.

I've been mulling over the recent Facebook controversy.  In case you missed it, it has come out that, for one week in January 2012, Facebook manipulated the News Feeds of approximately 689,000 users, adjusting what they did or did not see.

The goal was to test--on a large scale--the theory of "emotional contagion."  The resulting article, "Experimental evidence of massive-scale emotional contagion through social networks," was published in Volume 111, Issue 24 of the Proceedings of the National Academy of Sciences (PNAS) on June 17, 2014.  (I'd get a different acronym for that journal.  Say it out loud: "PNAS."  See what I mean?)

The concept of "emotional contagion" can best be summed up with a single character: Saturday Night Live's "Debbie Downer."

The recurring sketch is essentially a comic take on the concept of "emotional contagion."  You're happy, Debbie Downer shows up and is... a downer... and pretty soon, you're not feeling so happy either.

You've probably heard some variation of the theory of emotional contagion applied to Facebook many times before.  It goes like this: people check Facebook when they're feeling blue and end up feeling even worse.  Several research studies have attempted to explore and explain the effects of Facebook on overall mood: Maria Konnikova's September 13, 2013 article in The New Yorker, "How Facebook Makes Us Unhappy," cites several prominent studies that explore the interconnections between Facebook use and mood (both positive and negative).

The latest concerns have been sparked by a study conducted in January 2012, when members of Facebook's "Core Data Science Team" and "News Feed Team," in conjunction with researchers at Cornell University, conducted an experiment to see if emotional contagion could be spread over a network, without direct interpersonal interaction.

Theories of emotional contagion have been hotly disputed, because it's extremely hard to divorce emotion from context.  So, doubters have suggested that, if someone tells you about their grandma's funeral, you may feel sad, not because you've been "infected" by that person's sadness, per se, but because their story reminds you of your own grandma's funeral.  The sadness you feel isn't "theirs," it's "your own," but it is sparked by your shared experience with another person.

In lab experiments designed to measure the existence and/or effects of emotional contagion, it is extremely hard to rule out the effect of "shared experiences"--or to predict when they might occur.

Similarly, it may be the interaction itself that influences your emotional state, not the precise emotion that the other person is feeling.  A happy person is usually fun to talk to and hang out with--and this can alter your own mood.  Debbie Downer brings everyone down, not so much because she's sad herself, but because her responses in conversations are always about things like death and disease and natural disasters.  Her interaction with others--her contribution to the conversation--shapes the overall mood.

Moreover, as the authors of the June 17 article in PNAS note, "To date ... there is no experimental evidence that emotions or moods are contagious in the absence of direct interaction between experiencer and target" (8788).  The assumption has been that emotional contagion can only occur when people interact directly.

The "benefit" of Facebook (for this experiment, that is), is that direct interpersonal interaction can be eliminated.  You're not chatting with someone or connecting with them in any way, you simply log into Facebook (heaven help you) and your News Feed pops up for you to see.

The News Feed is already filtered.  You don't see everything that's available; you see what one of Facebook's many, many algorithms decides you "want" to see, based on your overall Facebook behavior.

So, for one week in January, Facebook decided that they would test what happens when you see what they want you to see--not what they've decided you want to see based on what they see that you've seen.  (Got that?)
The experiment manipulated the extent to which people (N=689,003) were exposed to emotional expressions in their NewsFeed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion. (8788)
To achieve this manipulation, researchers conducted two parallel experiments: on the one hand, they reduced Facebook users' exposure to positive emotional content in their News Feed, and on the other hand, they reduced users' exposure to negative emotional content in their News Feed.
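
The paper doesn't include the filtering code itself, but the mechanics are straightforward to picture.  Here is a minimal, purely hypothetical sketch (in Python) of what "reducing exposure" to one emotional category in a feed might look like; the post structure, the sentiment labels, and the omission probability are all invented for illustration and are not Facebook's actual implementation.

```python
import random

# Hypothetical post representation: (text, sentiment), where sentiment is
# "positive", "negative", or "neutral".  These labels, the omission
# probability, and the function itself are illustrative assumptions only.
def filter_feed(posts, reduce="positive", omit_prob=0.5, seed=None):
    """Return a feed view with some posts of one emotional category withheld.

    posts     : list of (text, sentiment) tuples already ranked for the user
    reduce    : which emotional category to suppress ("positive" or "negative")
    omit_prob : chance that a matching post is dropped from this feed view
    """
    rng = random.Random(seed)
    filtered = []
    for text, sentiment in posts:
        if sentiment == reduce and rng.random() < omit_prob:
            continue  # withhold this post from the News Feed view only
        filtered.append((text, sentiment))
    return filtered

# The two parallel conditions described in the paper, roughly sketched:
feed = [("Great day at the beach!", "positive"),
        ("My grandmother's funeral is tomorrow.", "negative"),
        ("Posted a photo.", "neutral")]
less_positive = filter_feed(feed, reduce="positive", seed=1)
less_negative = filter_feed(feed, reduce="negative", seed=1)
```
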

This only affected the News Feed: if you were selected to participate in this study and you visited friends' walls or pages, you'd see their postings, positive or negative, even if they had been filtered out of your News Feed.

But of course, most people on Facebook have hundreds of friends.  So this is a bit of a non-issue: you typically rely on the News Feed precisely so you don't have to visit each friend's wall or page individually.

The researchers found that, when News Feed content was filtered in this way, it appeared that yes, there is evidence of "emotional contagion."
The results show emotional contagion. ...for people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks. (8789)
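
For what it's worth, the "percentage of words" outcome is just automated word counting over people's status updates.  The toy sketch below shows the general idea; the word lists are invented stand-ins for the much larger dictionary-based word counts the researchers actually relied on.

```python
# Toy version of the outcome measure: the percentage of positive and negative
# words in a set of status updates.  The word lists are made-up examples,
# not the dictionary used in the study.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def emotion_word_rates(status_updates):
    """Return (percent positive words, percent negative words) across updates."""
    words = [w.strip(".,!?").lower()
             for update in status_updates
             for w in update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# Example: roughly 25% positive words, 12.5% negative words.
print(emotion_word_rates(["What a great, happy day!", "Traffic was awful."]))
```
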
If you're thinking, "Well, I didn't know anything about this, so I guess I wasn't part of the study," not so fast, my Facebook friend.

You didn't know if you and your News Feed were part of the experiment, because Facebook didn't tell you.

Facebook didn't tell you, because they don't have to.

Users agree to let Facebook conduct psychological experiments like this one when they accept the terms of service.  And since Facebook is a private company, they are not bound by Institutional Review Board (IRB) standards, which exist to protect human subjects in research studies.

If you tried to conduct this kind of experiment in an academic setting, you'd have to get IRB approval.  You can't just experiment with people's emotions in order to see what happens, because such experiments have potential real-world consequences that people will have to live with, both in the short-term and in the long-term.

It's called "informed consent" or the "common rule," and it's a federal policy.  You need to know what you're getting into.  To know that, experimenters need to tell you and give you the chance to say, "No thanks."  Researchers are also required to ensure that there are safeguards in place so that you don't unwittingly suffer the effects of an experiment--this is what IRB approval is all about.

Facebook isn't required to do any of this.  They're a private company, and you agreed to use their services.  They told you they'd be collecting data and you gave them permission to use it as they see fit. It's all included under their "Data Use Policy."  Your emotions and your friends' emotions--as manifested in status updates and "likes" and other expressions on Facebook--are "research" and "data," as far as Facebook is concerned.

And your "opt-out" option was given to you up-front: don't sign up for Facebook.  If you do, you've given informed consent--or so they argue.

If you go to Facebook's Data Use Policy page and scroll down to "How we use the information we receive," the bullet points very clearly state that your information may be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

An "Editorial Expression of Concern" accompanies the article in PNAS.  As Editor-in-Chief Inder M. Verma notes, "the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

So, now you know.  How does that make you feel?

