Word travels quickly in the Facebook community, so it came as no shock when users discovered that a study had been published earlier in the month in an academic journal. As part of the study, two university researchers adjusted 689,003 users’ News Feeds toward positive or negative content to see whether it would affect their mood. So can Facebook News Feeds truly affect the way you feel? After this study, it appears that social media updates may alter your mood more than you think.
Facebook has justified its reasoning for conducting such a study, saying that those who sign up for a Facebook account agree to its “Data Use Policy”. If you scroll through that policy, you will find a line stating that your information will be used for, “… research and service improvement…Your trust is important to us, which is why we don’t share information we receive about you with others unless we have: received your permission…(read more here).”
So obviously that gives them our permission to volunteer us for an academic study, right? WRONG! There is an underlying issue here: is this an ethical way to conduct a study or not? Especially since the study found that people experienced a “withdrawal effect” in their mood. Facebook is definitely teetering on the line between right and wrong.
Many people are outraged by Facebook’s actions, while others are not shocked at all by the feed manipulation. There is a fine line that shouldn’t be crossed when it comes to user privacy. “A source familiar with the matter says the study was approved through an internal review process at Facebook, not through a university Institutional Review Board.” When Facebook was asked about its review process for the 2012 study, one of its spokespeople stated, “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
Okay, that all sounds great…but if this is supposed to be an academic study, then why is the research clause (shown in the photo above) listed under the “research to improve our services” category and not under a “research for academic purposes” category? Who knows. Maybe Facebook needs to adjust the way it explains its policy to users, instead of leaving people feeling confused and violated. Users should always be able to choose whether or not they want to be involved in any type of study.
Adam Kramer, who helped run the study, stated, “[W]e care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid Facebook.” Apparently, Facebook doesn’t see the ethical issue in not being upfront with users about how their information will be used. Maybe it should start caring more about users’ privacy rights? Just a thought. It seems Facebook is more focused on whether or not people will be “lead…to avoid Facebook.” But perhaps it doesn’t understand that when people continually feel violated, they will eventually seek an outlet they can trust.
Do you think there is an ethical issue here? Please share your thoughts with us on any of our social media outlets. We would love to hear from you!
(Source)