Facebook crosses an ethical line

It has just been made public that in January of 2012 Facebook intentionally manipulated the news feeds of almost 700,000 Facebook users. Facebook skewed feeds toward either predominantly positive/happy news or predominantly negative/sad news, then studied how these manipulated users responded. This action by Facebook is highly unethical, and, in my opinion, only a few steps removed from the grossly unethical Tuskegee syphilis experiment.

Please hear this: It is certainly not my intent to diminish the seriousness of the Tuskegee case and the negligence shown to the 600 African American men in that study. Rather, we can never forget the men in the Tuskegee case and how easily we could return to that place.

A few points:

First, Facebook manipulates news feeds all the time; that is not new information. It is part of targeted advertising on the social media platform. It is annoying but not unethical. Second, researchers frequently study social media content; that is also not new.

What is new is that Facebook intentionally tampered with the content of news feeds in an attempt to manipulate the emotions and/or reactions of these 700,000 users without their consent.

When we create a Facebook account we agree to a number of stipulations, including data analysis and research. As such, Facebook had the legal right to conduct this study. But it was carried out in an unethical manner.

As a result of the infamous Tuskegee experiment, Institutional Review Boards (IRBs) were established to ensure the protection of human participants. Academic researchers (like me) are required to submit all research materials involving human participants to these review boards for approval prior to any data collection. As a private company, Facebook isn't required to undergo the same review process, but that still does not give it the ethical right to manipulate human participants.

A primary requirement of the IRB is that participants give informed consent: they knowingly agree to participate in a study, they understand what the risks may be, and they know they have the right to leave the experiment at any time. This helps to ensure human protection. In addition, while research may potentially involve risks to human participants, the IRB helps to ensure that any potential risks are outweighed by the potential benefits. Facebook's participants never gave informed consent. They did not know they were being manipulated, they did not know they faced potential risks, and they were not given the option to quit the experiment.

In addition, IRB standards require that human participants be "debriefed" following participation in a research study: participants are given additional information and the opportunity to ask questions and express any concerns. Debriefing is another critical step in ensuring the health and safety of research participants. Facebook's participants were not debriefed.

While it might seem outrageous to compare the manipulation of human emotions by a social media platform to the withholding of medical care by the United States government, we must realize that manipulating people's feelings and emotions is incredibly risky, especially in an age when mental health issues run rampant yet are virtually ignored (or shunned) by society and the medical community. Mental health issues are linked to depression and to alcohol and drug abuse/addiction, as well as to less discussed compulsions such as gambling, disordered eating, or cutting. Unstable mental health and addiction can lead to depression and, ultimately, to violence against self and others. This is especially concerning given that many Facebook users are highly impressionable teens and young adults.

A final critical point: in academic research our goal is the advancement of knowledge for the greater good. Academic researchers follow a strict protocol to ensure that research involving human participants is carried out in an ethical way. In academic research, we strive to minimize any potential risk.

What is Facebook's goal? Likely not the advancement of knowledge, but the advancement of its bottom line. Here, the potential benefits do not outweigh the potential risks.

For more information, see: http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

