
Facebook's Ethically Suspect Psychological Research


quoththeraven


Facebook and some outside scholars conducted research on contagious emotions by manipulating Facebook feeds without obtaining informed consent. Some people received only neutral or positive content in their feeds; others received only neutral or negative content. Their posts were then monitored for indicia of positive or negative emotions to test if emotions are contagious.
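For anyone wondering how "indicia of positive or negative emotions" gets measured in practice, the coding is reportedly little more than counting emotion words in each post. Here is a minimal sketch in Python of that kind of word-count scoring; the word lists and the classify_post helper are my own illustrative assumptions, not the researchers' actual lexicon or code.

```python
# Purely illustrative sketch of word-count-based emotion coding.
# The word lists and classify_post are assumptions made up for illustration;
# they are not the researchers' actual lexicon or code.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "lonely"}

def classify_post(text):
    """Label a post 'positive', 'negative', or 'neutral' by counting emotion words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# Posts written during the experiment would be scored like this, then compared
# in aggregate between the groups whose feeds were skewed positive or negative.
print(classify_post("Feeling lonely and sad today"))      # -> negative
print(classify_post("What a wonderful, happy weekend!"))  # -> positive
```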

 

Facebook apparently relied on its TOS, which permits use of information received about users for internal operations, including research. But Facebook wasn't merely using information about users for internal operations here. It was manipulating users' feeds to see whether the manipulation had an effect on users' posts and thus, presumably, on their emotions. It's also not clear whether Facebook users under the age of 18 were filtered out of the sample pool.

 

It's also not clear to what extent the Common Rule, which governs federally funded research involving human subjects, applies here. But most institutions treat the Common Rule as a set of best practices that apply irrespective of funding source.

 

The editor of the Proceedings of the National Academy of Sciences article in which the study was reported has stated that there was IRB (Institutional Review Board) approval, but the reasoning given (that Facebook manipulates feeds all the time for marketing purposes, so this is no big deal) suggests that the IRB either didn't understand the nature of the research or wasn't fully informed of it. It also appears that the academics involved deliberately isolated themselves from the data collection in order to avoid a more searching IRB review.

 

Even if the research required unsuspecting subjects (which may be true here), the researchers didn't conduct the debriefing that is required afterward, nor did they give subjects the required option of withdrawing their data.

 

This doesn't mean that the research itself is worthless or purposeless, just that it may well not have been done ethically.

 

Bottom line for Facebookers (of which I am not one): You may be unwitting guinea pigs for psychological experiments.

 

Described in more detail here with links and updates.

 

http://vacuousminx.wordpress.com/2014/06/28/facebooks-indefensible-unscholarly-research/

 

It may not be obvious on the face of it, but the person who posted this link, and whose blog it is, teaches sociology at the university level.


A more wide-ranging Wall Street Journal article on the same topic, titled "Facebook Experiments Had Few Limits":

 

http://online.wsj.com/articles/facebook-experiments-had-few-limits-1404344378

 

Excerpts:

 

Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real.

 

In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures. In the end, no users lost access permanently.

 

The experiment was the work of Facebook's Data Science team, a group of about three dozen researchers with unique access to one of the world's richest data troves: the movements, musings and emotions of Facebook's 1.3 billion users.

 

The little-known group was thrust into the spotlight this week by reports about a 2012 experiment in which the news feeds of nearly 700,000 Facebook users were manipulated to show more positive or negative posts. The study found that users who saw more positive content were more likely to write positive posts, and vice versa....

 

Until recently, the Data Science group operated with few boundaries, according to a former member of the team and outside researchers. At a university, researchers likely would have been required to obtain consent from participants in such a study. But Facebook relied on users' agreement to its Terms of Service, which at the time said data could be used to improve Facebook's products. Those terms now say that user data may be used for research.

 

"There's no review process, per se," said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. "Anyone on that team could run a test," Mr. Ledvina said. "They're always trying to alter peoples' behavior."

 

He recalled a minor experiment in which he and a product manager ran a test without telling anyone else at the company. Tests were run so often, he said, that some data scientists worried that the same users, who were anonymous, might be used in more than one experiment, tainting the results.

 

It's not just Facebook, either, although given the kind of content on Facebook and how its feed works compared with content elsewhere, its research has greater potential for manipulation and creepiness.

 

Much of Facebook's research is less controversial than the emotions study, testing features that will prompt users to spend more time on the network and click on more ads. Other Internet companies, including Yahoo Inc., Microsoft Corp., Twitter Inc. and Google Inc., conduct research on their users and their data.

 

The recent ruckus is "a glimpse into a wide-ranging practice," said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies "really do see users as a willing experimental test bed" to be used at the companies' discretion.

 

As for Facebook's supposed internal review (also mentioned in the article), I don't know what good that's supposed to do. I wouldn't trust an internal review of the ethics of social research at a for-profit company any further than I could pick up and throw the researchers. (Meaning: not at all.) This is a way around the IRB (institutional review board) review required of research on human subjects in academia, something that those who are required to go through such protocols (and the attendant paperwork) find maddening.

 

More reporting from Kashmir Hill of Forbes:

 

http://www.forbes.com/sites/kashmirhill/2014/06/29/facebook-doesnt-understand-the-fuss-about-its-emotion-manipulation-study/

 

She notes that "one usable takeaway in the study was that taking all emotional content out of a person's feed caused a 'withdrawal effect.'"

 

Also, I've since discovered that the author of the post I linked to in the previous entry is a professor of political science, not sociology, FWIW.


I'd love to wax philosophical about privacy, but I'll be honest. My first reaction is envy.

 

The contagion of emotions is not exactly a new idea. A million people have thought of this. But these guys had the right IRB and the right connections, which got them the gold dust data and a publication in the Proceedings of the National Academy of Sciences.

 

As soon as I saw this, I knew one of the authors was affiliated with a medical school, since their IRBs are notoriously looser than others. And giving the Facebook guy a first authorship was a smart move. I can't help it: Meow, meow, meow.


If you're dumb enough to be on Facebook you deserve whatever comes your way. After all, you're giving them all your personal information. Do people not read the TOS?

 

I understand that many people feel the same way about Facebook. I do not post anything on Facebook that I do not want to be public knowledge. Above all, Facebook allows me to keep in touch with many people from all over the country and the world whom I do not call or write often.

 

As to Mr. Miniver, I do not think this is yet another instance of his constant negativity; many people agree with him. I do not.


If you're dumb enough to be on Facebook you deserve whatever comes your way. After all, you're giving them all your personal information. Do people not read the TOS?

 

I don't disagree that one can avoid this by avoiding Facebook. This kind of thing is part of the reason I'm not on it, for example. But in many cases, people do not read the TOS; after all, it's not as if they can negotiate different terms. Even reading the TOS wouldn't alert a reasonable reader to the possibility of this type of research. Facebook's current TOS permits use of user data for internal operations, including research. But manipulating feeds goes beyond the use of an account holder's data, and it was also arguably not research for internal operations, although there are some potential internal applications. I've also seen suggestions that Facebook's TOS didn't even mention research at the time the research at issue was conducted. [Edit: It's not just a suggestion. Facebook's TOS didn't mention research at the time the study was done, so the study was a violation of the TOS on Facebook's part.]

 

Interestingly, even though Facebook has rightly been dinged for constantly changing privacy settings, leading users to expose data more widely than they intended, and for other privacy violations, no one is asserting a privacy violation here. Facebook rigorously anonymized the data and analyzed it in the aggregate. The question is whether, and to what extent, the well-being of those whose feeds contained negative content (apparently defined as text with negative words) for the duration of the experiment was affected. Kashmir Hill of Forbes suggests that Facebook institute a procedure for users to opt out of such research. That seems like a reasonable alternative.
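For what it's worth, an opt-out of the kind Hill proposes would be straightforward in principle: check a user preference before anyone is assigned to an experimental condition. Here is a minimal sketch in Python; the research_opt_out field and both helper functions are my own assumptions for illustration, not anything Facebook has described.

```python
# Illustrative sketch only: one way an opt-out could work, along the lines of
# Hill's suggestion. The research_opt_out field and both helpers are assumptions
# made up here, not anything Facebook has actually described.

import random
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    user_id: int
    research_opt_out: bool = False  # a preference the user could set in account settings

def eligible_for_experiment(user: User) -> bool:
    """Exclude anyone who has opted out, before random assignment happens."""
    return not user.research_opt_out

def assign_condition(user: User) -> Optional[str]:
    """Randomly assign eligible users to a feed condition; opted-out users get none."""
    if not eligible_for_experiment(user):
        return None
    return random.choice(["positive_reduced", "negative_reduced", "control"])

users = [User(1), User(2, research_opt_out=True), User(3)]
print([assign_condition(u) for u in users])  # the opted-out user comes back as None
```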


I'd love to wax philosophical about privacy, but I'll be honest. My first reaction is envy.

 

The contagion of emotions is not exactly a new idea. A million people have thought of this. But these guys had the right IRB and the right connections, which got them the gold dust data and a publication in the Proceedings of the National Academy of Sciences.

 

As soon as I saw this, I knew one of the authors was affiliated with a medical school, since their IRBs are notoriously looser than others. And giving the Facebook guy a first authorship was a smart move. I can't help it: Meow, meow, meow.

 

There was no IRB and no IRB review, and it was a professor of communication and information science at Cornell and a PhD candidate in that program who were involved, not Cornell Medical School. (It took several days for all the relevant information to come out, although in part that was because the story broke on a Friday.) It's not clear, but there may not have been an internal review at Facebook, either. See this additional post by Vacuous Minx (Sunita, the poli sci professor) and also the Forbes article above:

 

http://vacuousminx.wordpress.com/2014/06/30/facebooks-emotion-study-mess-summarized-still-awful/

 

I agree that it's interesting research, but it should not have taken place the way it did (for one thing, it wasn't covered by the TOS at the time, not even arguably), and Cornell should not have declined to conduct an IRB review. Even though the Cornell affiliates didn't collect the data, Cornell acknowledges that they participated in preliminary discussions of study design. (My translation: they helped guide the study design so as to render the results publishable.) That alone should have mandated an IRB review, particularly since it's clear the Cornell folks were involved in order to pave the way for publication.

 

I'm also puzzled by the Facebook researcher's initial hypothesis that positive emotional content in one's feed would lead to negative emotional states, as measured by user output (whether that's a good or accurate measure is itself debatable). I would have hypothesized exactly what the study purports to show: output matches input.

 

Just FYI: while I agree with nearly the whole of Sunita's post, I disagree with some of the prior experiments she lists as ethical lapses. The tearoom trade research could not have been conducted with participant consent, either before or after, and its policy impact (which led some police departments to conclude that it wasn't necessary to monitor sexual activity among men in public restrooms because it wasn't predatory) may have outweighed its ethical challenges. The problem with the Stanford prison experiment had more to do with how it was conducted (and the fact that Zimbardo didn't end it sooner) than with the fact that it was done at all. I think Milgram's research was actually more questionable ethically. But those were also studies that established something important, and I'm not sure how much they differ from the brown eyes/blue eyes experiments (conducted on minors, no less) that demonstrated the effects of racism.

 

But her condemnation of Willowbrook? Totally deserved.


I strongly dislike the entire concept of Facebook. I really don't understand the charm of making contact with people I don't care enough to phone or email. I'm simply NOT interested in seeing the photographs of the grandchildren of someone who was in my first grade class and who I don't even remember.


According to PR Daily, the average Facebook user is getting older.

 

The average age of Facebook's 1.3 billion monthly active users was 29 in 2010 and is 30 now.

 

That's because college-aged kids are moving to Twitter and Instagram. Facebook is quickly becoming a place where people 30 and older keep up with college friends and neighbors.


Some college-age kids do use Twitter and Instagram more now. But all of my college-age and post-college-age Facebook friends have kept their Facebook pages in addition to Twitter and Instagram. I realize it is a very small sample. But the average age moving from 29 in 2010 to 30 now could be explained by many things. I understand that Facebook would rather the age went down a year, not up.


I strongly dislike the entire concept of Facebook. I really don't understand the charm of making contact with people I don't care enough to phone or email. I'm simply NOT interested in seeing the photographs of the grandchildren of someone who was in my first grade class and who I don't even remember.

 

EXACTLY. It's just like those inane people who send out those annoying Christmas letters at the end of the year telling you about their kids, their cancer, their trips ... if I don't know someone well enough to have heard about that stuff during the year, what makes them think I have any interest in hearing about it in a form letter at the end of the year?

 

Facebook is just a Christmas Letter online. Every day. Unless you're advertising a business (for which I understand FB can be good), I just can't understand the mentality of people who want to "keep in touch with people from all over the world" on FB. People they've never met, will probably never meet or met once in a bar in 1997. I think the whole concept is rather silly and rather narcissistic.

 

What's that line from California Suite? "I don't need a lifestyle, I have a life!"


A life posting negative thoughts on message boards is hardly an improvement on posting pics on Facebook. At least with the pics there is often something I want to see. Why be so critical of people who like to post? Facebook or here, you are still an internet poster.


I just can't understand the mentality of people who want to "keep in touch with people from all over the world" on FB. People they've never met, will probably never meet or met once in a bar in 1997. I think the whole concept is rather silly and rather narcissistic.

 

That may be true of people with 1,000-plus Facebook friends. But one can easily keep the number of friends down to a manageable number (100-150) and keep in contact with people who are real friends, whether they live in the United States or in another country. Mr. Miniver, you are very quick to jump on things that you know absolutely nothing about.


Playing devil's advocate here ... As far as content is concerned, Facebook is only as bad as one lets it be. It's possible to only befriend those one wants to and turn down (or not respond to) other friending invitations. (Admittedly, that can cause interpersonal problems, but that's an etiquette/boundary issue not confined to Facebook.) It can be a good way to keep up with people one wants to keep up with as things happen. I know someone who keeps in touch with high school friends and former colleagues that way, and much of the content he sees is lighthearted. (Think lolcats.) As I understand it, content one doesn't want to see can be hidden or blocked.

 

I prefer LiveJournal and Dreamwidth, where I know I can keep personal info to my circle of friends, which facilitates longer, more thoughtful discussion, and where we've chosen to be in touch with each other. The bond I have with them is such that I know I'd be likely to find a place to stay just about anywhere in the world where I have friends with a spare couch or a room, and have. On the other hand, LiveJournal and Dreamwidth are deader than this place is, and certainly deader than Facebook. I can go twenty-four hours or more without a change in my friends' list feed. Most of the people I know who used to be on LiveJournal or Dreamwidth but aren't anymore have moved to Tumblr, which in my experience skews younger than Twitter.


Playing devil's advocate here ... As far as content is concerned, Facebook is only as bad as one lets it be. It's possible to only befriend those one wants to and turn down (or not respond to) other friending invitations. (Admittedly, that can cause interpersonal problems, but that's an etiquette/boundary issue not confined to Facebook.) It can be a good way to keep up with people one wants to keep up with as things happen. I know someone who keeps in touch with high school friends and former colleagues that way, and much of the content he sees is lighthearted. (Think lolcats.) As I understand it, content one doesn't want to see can be hidden or blocked.

 

I prefer LiveJournal and Dreamwidth, where I know I can keep personal info to my circle of friends, which facilitates longer, more thoughtful discussion, and where we've chosen to be in touch with each other. The bond I have with them is such that I know I'd be likely to find a place to stay just about anywhere in the world where I have friends with a spare couch or a room, and have. On the other hand, LiveJournal and Dreamwidth are deader than this place is, and certainly deader than Facebook. I can go twenty-four hours or more without a change in my friends' list feed. Most of the people I know who used to be on LiveJournal or Dreamwidth but aren't anymore have moved to Tumblr, which in my experience skews younger than Twitter.

 

I have a post-college-age friend who lives in Paris and traveled to eastern Ukraine right after Russia took control of Crimea. He used all of social media, especially Twitter, to describe his experiences. He quickly learned who was following his daily tweets closely, so he also sent e-mails to those 18 or 19 people about long videos he posted on Facebook, in case we were not checking FB that day. That is just one example of how to use social media wisely, although I know his parents wish that he had never left Paris. Could we have gotten the same information from newspapers and CNN? Yes, almost, but not such personal information about the situation on the ground in the days when it appeared Russia was about to invade eastern Ukraine.


Frankly, Rich, your comments remind me of something my mother was very fond of saying: "If you can't say something nice about a person, better to say nothing." I, on the other hand, have always preferred Alice Roosevelt's line: "If you can't say something nice about a person, come sit next to me."

 

William M, I can't imagine having 100-150 friends. I want to keep in close contact with people I consider friends. I enjoy having morning coffee with them, joining them for lunch or dinner, entertaining them in my home. Even being retired, it would be, for me at least, near impossible to do so with 100-150 people. I host two Christmas parties in December that I consider large, with about 30 to 40 guests at each. Many of these people are not close friends but rather good acquaintances.


'Look Up': a spoken-word film for an online generation.

 

'Look Up' is a lesson taught through a love story, set in a world where we keep finding ways to make it easier to connect with one another, yet end up spending more time alone.

 

https://www.youtube.com/watch?v=Z7dLU6fk9QY


I had posted this film on my FB page some months ago when I first saw it.

 

There are those for whom social media (part or all of it) is simply a way of not engaging in life.

 

But there are others of us who are not controlled by social media and instead use it as a way of more efficiently reaching out to, contacting, sharing with, and being enriched by friends and acquaintances with whom we cannot engage face to face every day, or as much as we would like, because of geography, time zones, or work.


I strongly dislike the entire concept of Facebook. I really don't understand the charm of making contact with people I don't care enough to phone or email. I'm simply NOT interested in seeing the photographs of the grandchildren of someone who was in my first grade class and who I don't even remember.

LOL - I remember back in the day, on every sitcom and stand-up routine, the idea of hauling out vacation pictures and/or baby pictures was a punchline. The common response was to avoid at all costs! Ahh...less narcissistic times.

