Facebook experiment - okay, dodgy, should be illegal, or what?

Article on BoingBoing re. a social psychology experiment done on Facebook.

I’m in two minds about it - I mean, it’s cool in a way, but on the other hand, don’t people go on Facebook looking for something that’s private between friends? Manipulation of this kind seems dodgy to me, even if it’s for a good purpose. It’s grey, but it’s a darker shade of grey than is comfortable.

Nope, fucked up and an abuse of their customers/users. Certainly not considered ethical in the research sense, in that informed consent was not offered, even after the fact.

I know it is their platform and they can do what they want with it, but it shows a disturbing lack of respect, and furthermore, who knows where this kind of experiment may lead in the future. I am glad my FB footprint is as minimal as possible (I only have a barebones profile for connecting with old friends).

Here’s the link I sent to some friends earlier today, it has a bit more depth and has been kept up to date as more info has surfaced. Note the initial reports that this had some Federal funding behind it (in particular the Army Research Office) have since been retracted by the University. Make of that what you will.

Definitely not illegal, but I would call it unethical. People need to be informed of this research.

Cool research though, adds a good study with a significant sample size to the theory that emotional states are transferred from person to person, especially through social networks that are as easily tracked as Facebook.

This has a simple fix though.

Facebook just adds a “research” option to their profiles. People can opt in to research projects in the future (kind of like the Steam beta opt-in). If you click yes, you open your profile for research purposes. I would be ok with that, and I would probably opt in.
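A rough sketch of what that gate could look like, in Python (all names here are hypothetical; this obviously isn’t Facebook’s actual code):

```python
import random
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: int
    research_opt_in: bool = False  # off by default; the user flips it on

def enroll(profiles, fraction=0.25, seed=42):
    """Randomly sample a fraction of users, drawing only from opt-ins."""
    rng = random.Random(seed)
    eligible = [p for p in profiles if p.research_opt_in]
    return rng.sample(eligible, int(len(eligible) * fraction))

# e.g. 30 users, every third one opted in -> 10 eligible, 2 enrolled
users = [Profile(i, research_opt_in=(i % 3 == 0)) for i in range(30)]
print([p.user_id for p in enroll(users)])
```

The key property: eligibility is checked before randomization, not after.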

Knowing Facebook, it would probably end up being an opt-out button hidden under some random settings.

True, it is probably already in there somewhere.

I dunno about “definitely,” given that there is a law covering this kind of human-subjects research.

I think we can all agree on the ethics of it.

Well… okay and not okay.

They collected “historical data,” which is exempt from review by the IRB. Basically, if anonymized data already exist and you want to mine it, by all means do. How historical is it, though? I mean, was it posted 0.5s ago? The article says:

The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. … Participants were randomly selected based on their User ID, resulting in a total of ∼155,000 participants per condition who posted at least one status update during the experimental period.

So, there was an experiment condition and a control condition, in real time, mapped to a user ID which is unique and very identifiable – not anonymized. However:

no text was seen by the researchers.

Okay, so it’s anonymized text mapped to a positivity score mapped to user IDs. As someone with a Facebook user ID, I’d want to know how they’re keeping their data. I mean, I’d want to know if it’s being kept on an encrypted drive on some schlub’s computer vs posted on a public site for anyone to see.

Moreover, I’d want to know if I were one of the “randomly selected participants” whose news feeds were being manipulated.
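As an aside: the paper doesn’t say how that selection worked, but “randomly selected based on their User ID” usually means deterministic bucketing. A hypothetical Python sketch (the names and hash scheme are my guesses, not anything from the paper):

```python
import hashlib

# The four conditions described in the paper: reduced positive exposure,
# reduced negative exposure, and a control for each.
CONDITIONS = ["reduced_positive", "control_positive",
              "reduced_negative", "control_negative"]

def assign_condition(user_id: int, experiment: str = "emotion-study") -> str:
    """Deterministically map a user ID to a condition.

    Hashing (experiment name + ID) gives a stable, roughly uniform split
    with no per-user lookup table to store.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return CONDITIONS[int(digest, 16) % len(CONDITIONS)]

print(assign_condition(12345))  # same ID -> same condition, every time
```

The upside of hashing is reproducibility; the downside, privacy-wise, is that the assignment is trivially recomputable from the user ID, which is exactly why “mapped to a user ID” is not anonymized.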

They thought they’d piggy-back on Facebook’s terms of use:

As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.

HOLY SHIT. Not okay. If you ask any one of those 689k participants if they remember having consented to this research, I’m pretty sure most would say no.

The point of an IRB isn’t to prevent you from doing your research. It’s to keep people safe, and that includes keeping their identities safe. While self-policing seems like a good idea (and hell, it’s faster!), going through an IRB even for exemption is the way to go.

Jesus.

Edited to add: In addition to the risk of participants being identified because we don’t know the data storage plan, there’s the very real risk of emotional effects from loading someone’s news feed up with nothing but drama. Folks participating in the experiment need to know this up front, and they need to be debriefed at the end. That’s just sloppy. I hope these guys rot in hell and get their funding pulled.

So what did they do exactly? The study is not clear. Block messages with “emotional words”? Like what, birth announcements?

Facebook was sketchy as fuck before, now doubly so. Glad I left years ago. I hope the media picks up on this big time and lambasts them.

They basically filtered users’ news feeds using word-matching algorithms to classify material as positive or negative, in an attempt to determine whether it then had an effect on those users’ status updates, thus testing the idea of emotional transference in large groups of people. I.e., lots of negative updates in a user’s feed led to more negative status updates and posts from that person. All without the users’ consent and without informing them afterwards. Guinea pigs in a probably legal but unethical research study.
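For concreteness, a toy Python version of that filtering step (the real study reportedly used the LIWC dictionaries; these word lists and the omission probability are invented for illustration):

```python
import random

# Toy word-match classifier. Real dictionaries are far larger.
POSITIVE = {"happy", "love", "great", "awesome", "congrats"}
NEGATIVE = {"sad", "angry", "hate", "awful", "terrible"}

def classify(post: str) -> str:
    words = set(post.lower().split())
    if words & POSITIVE and not words & NEGATIVE:
        return "positive"
    if words & NEGATIVE and not words & POSITIVE:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="negative", omit_prob=0.5, seed=0):
    """Randomly drop a fraction of posts in the suppressed class."""
    rng = random.Random(seed)
    return [p for p in posts
            if classify(p) != suppress or rng.random() >= omit_prob]

feed = ["so happy for you!", "this is awful", "lunch was fine"]
print(filter_feed(feed))  # the "awful" post may or may not survive
```

Dictionary matching like this misclassifies negation (“not happy”) and sarcasm, which is one reason to be skeptical of how well the measured effect reflects actual emotion.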

The kind of research that in the future, on a suitably large platform (such as Facebook), could be used for all sorts of uncool things to help influence and direct group thinking. Advertising, politics, propaganda, subliminal messaging, and who knows what else.

Not a platform that can be trusted, IMO.

The only difference between this and any other A/B test done to you dozens of times a day is that this one was published publicly. This happens all the time, to everyone, by everybody, and you don’t know it and the data never gets released.

The slippery slope is humongous here, especially from a company with as much data as Facebook. But there’s no legal violation and no ethics problem either.

I hereby dub these experiments FBUltra.

They manipulated feeds in order to test for an emotional response in a random sampling of users, possibly including minors. Minors are a protected class; you’re not supposed to do experiments on them without special handling. Facebook users must be at least 13 years old to have an account, but you don’t count as an adult until 18.

Maybe a little dated, but here you go:

So, every time Amazon, Google, or Zynga does an A/B test on their website, they should check to see if each user is under 18?

This is more involved than a “what shade of green should this button be to maximize clickthrough” test. There’s a line somewhere, and I think we all agree that that kind of A/B test is on the acceptable end. I think we also agree that the line is before “Stanford Prison Experiment” territory.

What I’m arguing is that (from an ethics standpoint), Facebook’s experiment was also on the acceptable side of that line, even if it was atrocious PR that anyone awake at HQ should’ve seen coming.

Actually, I’m not sure that A/B testing without informed consent is on the acceptable side of that line, for university researchers, if conducted before an IRB sees the proposed experiment. I thought most institutions expected researchers to comply with the National Research Act and the Declaration of Helsinki as a matter of ethics, even for work that is not federally funded. Neither of those allows a researcher to make a determination of “no foreseeable harm” themselves; an IRB is supposed to determine whether an experiment falls outside its purview.

I’d agree that a commercial entity doing A/B testing for optimization is doing nothing wrong, and I think the Facebook-affiliated authors/Facebook have not done anything wrong. There is some kind of limit, though: if A/B testing involves my bank rounding my checking account balance down before displaying it to me, I think I’d get upset once I noticed :)

“No legal violation”? You can’t possibly know that for sure. I’d be surprised to hear actual legal scholars making that kind of assurance.

Technically legal*

*in some countries of the Americas

No ethical problem at all, huh? This happens all the time, huh? Thanks Facebook PR.

Manipulating people’s private pages without their consent in order to fuck with them emotionally and record the results is a huge breach of ethics. Machiavelli would be proud.

Give me another example of a company fucking with my personal information and communications from my friends and family in order to emotionally manipulate me, please. Testing button placement and website layout is obviously a whole other ballpark.

Also, looks like the British government will be investigating them. And not only did they not apologize, they stood their ground and told everyone to get over it. Shame on you, Facebook. Reinforces my decision to leave two years ago.

Not a legal scholar, but I’m well versed in ethics regulations. Malderi is probably correct, from the details I’ve read about this case, that there’s likely no legal issue (the only issue I could see is that they possibly included kids; they could also have been easily excluded, so I can’t be sure). As for “no ethics problem,” depending on numerous seemingly inconsequential factors, it may also not be in violation of any ethics regulations. That doesn’t mean there’s no ethics problem in my book, but those are two different matters to me.

Yeah, I should clarify, of course. I’m not a lawyer, nor am I an ethicist. Nor Facebook PR, for that matter. Just a guy on the Internet having a good discussion.

We all agree that this was a terrible, terrible move on Facebook’s part, and the current backlash is all the proof you need of that. But I still don’t see the actual legal or ethical violation, grandstanding prosecutors with probable upcoming elections notwithstanding.

What’s an ethics regulation? I thought those were called “laws”.