In any modern, first-world country, the government requires legitimate university researchers to go through an institutional review board (IRB) when conducting research on human beings. This is due to past abuses by both governments and organizations that have used the guise of “seeking knowledge” to cover up their efforts to manipulate people for their own ends.
But you know what? Facebook isn’t a first-world country. So in their effort to understand how best to monetize your use of their service, they don’t need an IRB’s approval to conduct research on you.
And now, in my opinion, they are outright mocking their users with their latest update to their research practices.
The latest controversy started when Facebook “data scientist” Adam D.I. Kramer published a study that manipulated people’s Facebook newsfeeds without their knowledge or consent. Kramer issued a non-apology after the incident, and three months of total silence from Facebook followed.
Today, Facebook finally broke their silence and wrote this blog entry about their updated research policies.
Nowhere in the entry is the word “ethics” mentioned. This alone demonstrates that Facebook still doesn’t understand what it did wrong with its previous research, and has little insight into how social science research is completely different — for a reason — from computer science research. Facebook still has no ethics officer.
So what did Facebook actually change? Will new research have to go through an IRB (or whatever an equivalent would be in a for-profit company looking to maximize the monetization of its users)?
As it turns out, not much.
Now, some research that is subjectively determined to need “additional review” will undergo… additional review. By others at Facebook.
I guess nobody at Facebook has heard the phrase “conflict of interest”?
Well, at least I’m sure they’re now going to require the informed consent of their users to participate in human subject experiments, right?
Nope. Facebook knows that few would willingly give their consent to be manipulated by their algorithms and at the whims of their “data scientists,” so they still won’t bother to ask you for consent.
So what will Facebook change? Well, they’ve added “clearer guidelines,” because apparently Facebook had few (or unclear?) guidelines previously. Apparently anyone at Facebook could do anything they wanted research-wise, manipulating users’ experiences whenever and however they wanted. They’ve also added “training,” because, again, apparently Facebook let people with little or no training in human subjects research conduct experiments on its users. That means you, you poor unsuspecting datapoint.
The new internal review panel has a bunch of people on it, but again, it’s not clear that anyone on it actually has a background in, or familiarity with, social science research and ethics. Since the entry never mentions ethics, it’s likely Facebook still doesn’t quite “get” what it did wrong. At least that’s what I take away from this wonderful marketing-speak blog entry.
And you have to love this last swipe at every Facebook user today:
Like most companies today, our products are built based on extensive research, experimentation and testing. […]
We want to do this research in a way that honors the trust you put in us by using Facebook every day.
What they don’t mention is that most companies today that build their products on “extensive research, experimentation and testing” do so under explicit test conditions clearly communicated to their users.
Do you think P&G goes out and changes the formulation of its most popular products on unsuspecting consumers without getting their consent first? Do you think GM or Ford tests changes to their vehicles’ suspension or ergonomics on people who’ve just plunked down $30,000 for one of their cars? Do you think Target changes all of its pricing just for you when you walk into the store, to see how you might react?
The point being, companies do indeed conduct testing and research on their products — but only with the customer’s explicit and informed consent. They don’t do it just to test hypotheses at random.
Nothing in Facebook’s new research policies asks users for their explicit, informed consent to participate in a human subject experiment. And there remains no independent review of their human subject research — nothing even close to a university IRB.
In sum, the new Facebook research policy mocks its users. You, apparently, are nothing more than a marketing datapoint to Facebook, not deserving of the same informed consent any normal research participant would be entitled to. Facebook has made it clear, at least to me, that they are not interested in becoming a legitimate member of the research community. Instead, they appear to be happy to remain “data scientists” — and you are simply the data. (And naturally, there’s nowhere to comment about this blog entry on Facebook. For a social networking company, Facebook ironically appears to dislike any actual social networking with itself.)
Read the new research policy: Research at Facebook
7 comments
Using Facebook as a means of communication and electronic data consumption constitutes complete consent – if an individual does not consent, the individual should not use Facebook. Facebook is unlike other companies in that it provides no tangible product to its users. USERS – not customers.
Facebook provides a tangible service to its users — that’s true. Otherwise people wouldn’t use it.
But whether you call them users or customers, the fact remains that those people are human beings. And laws protect human beings who are the subjects of experimental research in ANY SETTING. It doesn’t matter whether it’s online or not, or whether money changes hands or not.
The point is that Facebook still seemingly does not understand the foundations of what constitutes informed consent and why research ethics matter. Even more so in today’s always-on, connected world.
I respectfully disagree with you, Dr. Grohol, when you state, “But whether you call them users or customers, the fact remains that those people are human beings.”
Not in a medium that does not have consistent and responsible accountability. It dumbs down the interaction to just being cogs in a giant wheel, and the true physicality of humanity is lost.
The system just makes the interactions “bits” and electronic contacts. I would hope you would agree, at least to some level, that the term “friends” has been so trivialized and marginalized by this medium that humanity has been lost in many of the interactions on Facebook.
So, if my premise has some validity, why on earth would Facebook’s rulers think they have accountability and responsibility to their “users”?
Ironic how some of the dialogue in the movie “TRON” has applicability 30 years later. As long as we let the media of the Net be seen as legitimate content without challenge or demand for physical personal defense of what was written/published, the lie continues to sell with impunity.
Which I believe reinforces my hypothesis that the Net allows narcissism and antisocial agendas to propagate exponentially!
Joel Hassman, MD
board-certified psychiatrist
NOT a Facebook or Twitter user!
To be fair, Target carries out hundreds of tests. They move things around in a store, they change the prices, they change the music, they change the lighting… all to see if their customers will buy more.
P&G carries out tests by trying different packaging, different bottle shapes.
GM changes their car options and prices.
The ethical question should be, for each of Facebook’s “user tests”: Could they have a measurable effect on their users’ wellbeing? Normally, no. They’re just moving the navigation bar around to see if it makes you click more.
The ethical failing with Facebook’s recent “user test” was that it intentionally tried to make some people more miserable by showing them only the negative things that their friends had said.
That’s why Facebook needs an ethics panel. Because it’s quite possible that their studies could affect the wellbeing of real people.
Virtually all companies that treat their customers with a modicum of respect also use focus groups — not everyday customers — to test their ideas on.
Companies that deal in the online world too often skip the focus group. And as you said, this is fine when the only testing they’re doing is on usability — how to make their website more friendly, easier to use, etc.
But as you pointed out, when the manipulation is done to purposely impact the emotional state of a company’s users without their knowledge or consent — that’s where the ethical lapse occurred.
Without a truly independent IRB-equivalent panel that has outsiders on it, Facebook has done the bare minimum it could do in order to address this problem. It’s no wonder more and more of its users are already looking for alternative social networks…
Wow! Excellent blog. Agree 100%… Especially after having to actually take the research guidelines tests and certification. For what? So I can be more obsessive about rules and guidelines that really mean nothing. They can do all the research they want, but affecting the well-being and emotional state of users without their knowledge is the root of what we aren’t supposed to do. I wonder how many crimes, fights or suicides happened as a result. Don’t let the masses catch wind, or there may be a class action suit… That’s where all change starts in this country. Please pardon the sarcasm.
It’s very simple – we’re not the user, we’re the product.