I like my privacy. I live in the woods. According to my husband, I am the “most paranoid, non-institutionalized” person he has ever met. Needless to say, I don’t have a Facebook. (However, also needless to say, I am a huge online consumer and have a little mental escape clause in my brain for that kind of intellectual inconsistency.) Famously (“famously” in my family, I mean), I was on Facebook for a few weeks and quit in a huff after posting a manifesto about privacy issues, time-sink issues, and so on. I posted it thinking I was still on Facebook, and no one ever saw it, because I had taken myself off Facebook in that very instant by deleting my account (even though you can’t really do that in any ultimate sense, as we all know). Exactly one person saw my screed: one person who happened to be with me, virtually speaking, in that exact instant. My manifesto has become part of our family lore. ‘Mommy can’t use Facebook and is mad at people who can.’
My husband, for example. He is one of those evil social scientists conducting research using Facebook data. We’ve had some interesting conversations over the last couple of days about the media storm over the “unethical” use of Facebook data to “manipulate” people’s emotions. In my marriage, when one of us disagrees heartily with the other, we sometimes crib a line from the Coen brothers’ movie Fargo (and you have to say this line in the strongest possible, most caricatured Minnesota accent imaginable, otherwise it’s just not funny): “I’m not sure I’m a hundred percent in agreement with your police work, Lou!”
So, that’s what I’ve been saying to my husband these past couple days:
I’m not sure I’m a hundred percent in agreement with your police work, Nicholas!
Here are my husband’s arguments about why it is okay, at least in theory (though not necessarily in this particular case), for social scientists to conduct research using massive commercial data sets like Facebook’s:
- Scientists have to go through review by an independent ethics board (an institutional review board, or IRB), as is required of human-subjects research in the United States. They publish their work in the name of transparency. Scientists are now vilified for using big data for the public good, while advertisers and other corporations do the same stuff, but worse, all the time, and no one balks at their profit motive and sneakiness. The U.S. has some of the weakest consumer privacy protections in the world, and no one seems to care.
- We’re shocked, shocked to discover gambling in this establishment! Guess what? I repeat: corporations do this every minute of every day. They are watching (and screwing with) us 24/7. It’s called A/B testing, which, in my public health days, we called a “randomized controlled trial.” They are also in the business of “manipulating” our emotions. Do you like the American Apparel ads with attractive, nearly naked young models, or would you prefer to see me in a thong?
- Facebook users are not the customer. You signed away your rights. You are using a free service. Do you believe in unicorns?
- This study is a lightning rod for more legitimate unease about the rise of corporate power and the algorithms in our lives.
- The scientific design was robust.
- The study had minimal risk (and therefore qualified by law for a waiver of individual consent).
- Media alert: researchers get waivers every day for arguably far scarier-sounding – but not actually scary – privacy violations (medical records, educational records, insurance records, etc.). How do you think we know anything about what makes people healthy, what makes people die, what makes children succeed or fail in school? The researchers get waivers because an independent body determines that they are not in fact violating people’s privacy or harming them.
- The effects of the study were small because the “stimulus” was small. People posted one extra word per thousand as a result of the intervention. Come on, folks! Calm down.
- This doesn’t mean the study was silly and pointless, however. Read the paper! Scientifically, the study is still important because it measured a weak stimulus on a massive population. The analogy: a floodlight casts a blinding light on a person standing next to it but only a very faint light on someone a mile away, so if you can detect even that faint glow from a mile off, you know the light is powerful up close. Likewise, detecting a tiny effect from a weak stimulus suggests a stronger stimulus would have a stronger effect. In other words, the study is not as stupid as people claim. Neither is it as dangerous. (You can’t have it both ways, media carpers!)
- Some of the critics invoked, sure enough, Nazis, but here the problem is that there has been a mission creep in ethical review boards from legitimate protection of subjects (based on egregious cases like the Tuskegee experiments) to a “make-work program for ethicists” (my husband’s words, not mine). Psychologist Dan Gilbert and colleagues have an amazing study in which subjects were asked to remove purple marbles from a collection of blue marbles. Ten percent of the marbles were purple. But as the researchers reduced the percentage of purple marbles from ten down to nine, eight, seven, and so forth, the subjects continued to “find” purple marbles that weren’t actually purple, and they still removed ten percent of the marbles. Just because. Ethicists are doing the same thing these days with these kinds of outraged responses. They seem to find a fixed number of “purple marbles” in studies even though the protections for subjects are incredibly rigorous.
- Why do we vilify science and revere big business? We need to promote more, not less, scientific inquiry. Big data offer possibilities to solve major societal problems if we can engage honestly in debate and not run screaming at the word “science” or put our heads in the sand when corporations run roughshod over our rights. Yada yada…
Here are my arguments:
- Um, well, uh… feels creepy…
- Did I mention it feels creepy?
- Scientists should be held to higher standards.
- Scientists are so arrogant! I don’t like them…They’re mean nerds.
- Did I mention it makes me feel creepy?
- “I’m not sure I’m a hundred percent in agreement with your police work, Lou!”