
Posts Tagged Facebook

Did Facebook and PNAS violate human research protections in an unethical experiment?


Ed. Note: See the addendum.

I daresay that I’m like a lot of you in that I spend a fair bit of time on Facebook. This blog has a Facebook page (which, by the way, you should head on over and Like immediately). I have a Facebook page, as do several of our bloggers, such as Harriet Hall, Steve Novella, Mark Crislip, Scott Gavura, Paul Ingraham, Jann Bellamy, Kimball Atwood, John Snyder, and Clay Jones. Facebook is a ubiquitous part of life and arguably part of the reason for our large increase in traffic over the last year. There are many great things about Facebook, although there are a fair number of problems as well, mostly having to do with privacy and with automated moderation scripts that can be easily abused by cranks like antivaccine activists to silence the skeptics refuting their pseudoscience. Every Facebook user should also realize that Facebook makes most of its money through targeted advertising; the more its users reveal about themselves, the more precisely Facebook can target its ads.

Whatever Facebook’s virtues and flaws, however, there’s one thing I never expected the company to be engaging in: unethical human subjects research. Yet if stories and blog posts appearing over the weekend are to be believed, that’s exactly what it did, and, worse, the research isn’t even very good. The study, entitled “Experimental evidence of massive-scale emotional contagion through social networks”, was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS); its corresponding (and first) author is Adam D. I. Kramer, who is listed as part of the Core Data Science Team at Facebook. Co-authors include Jamie E. Guillory of the Center for Tobacco Control Research and Education, University of California, San Francisco, and Jeffrey T. Hancock of the Departments of Communication and Information Science, Cornell University, Ithaca, NY.

Posted in: Clinical Trials, Computers & Internet, Neuroscience/Mental Health, Science and the Media


Facebook’s reporting algorithm abused by antivaccinationists to silence pro-science advocates

This is not what I had wanted to write about for my first post of 2014, but unfortunately it’s necessary—so much so, in fact, that I felt obligated to crosspost it both here and on my not-so-super-secret other blog in order to get this information out to as wide a readership as possible.

I’ve always had a bit of a love-hate relationship with Facebook. On the one hand, I like how easily it lets me stay in contact with family and friends across the country, people whom I would rarely see more than once or twice a year, if even that. On the other hand, I have the same privacy concerns that many other people have with respect to putting personal information, as well as pictures and videos of myself, family, and friends, onto Facebook. Now that I’ve become a (sort of) public figure (or, as I like to refer to myself, a micro-celebrity), I’ve thought that I should cull my friends list to just real friends with whom I have a connection (or at least have met in person or had private e-mail exchanges with) and set up a Facebook page for my public persona, to prevent people whom I don’t know or barely know from divebombing my wall with arguments. As I tell people, I don’t want obnoxious arguments on my Facebook wall; that’s what my blogs are for.

My personal social media preferences aside, Facebook does indeed have many shortcomings, but until something else comes along and captures the same cachet (which is already happening as teens flee Facebook to avoid their parents), and even after, Facebook will remain a major player in social media. That’s why its policies matter; they can matter a lot. I was reminded of this about a week ago when Dorit Reiss (who has of late been the favored new target of the antivaccine movement, likely because she is a lawyer and has been very effective thus far in her young online career opposing it) published a post entitled Abusing the Algorithm: Using Facebook Reporting to Censor Debate. Because I also follow some Facebook groups designed to counter the antivaccine movement, I had already heard a bit about the problem, but Reiss laid it out in stark detail. Basically, the merry band of antivaccinationists at the Australian Vaccination Network (soon to be renamed, its name being so obviously deceptive for the most prominent antivaccine group in Australia that the NSW Department of Fair Trading ordered the group to change it) has discovered a quirk in the algorithm Facebook uses to process harassment complaints and abused that quirk relentlessly to silence its opponents on Facebook.

I’ll let Reiss explain:

Over the weekend of December 21-22, an unknown person or persons used a new tactic, directed mainly at members of the Australian organization “Stop the Australian Vaccination Network” (The Australian Vaccination Network – AVN – is, in spite of its name, an anti-vaccine organization – see also here; SAVN had been very effective in exposing their agenda and mobilizing against them). In an attempt to silence pro-vaccine voices on Facebook, they went back over old posts and reported for harassment any comment that mentioned one person’s name specifically. Under Facebook’s algorithm, apparently, mentioning someone’s name means that if the comment is reported it can be seen as violating community standards. Which is particularly ironic, since many commentators, when replying to questions or comments from an individual, would use that individual’s name out of courtesy.

Several of the people so reported received 12-hour bans. Some of them in succession.
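Facebook’s actual moderation pipeline is proprietary and unpublished, but the failure mode Reiss describes can be sketched as a naive rule: a comment that both gets reported and mentions a person’s name is automatically treated as harassment. The sketch below is purely hypothetical illustration (the function names, the name list, and the 12-hour figure are assumptions drawn from the quoted account, not Facebook’s code); it shows why mass-reporting old, courteous comments is enough to trigger bans:

```python
from datetime import timedelta

# Hypothetical reconstruction of the naive rule described above:
# reported + mentions a targeted person's name => treated as harassment.
# This is NOT Facebook's actual algorithm, only an illustration of it.

TARGETED_NAMES = {"dorit reiss"}  # names the mass-reporters search for


def violates_community_standards(comment_text: str, was_reported: bool) -> bool:
    """Naive check: a comment only trips the rule if it was reported
    AND it contains a targeted individual's name."""
    if not was_reported:
        return False
    text = comment_text.lower()
    return any(name in text for name in TARGETED_NAMES)


def penalty(comment_text: str, was_reported: bool):
    """Return a temporary ban (12 hours, per the quoted account) if the
    rule trips, otherwise no penalty."""
    if violates_community_standards(comment_text, was_reported):
        return timedelta(hours=12)
    return None


# A polite reply that uses someone's name out of courtesy earns a ban
# the moment an opponent digs it up and reports it:
polite = penalty("Thanks for the citation, Dorit Reiss!", was_reported=True)

# ...while a genuinely hostile comment that names no one sails through:
hostile = penalty("you are all paid shills", was_reported=True)
```

The design flaw the sketch exposes is that the rule keys on a superficial signal (a name plus a report) rather than on whether the comment is actually abusive, so anyone willing to file reports in bulk controls who gets banned.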


Posted in: Computers & Internet, Public Health, Vaccines
