The revelation that UCLA graduate student Michael LaCour faked the relevant data of a major same-sex marriage persuasion study last winter has been followed by growing suspicions about his academic credentials and prior research.
To this we may now add another entry: a journalist and some scientists deliberately set up the least credible study they could and pitched the clickbait-y results — that chocolate helps you lose weight — to international media outlets with regrettable success.
LaCour’s fabrication was high-level statistical fabulism and the chocolate study a pop-culture hoax, but both point to a problem with a media landscape infatuated with data. To be sure, this is hardly the first time studies, false or otherwise, have permeated mainstream media; an article from five years ago rued their spread as an already tiresome phenomenon, and it wasn’t the first of its kind.
But data journalism has changed the premium we place on studies, largely through its implicit belief that numbers hide truths about human behavior, experimentally verified truths that cut through conventional narratives and ideological preconceptions. You may passionately believe X, or have always known Y, but This Study definitively tells you otherwise. The combination of this faith in the data with an internet that increasingly relies on affirmative clicks via social sharing makes for headlines like this:
That article is based on LaCour’s research. (It has no correction attached.) I’m not picking on that one post; it’s merely indicative of an abstract-to-Facebook funnel of which LaCour’s globally trumpeted study is only the most disappointing example.
The common response to this is to demand that journalists better verse themselves in statistical methodology. No doubt, but this misses the point in several ways. Not even journalists and internet content producers with research backgrounds are going to catch every hidden methodological flaw in sophisticated data sets while publishing at internet speed. LaCour’s data was faked, an extreme example, but more common are studies interpreted beyond what their data show; the Strange Case of the Cancer-Curing Fart will suffice as an example. Many of these stories begin life in science sections, suggesting that subject immersion isn’t the problem.
More pointedly, the models used by studies have grown so complicated some question whether the researchers themselves entirely understand them. Via political scientist Tim Groseclose:
I also believe that there are lots of similar, yet so far undetected, cases like LaCour’s in political science. Over the past five or ten years I have noticed more and more papers written by young political scientists (grad students and assistant professors) that claim to use extremely fancy and complex statistical techniques, yet the authors do not seem to fully understand the techniques that they claim to use. Their descriptions of their statistical methods are often as opaque as the LaCour appendix that I discuss above.
Again, LaCour’s malfeasance is rare, but its extremity illuminates the wider problem: the net-spawned need to interpret and then reduce bogglingly complex studies into sharable, clickable headlines. Many of the wonkier sites, like the New York Times’ Upshot and the Washington Post’s Wonkblog, approach data with a skeptical eye; NYMag’s Science of Us, for instance, was measured in its original write-up of LaCour’s study and has been avid in pursuing him since. But with each share and each aggregation, nuance is lost.
By the time they show up on your Facebook wall or Tweetdeck feed, the data have become firm statements of world-ideological truths. And the more attractive those truths are to the clicking audience, the more likely we are to encourage them. Which is to say that LaCour’s study wouldn’t have spread so far so fast had it disproven that contact with people makes us more empathic toward them, a far less salutary result. His data traveled because it seemed to prove what we wanted to believe was true. In the end, we fell for the data out of the exact flaw the data was meant to correct.
[Image via screengrab]