“Married Couples Live Longer.” “Caffeine Could Impair Women’s Fertility.” “Walnuts Improve Reaction to Stress.”

For centuries, academic research has worked to solve real-world problems in fields such as medicine, economics, and science. But what happens when the research results we’ve depended on are flawed?

Earlier this year, an Excel error rocked the economics world when it came to light that a study released by Harvard economists Carmen Reinhart and Kenneth Rogoff contained a dramatic error. Reinhart and Rogoff had presented evidence that a debt-to-GDP ratio higher than 90% slowed economic growth, in contrast to the Keynesian economic policies that advocated increased spending to stimulate growth.

Economist Paul Krugman, in a New York Times op-ed, addresses the complex issues involved in understanding the implications of flawed academic research. After all, Reinhart and Rogoff’s paper was not simply an esoteric analysis, but a tool used to influence policy.

Europe, reeling in the throes of an economic crisis, took Reinhart and Rogoff’s findings to heart and pressed ahead with austerity measures. Results have not been favorable, and three years after their paper was first published, southern Europe is still struggling economically. Unemployment in Europe is at an all-time high, with Spain at an astounding 27.2%.

The staggering mistake was discovered when Reinhart and Rogoff, pressed by other economists about the anomalies in their findings, finally allowed another team of researchers access to their spreadsheet. It turned out they had accidentally left five countries out of an averaging formula. Corrected, their result of -0.1% became 2.2%, a growth rate not dramatically different from those found at lower debt-to-GDP ratios.
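To make concrete how a spreadsheet range error of this kind can flip a headline number, here is a minimal Python sketch. The growth figures are made up for illustration; they are not Reinhart and Rogoff’s actual data.

```python
# Illustrative only: ten hypothetical growth rates (%) for high-debt
# countries. These are NOT Reinhart and Rogoff's actual figures.
rates = [-7.9, 2.4, 0.3, -1.2, 1.0, 3.9, 2.8, 4.1, 3.2, 2.7]

def spreadsheet_average(values, last_row):
    """Mimic an Excel formula such as =AVERAGE(B1:B<last_row>) whose
    range was hard-coded to stop before the final rows."""
    return sum(values[:last_row]) / last_row

truncated = spreadsheet_average(rates, 5)      # last five countries omitted
full = spreadsheet_average(rates, len(rates))  # the intended calculation

print(f"Truncated range: {truncated:+.2f}%")   # -1.08%: "high debt kills growth"
print(f"Full range:      {full:+.2f}%")        # +1.13%: a much milder story
```

The point of the sketch is that nothing in the output flags the omission: both formulas return a plausible-looking average, and only access to the underlying spreadsheet reveals which rows were included.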

While Reinhart and Rogoff’s error was far-reaching, at least it was accidental. As noted in this 2011 New York Times article, prominent social psychology professor Diederik Stapel wished he could have said the same. On an evening in 2011, a close friend and colleague delivered the distressing news that Stapel stood accused of research fraud. In a case that stunned the Dutch academic community, the social scientist admitted to years of faking research, which has resulted in 53 studies being retracted to date. Psychologist Jonathan Schooler explains why this happens in the New York Times piece:

“The big problem is that the culture is such that researchers spin their work in a way that tells a prettier story than what they really found,” said Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “It’s almost like everyone is on steroids, and to compete you have to take steroids as well.”

It is undeniable that Stapel’s data manipulation was crooked and audacious. He became so confident in his deceit that he copied and pasted rows of data in order to produce the “correct” results, a shortcut that ultimately gave away his forgery. Yet the NYT profile also placed substantial blame on an academic community that creates an environment ripe for exploitation:

If Stapel was solely to blame for making stuff up, the report stated, his peers, journal editors and reviewers of the field’s top journals were to blame for letting him get away with it.

Stapel, ever the social scientist, has found a renewed interest in the source of his own demise. What caused him to act so unethically? What could have prevented him? A tell-all autobiography detailing his downfall, entitled “Derailment,” is already out.

The argument that all this could have been avoided with stricter enforcement is an interesting one, particularly because there is substantial research to the contrary. An economist studying prison sentences for juvenile offenders found no evidence that crime decreased as penalties stiffened. Moreover, in a fascinating experiment, a team of researchers applied game theory to the relationship between enforcement and crime, and discovered that bounded rationality explains why the relationship between criminal activity and the severity of punishment is so tenuous: enforcers reduce inspections when penalties increase, so the net effect on crime is neutral.
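One way to see why this can happen in equilibrium is a textbook mixed-strategy inspection game; this is a sketch of the general mechanism, not necessarily the model used in the study cited above. Suppose the offender gains g from an undetected crime and pays penalty p if caught, while inspecting costs the enforcer k, yields r when it catches a crime, and an undetected crime costs the enforcer h. Each player mixes so as to leave the other indifferent:

```latex
\[
\text{Offender indifferent between crime and no crime:}\quad
g(1 - q) - p\,q = 0 \;\Longrightarrow\; q^{*} = \frac{g}{g + p}
\]
\[
\text{Enforcer indifferent between inspecting and not:}\quad
-k + c\,r = -c\,h \;\Longrightarrow\; c^{*} = \frac{k}{r + h}
\]
```

The equilibrium crime rate c* contains no p at all: raising the penalty only lowers the equilibrium inspection rate q*, which is precisely the “enforcers reduce inspections” effect described above.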

If penalties play a small role in wrongdoing, where does that leave us with respect to the academic community? A Science article about the meaning of the Stapel fraud for the social science community offers this insight:

It is almost inconceivable that co-authors who analysed the data intensively, or reviewers of the international “leading journals”, who are deemed to be experts in their field, could have failed to see that a reported experiment would have been almost infeasible in practice, [or] did not notice the reporting of impossible statistical results.

Perhaps we would do well to remember the bounds of our own rationality as we parse new studies, and to recognize that as readers we have a responsibility to think critically about the new information presented.

In closing, we share a few pieces of fiction that lend an additional angle to the many questions raised in this curation: Vladimir Nabokov’s “The Vane Sisters” as well as the film School Ties.




Image credit: Patricia Drury via flickr

About The Author


Anna Redmond is the author of The Golden Arrow, a fantasy political thriller which draws on historical traditions of holy sex to create a society where women use sex for magic and power. She is also curator and co-founder of Hippo Reads.