
Public Administration Scholarship and the Epidemic of Academic Fraud and Dishonesty Part 1 – Symptoms of a Cataclysm

The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.

By Erik Devereux
October 27, 2023

This column is the first in a series on the epidemic of academic fraud and dishonesty being revealed on a daily basis. While the recent accusations leading to the resignation of the president of Stanford University, and the accusations of dishonesty leveled at a Harvard University expert on dishonesty (you could not make that up), have captured headlines, there is a much more subtle and widespread problem afflicting the disciplines rooted in social science research methods. This column sketches the outlines of that widespread fraud and dishonesty in the use of those research methods and then begins to examine the powerful incentives that have developed over the past 30 years to fuel such behavior among academics.

Several years ago, the Association for Public Policy Analysis and Management (APPAM) hosted a special conference in Washington, DC, at which some young scholars from the West Coast presented an eye-opening paper on “p-hacking.” The “p” in question is the probability value associated with a test of statistical significance. Right away, I am guessing that those reading this column already are thinking about the magical “p = 0.05” commonly used as the threshold for publishable research using associational statistics. Just as a refresher, that threshold means a relationship as strong as the one observed in a dataset would have no more than a 5 in 100 chance of arising from pure randomness if no true relationship existed. The 0.05 threshold is completely arbitrary and actually is quite generous to the researcher compared with the much tighter thresholds used in fields like particle physics (anyone in the social sciences care to go for five sigma?).
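
As a concrete aside, here is a minimal simulation, my own illustration rather than anything from the column or the APPAM paper, of what that 0.05 threshold actually buys you: when two variables are truly unrelated, roughly 5 in 100 significance tests will still come out “significant” by chance alone.

```python
# A minimal sketch (illustrative only): simulate many "studies" in which
# x and y are truly unrelated and count how often a standard correlation
# test still reports p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_obs = 10_000, 100

false_positives = 0
for _ in range(n_studies):
    x = rng.normal(size=n_obs)
    y = rng.normal(size=n_obs)         # generated independently of x: no real relationship
    _, p_value = stats.pearsonr(x, y)  # two-sided test for a linear association
    if p_value < 0.05:
        false_positives += 1

print(f"Share of 'significant' results under a true null: {false_positives / n_studies:.3f}")
# Expect a value near 0.05: that is all the threshold promises.
```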

The APPAM conference paper showed that an improbably high percentage of published papers across a wide range of social science journals report a p value of exactly 0.050 rather than, say, 0.047 or 0.039. This spike in p values occurs because of academic fraud and dishonesty. Knowing that they must achieve that magic 0.05 to be published, researchers using statistical methods manipulate their data and their models until they get there. If they were smarter, they would not stop at 0.05 but would push for values well below it. Most often they do not, hence the “spike” in published studies at the 0.05 level. I note that the APPAM conference paper provided analysis identifying the worst offenders among fields and academic journals, and suggested the same approach could easily identify the worst offenders among individual researchers.
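
To see why a pile-up of reported p values right at the threshold is so damning, consider the sketch below. It is my own hypothetical illustration, not the APPAM paper's method: it mimics one common form of p-hacking by slicing the same null data into subgroups and “publishing” whichever comparison first clears p < 0.05, which inflates the false-positive rate far beyond the nominal 5 percent.

```python
# A hypothetical sketch of p-hacking by subgroup fishing (assumed scenario,
# not the APPAM analysis): keep testing slices of data with no true effect
# until something crosses the 0.05 threshold, then report that result.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_obs, n_subgroups = 5_000, 200, 10

hacked_hits = 0
for _ in range(n_studies):
    x = rng.normal(size=n_obs)
    y = rng.normal(size=n_obs)                    # no true relationship anywhere
    groups = rng.integers(0, n_subgroups, n_obs)  # arbitrary ways to slice the sample
    for g in range(n_subgroups):
        mask = groups == g
        if mask.sum() < 10:                       # skip slices too small to test
            continue
        _, p = stats.pearsonr(x[mask], y[mask])
        if p < 0.05:                              # stop as soon as something "works"
            hacked_hits += 1
            break

print(f"False-positive rate with subgroup fishing: {hacked_hits / n_studies:.2f}")
# Far above 0.05, which is why the observed spike of published results
# sitting right at the threshold reads as a red flag for manipulation.
```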

Any sound version of the scientific method makes it very clear that manipulating datasets and models to achieve p = 0.05 undermines the conduct of science and renders the reported results worthless. Unfortunately, this means that a careful review of papers published in the cherished journals of any social science field, including public administration, will conclude that many are severely compromised by researchers manipulating their data and models.

So why have the systems in place that seek to prevent academic fraud and dishonesty failed? Here, in brief, are ideas to be discussed at greater length through this series:

  • The “publish or perish” pressures on academic researchers have increased to the point that they undermine integrity.
  • Peer review has essentially collapsed as a means for preventing academic fraud and dishonesty.
  • Researchers in the social sciences barely understand the rudiments of science but are well instructed in grinding out publishable statistics from quantitative datasets.
  • Journals have created incentives for academic fraud and dishonesty through their widespread unwillingness to publish negative findings.

The good news (for science at least if not for scientists) is that there are solutions to how research is published that may go a long way to restoring the integrity of the process. I will discuss those near the end of this series.

I begin with the first point above regarding the pressures on academic researchers. When I entered my PhD program in Government at the University of Texas at Austin in the fall of 1985, the faculty informed the incoming doctoral students that we were going to experience a wide-open academic job market the likes of which had not been seen in decades. The putative source of this demand for our services was the pending retirement of a large faculty cohort hired immediately after World War II. Well, nothing of the sort happened. In fact, just as I was entering the academic job market in 1990, the market began a dramatic collapse that continues to unfold to this day. Two related trends represent this change: the elimination of tenure-track jobs and the widespread use of adjunct faculty as a substitute.

Several developments caused the collapse of traditional academic employment, including the legal changes that made mandatory retirement of tenured faculty illegal in the United States, sharp reductions in state government funding for public universities, the rise of voraciously competitive for-profit universities, and the federal government's decision to greatly expand borrowing to pay for college under the subsidized loan program. You may not perceive a direct line between these developments and academic fraud and dishonesty. This series will draw such lines for you in sharp relief. Stay tuned!


Author: Erik Devereux is a consultant to nonprofits and higher education and is an executive-in-residence at Hood College in Frederick, Maryland. He has a B.S. from the Massachusetts Institute of Technology (Political Science, 1985) and a Ph.D. from the University of Texas at Austin (Government, 1993). He is the author of Methods of Policy Analysis: Creating, Deploying, and Assessing Theories of Change (available for free here). Email: [email protected]. Twitter: @eadevereux.
