The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.
By Craig Orgeron
February 28, 2017
The well-worn line – Water, water everywhere, / Nor any drop to drink – from Coleridge’s 18th century poem, “The Rime of the Ancient Mariner,” brings into full view the quandary of leveraging data to shed light on complex public policy challenges. The rapid, exponential growth of information in the public sector, paired with scant analytical tools and skills, challenges governments to identify and solve real-world, often intractable policy problems. Data, unused and unsuited for analysis, surrounds public entities. The vital importance of information sharing, data governance and predictive analytics is regularly highlighted, as in the recently published “Top Ten Priorities for 2017” by the National Association of State Chief Information Officers (NASCIO), which placed data management and analytics seventh on the list; the topic first joined NASCIO’s “Top Ten” in 2015.
In Daniel Kahneman’s recent book, Thinking, Fast and Slow, the Israeli-American psychologist details a lifetime of work on the psychology of judgment and decisionmaking. Kahneman points to the irrational nature of many judgments and decisions, even when information for statistical analysis is readily available. Of particular note is Kahneman’s reference to the availability cascade, a concept first developed by Timur Kuran and Cass Sunstein: a self-perpetuating and often escalating narrative that can complicate valid debate in the public policy arena simply through the increasing prominence of a topic in public discourse. This cascade skews policy debates and often trumps reliance on thoughtful data analysis.
Kahneman’s text further points to biased judgments as an imperfect tool for human decisionmaking. For public policymakers, frugal times often demand tools that squeeze efficiencies from an already taxed bureaucracy. In the public sector, doing more with less may be the new operational norm. Per the National Association of State Budget Officers (NASBO), both federal and state coffers saw anemic growth in fiscal year 2016 compared to previous years. With low growth, the pressure to deliver faster, better services to citizens, as well as robust policy solutions, continues to increase. Therein lies the expanding potential of investing in and pursuing a data analytics strategy. NASCIO noted in its 2015 publication, “Data: The Lifeblood of State Government,” that 35 percent of state governments self-reported pursuing big data initiatives. Also of import in the NASCIO report is the recognition that the explosive growth in data creation is directly tied to the rise of the Internet of Things, unmanned aerial systems and body-worn cameras.
Given that many states have launched big data initiatives, resistance to information sharing can create an intractable inertia. In a follow-up 2016 report, NASCIO describes how opposition to early adoption of big data initiatives can slow, or stall, progress. Nevertheless, Stephen Goldsmith cites a Missouri data analytics initiative as improving service and reducing cost. Making the point that data sharing arrangements can be problematic, Professor Goldsmith references a 2014 project in Indiana focused on reducing infant mortality. The Indiana effort, often cited and well regarded, encountered initial legal difficulty in the release and combination of disparate data sets. Once the preliminary legal concerns were successfully negotiated, the project produced definitive strategies to mitigate risks for infant mortality. In a similar vein, Susan K. Urahn, executive vice president and chief program officer for the Pew Charitable Trusts, is leading an effort to uncover better methods of identifying, collecting and fully utilizing data to inform policy decisionmaking.
As both Professor Goldsmith’s examples and the work in progress at Pew acknowledge, sharing information across jurisdictions is difficult; the primary aim, however, is to validate reliance on data analytics, rather than intuition, in policy debates. As Kahneman notes in Thinking, Fast and Slow, reliance on intuition leads to faulty decisionmaking, a conclusion similar to Dan Ariely’s in Predictably Irrational. Decisionmaking in the public sector, often emotionally driven, pays negligible attention to robust analysis, likely a result of the very norms of political ethos. Politics may have much in common with baseball, long filled with ritual, hunches and gut feelings as the apparatus of decisionmaking. The buried biases that defined baseball are uncovered in Michael Lewis’s 2003 book, Moneyball. Lewis’s latest effort, the newly released The Undoing Project, centers on the decisionmaking process and brings the conversation full circle, highlighting the life’s work of two Israeli psychologists, Daniel Kahneman and Amos Tversky, pioneers in the science of decisionmaking.
As the Missouri and Indiana solutions cited by Professor Goldsmith highlight, the collaborative work of securing pertinent data for analysis is a challenge in the public sector. Equally challenging is forgoing reliance on educated guesses, as Billy Beane, general manager of the Oakland Athletics, did in pivoting to data and statistics to build a winning baseball team. Conceivably, government could forge norms to avoid drowning in data and instead harness models to share the information critical to solving the vexing policy issues of the day – perhaps a suitably intriguing next topic for a well-crafted treatment by Michael Lewis.
Author: Craig P. Orgeron, Ph.D., CPM, Executive Director, MS Department of Information Technology Services