The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.
By Bill Brantley
October 6, 2015
Big data and data analytics have been widely embraced in public administration. Many public administration programs have added more data analysis classes with some schools offering concentrations, certifications or degrees in data science. Arizona State University has formed a policy informatics network for researchers and practitioners. The White House, through the Evidence and Evaluation function of the Office of Management and Budget, requires agencies to create budget requests and manage programs based on rigorous data analysis. Both the Partnership for Public Service and the IBM Center for the Business of Government regularly produce publications on how to use data science to better manage public agencies and programs.
This is not to say that data analytics is new to public administration. Good policy analysis has always relied on data and statistical analysis to understand policy issues and formulate appropriate policy responses. What has changed is the ability to collect data. Collecting data in the past could be very expensive, time-consuming and limited to the endurance of the often-human research subjects. Today, agencies have access to vast amounts of data created by the clicks and comments of millions of online users. This data deluge will soon be overwhelmed by the larger wave of data coming from the devices connected to the Internet of Things.
The analytical tools have also become more powerful with an increase in raw computing power and new data science techniques from big data researchers. Data can be crunched, sliced, diced and visualized with point-and-click ease. There are many free, open-source tools for analyzing and visualizing data, along with even more free training resources suitable for even the most novice user. Never before has it been easier to use data to make good, evidence-based decisions. Even so, what is missing? What can turn all this data and analysis into something useful?
The federal government collects a great amount of workforce data on its two-million-plus civilian employees. There is transactional data, which details who works where, for how long and for what pay, along with work history, educational history and training information. There is the annual Federal Employee Viewpoint Survey, which captures employee perceptions of their work, managers and agency. The last seven years of my career have had me immersed in federal workforce data, creating reports and visualizations that are snapshots of the current state of the civilian workforce.
Even so, what is known about the federal workforce? There have been studies on the best places to work in the federal government, attitudes about leadership and levels of employee engagement in the various agencies. Some agencies have acted on these findings. After low employee engagement scores revealed problems in its work environment, the Federal Deposit Insurance Corporation (FDIC) created a program to develop trust among employees. The insights gained from that analysis allowed the FDIC to build a strategy that eventually moved the agency into the top three places to work in the federal government. The FDIC used data and analysis to generate insights upon which to build a strategy. Insights are the real value of data analytics.
Marco Vriens in his book, The Insights Advantage, defines insights as “[t]houghts, facts, data, or analysis of facts and data that induce meaning and further understanding of a business challenge and create an urgency to act or rethink a business challenge in terms of its problems or solutions.” He explains how the proper management of insights leads to five advantages: avoiding mistakes, early warning signals, increased efficiency, growth and competitive advantage. Vriens advocates implementing a formal insights management process to capture, retain and disseminate insights created by the data analytics activities.
The first step is to define the problem area with the next step being to determine what is already known. Then, data analysis projects are devised with clearly specified objectives. The research from the data analysis is synthesized, interpreted and reported to decision makers. To this process, Vriens adds a unique twist: the insights database.
Realizing that it is rare for the right insight to line up with the right need at the right time, Vriens suggests building a database of insights composed of three levels. The foundational level is a complete description of the insight and the methodology behind its discovery. The second level is a summary of the insight and the problem areas to which it could apply. The third level consists of one- or two-sentence highlights reported to decision makers on a periodic basis, so that decision makers stay aware of the insights and their potential value.
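To make the three-level structure concrete, it could be modeled as a simple record type, with the periodic report drawing only on the top level. This is a hypothetical sketch of Vriens's idea, not an implementation from the book; the field names, example insight and `periodic_report` helper are all illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Insight:
    """One record in a hypothetical three-level insights database."""
    # Level 1 (foundation): full description and the methodology behind it
    full_description: str
    methodology: str
    # Level 2: summary plus the problem areas to which it could apply
    summary: str
    problem_areas: list = field(default_factory=list)
    # Level 3: a one- or two-sentence highlight for periodic reporting
    highlight: str = ""

def periodic_report(insights):
    """Collect the level-3 highlights for a briefing to decision makers."""
    return [i.highlight for i in insights if i.highlight]

# Illustrative example: an insight drawn from engagement-survey analysis
example = Insight(
    full_description="Engagement scores track trust in direct supervisors...",
    methodology="Analysis of annual Federal Employee Viewpoint Survey results",
    summary="Supervisor trust drives engagement.",
    problem_areas=["workforce retention", "leadership development"],
    highlight="Improving supervisor trust is a strong lever on engagement.",
)
print(periodic_report([example]))
```

Keeping the full methodology at the foundational level is what lets a future analyst judge whether an old insight still applies to a new problem, while decision makers only ever see the short highlights.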
Governments produce an amazing amount of information and have incredible analytical talent. What is needed are defined management processes to collect, evaluate and disseminate these insights on a regular basis for the maximum impact on all levels of government. Not just information sharing; insight sharing.
Author: Bill Brantley teaches at the University of Maryland (College Park) and the University of Louisville. He also works as a Federal employee for the U.S. Patent and Trademark Office. All opinions are his own and do not reflect the opinions of his employers. He can be reached at http://about.me/bbrantley.