Beyond Admiring the Problem: The Importance of Data Use

The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.

By Anita Larson
April 24, 2015

Since the introduction of new public management and the Clinton administration’s National Performance Review in the early 1990s, it has become commonplace for publicly funded programs to collect and report data, typically in the form of metrics such as the number of people served, where they live and other attributes.

In response to communities’ desire to understand the outcomes of public investments in programs, the late 1990s and early 2000s brought accountability-style data reports that went beyond describing the target population served. Expectations were that reports show “changed lives” (or filled potholes). Likewise, philanthropy moved away from funding good intentions to expecting agencies to show not only whom they served, but whether they had made a difference. This trend was also taken up by federal and state agency grant-makers as they funded large-scale evaluations for initiatives like Temporary Assistance for Needy Families (TANF), the Juvenile Detention Alternatives Initiative (JDAI) and many family home visiting models.


To most public sector staff and leaders, these performance measurement concepts are not new. Report cards, benchmarking reports and other forms of performance reporting – now made more accessible to the public thanks to the Internet – are commonplace at all levels of government. But when they repeatedly show the same results, reports become just another way to admire the problem. This frustrates politicians and community members and also lowers staff morale.

Most agencies do not have a choice about whether to report their program data. But what if these reports provided the foundation for something more meaningful, supporting internal work that can make a real difference?

In my work in planning, evaluation and guiding internal data use, I have found the following three simple steps foundational and applicable to agencies of any size. These steps help move programs toward a more mature data-informed culture that can begin making tangible differences in program outcomes.

Model Your Work and Add Measures That Make Sense (to You)

Every funder requires reports. Sometimes the reports seem incomplete or superficial, or reveal only one view of an agency’s performance or process. Agencies should always explore other, more meaningful ways to measure their own performance. Ask yourself (and your staff), “How do we know when we’ve done well?” If you are not sure (and this happens), consider creating a logic model of your program or services and identifying the short- and long-term changes (or outcomes) you expect as a result of your work. This can begin to clarify what you might measure that is more meaningful and actionable.

Data Quality

We all have data around our offices, on our servers and on our desktops. But is it good data? Take time to ensure that your data are high quality. For example, if you need to mail information to program participants (or do geographic analysis of your clients), consider validating addresses for accuracy. Consider making cross-checks between paper records and the electronic file where you store data. These relatively simple steps also extend to the quality of more complex data, such as multi-domain assessment scores for children or psycho-social evaluation data on offenders in corrections systems.
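For agencies that keep client records in an electronic file, even a small script can automate checks like these. The sketch below is only illustrative – it assumes a hypothetical CSV roster with name, ZIP code and enrollment date fields, and the validation rules are examples rather than the requirements of any particular system.

import csv
import re
from datetime import datetime

def check_record(row):
    """Return a list of data-quality problems found in one record."""
    problems = []
    if not row.get("name", "").strip():
        problems.append("missing name")
    # U.S. ZIP codes: five digits, optionally followed by a four-digit extension
    if not re.fullmatch(r"\d{5}(-\d{4})?", row.get("zip_code", "")):
        problems.append("invalid ZIP code")
    try:
        datetime.strptime(row.get("enrollment_date", ""), "%Y-%m-%d")
    except ValueError:
        problems.append("invalid or missing enrollment date")
    return problems

def audit(path):
    """Print a brief data-quality report for a client roster file."""
    with open(path, newline="") as f:
        for line_number, row in enumerate(csv.DictReader(f), start=2):  # row 1 is the header
            problems = check_record(row)
            if problems:
                print(f"Row {line_number}: {', '.join(problems)}")

if __name__ == "__main__":
    audit("client_roster.csv")  # hypothetical file name

Flagged rows can then be compared against the paper record before the next report is run.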

Internal Use

One of the more powerful steps a program can take is to identify the data that can be useful to staff on a regular basis. This is the type of data that helps staff make adjustments to practice, correct errors, shorten timelines or explore new solutions – before the next reporting period. It is this step that can make the difference in changing the trajectories of the most challenging problems – not only because the agency is not waiting around until next year’s data become available, but because this step taps the practice wisdom and insights of direct service staff. This step, like the logic model, can help clarify the program’s reasonable contribution to addressing the problem, which can also guide outcome measurement.

One of the most common frameworks for this activity is the Plan-Do-Study-Act (PDSA) cycle. As a leader, you can involve staff in identifying the metrics they want to use each week, month or quarter to improve their performance. Doing so not only conveys the message that you respect their expertise, but also secures their buy-in, with benefits to morale, particularly if data are not used against them. Examples from higher education, youth programming, and early care illustrate how programs create a data-informed culture and how leadership matters.
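The metric itself need not be complicated. As a purely illustrative sketch (the measure, cadence and dates here are hypothetical, not drawn from any of the programs mentioned), a team might review something as simple as the median number of days from referral to first service contact at each monthly meeting:

from datetime import date
from statistics import median

# Illustrative (referral date, first service contact date) pairs for one month
cases = [
    (date(2015, 3, 2), date(2015, 3, 9)),
    (date(2015, 3, 5), date(2015, 3, 20)),
    (date(2015, 3, 11), date(2015, 3, 16)),
]

days_to_service = [(served - referred).days for referred, served in cases]
print(f"Median days from referral to first service: {median(days_to_service)}")

A number like this, reviewed with the staff who chose it, becomes the “Study” step of the cycle and points directly to the next “Act.”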

Using data more effectively internally helps to address persistent problems, whether it is lowering rates of homelessness, narrowing achievement gaps or increasing vaccination rates. Buffalo, New York’s Clean Sweep program shows how data-informed work, carried out in a community and across multiple service domains, can ultimately have meaningful, lasting and positive impacts on well-being and health.


Author: Anita M. Larson, DPA, serves as part-time faculty in the public administration program at Hamline University. Email: [email protected].
