
How Causal Artificial Intelligence Can Aid Public Policy Decision Making

The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.

By Bill Brantley
December 1, 2023

It was a wild weekend in the artificial intelligence (AI) world, starting on November 17, 2023, when the board of OpenAI (the company behind ChatGPT) forced out the chief executive officer (CEO), Sam Altman. Five days later, Altman was back as CEO of OpenAI with a new board. Why did the previous board fire him?

Some reports center on an OpenAI project called Q* (pronounced Q-star), an AI model that can perform simple math. Current generative AI models can only work with the data they are trained on, while Q* can reason about abstract concepts.

“If it [AI] has the ability to logically reason and reason about abstract concepts, which right now is what it really struggles with, that’s a pretty tremendous leap,” said Charles Higgins, a cofounder of the AI-training startup Tromero who is also a Ph.D. candidate in AI safety. He added, “Maths is about symbolically reasoning — saying, for example, ‘If X is bigger than Y and Y is bigger than Z, then X is bigger than Z.’ Language models traditionally really struggle at that because they don’t logically reason, they just have what are effectively intuitions.”

The tremendous leap of Q* was alarming enough that several OpenAI researchers raised concerns about its potential to threaten humanity.

Causal Inference and Causal AI

When I heard about Q*, I thought the breakthrough involved causal AI. Causal AI models rely on causal inference. Causal inference is a methodological approach to determine whether a specific cause-and-effect relationship exists between variables. It goes beyond mere association or correlation, aiming to identify whether a change in one variable (cause) directly produces a change in another (effect).

In essence, causal inference tries to answer the question: “What would have happened to the outcome if the cause had been different?” Causal inference requires distinguishing between causation and mere correlation. Methods like randomized controlled trials (RCTs), observational studies and statistical models like regression analysis are often employed to infer causality. However, in many real-world scenarios, especially in public policy, RCTs are not practical or ethical, thus necessitating alternative causal inference methods.

Causal inference methods are used in public administration for policy analysis and evaluation. For example, counterfactual analysis involves considering what would have happened without the policy or intervention. It’s a conceptual approach that compares the actual outcome with a hypothetical scenario. Another approach is regression discontinuity design, used when policies are implemented based on a cutoff. For example, if a scholarship is awarded to students with grades above a certain threshold, the outcomes of students just above and below this threshold can be compared.
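The regression discontinuity intuition can be sketched in a few lines of Python. Everything here is invented for illustration — the cutoff of 80, the sample size and the “true” scholarship effect of +5 are all assumptions, not data from any real program:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 1,000 students' grades and a later outcome score.
# Assumption: the scholarship is awarded at a grade cutoff of 80.
grades = rng.uniform(50, 100, 1_000)
scholarship = grades >= 80
# Simulated outcome: a smooth trend in grades plus a true +5 scholarship effect.
outcome = 20 + 0.3 * grades + 5 * scholarship + rng.normal(0, 2, 1_000)

# Sharp regression discontinuity intuition: students just below and just
# above the cutoff are nearly identical, so comparing average outcomes in
# a narrow band around the cutoff isolates the scholarship's effect.
band = 3
just_below = outcome[(grades >= 80 - band) & (grades < 80)]
just_above = outcome[(grades >= 80) & (grades < 80 + band)]
effect = just_above.mean() - just_below.mean()
print(f"Estimated scholarship effect near the cutoff: {effect:.2f}")
```

Real analyses fit local regressions on each side of the cutoff rather than taking simple band averages, which removes the small upward bias the underlying grade trend introduces in this naive version.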

The Power of Counterfactuals in Public Administration

Counterfactuals play a significant role in various aspects of public administration. For instance, analyzing “what-if” scenarios helps evaluate the potential impacts of policy decisions and strategize for future actions (Roese, 1999). Counterfactual thoughts can influence behavior through content-specific pathways, playing a role in regulating behavior and improving performance (Epstude & Roese, 2008). This aspect can be significant in public administration for enhancing organizational efficiency and decision-making quality.

In public health, a key component of public administration, counterfactual frameworks and statistical methods are used to develop interventions to reduce disparities. These methods clarify scientific questions and guide analyses in public health research (Glymour & Spiegelman, 2017). Counterfactual analysis is also used in historical studies to infer alternative outcomes of policy decisions, which can provide valuable insights for current and future policy-making processes (Sylvan & Majeski, 1998).

The Great Potential of an AI That Can Reason About the Future

Causal AI is artificial intelligence that focuses on understanding and using cause-and-effect relationships to make decisions or predictions. Unlike traditional AI, which often relies on spotting patterns in data, causal AI tries to understand how different elements or events influence each other. This understanding allows it to make more informed and accurate predictions or decisions. Causal AI models are built by modeling a system’s cause-and-effect relationships and then analyzing them with do-calculus, a formal method created by Dr. Judea Pearl.
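The difference between merely observing a pattern and intervening on a cause — the distinction at the heart of do-calculus — can be shown with a toy simulation. The model structure and every coefficient below are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assumed structural causal model (all relationships are made up):
#   confounder Z -> treatment X, Z -> outcome Y, and X -> Y with effect +2.
z = rng.normal(0, 1, n)
x = (z + rng.normal(0, 1, n) > 0).astype(float)   # treatment influenced by Z
y = 2 * x + 3 * z + rng.normal(0, 1, n)           # true causal effect of X is +2

# Observational contrast E[Y|X=1] - E[Y|X=0]: inflated, because high-Z
# units are both more likely to be treated and have higher outcomes.
observational = y[x == 1].mean() - y[x == 0].mean()

# Interventional contrast E[Y|do(X=1)] - E[Y|do(X=0)]: we set X ourselves,
# cutting the Z -> X arrow, so only the true effect of X remains.
y_do1 = 2 * 1 + 3 * z + rng.normal(0, 1, n)
y_do0 = 2 * 0 + 3 * z + rng.normal(0, 1, n)
interventional = y_do1.mean() - y_do0.mean()

print(f"Observational difference:      {observational:.2f}")
print(f"Interventional (do) difference: {interventional:.2f}")
```

The observational gap badly overstates the effect, while the do-intervention recovers the true +2 — which is exactly why pattern-spotting AI and causal AI can give different answers to the same policy question.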

Counterfactuals are an essential part of causal AI. Counterfactuals are essentially “what-if” scenarios. They allow the AI to consider what would happen if certain conditions were different, even if they haven’t occurred. For example, a causal AI might use counterfactual reasoning to answer questions like, “What would have happened to sales if we had reduced the price of this product?” Essentially, causal AI can produce multiple scenarios for the future. How detailed the future scenarios are depends on the complexity of the causal model.
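Pearl’s three-step recipe for counterfactuals (abduction, action, prediction) can be sketched with the sales-and-price question above. The linear demand model and all its numbers are invented for illustration:

```python
# Assumed model: sales = 500 - 20 * price + u, where u captures this
# unit's unobserved demand factors (loyal customers, local events, etc.).

def counterfactual_sales(observed_price, observed_sales, new_price):
    """What would sales have been at new_price, for this same unit?"""
    # 1. Abduction: infer the unobserved factor u from what actually happened.
    u = observed_sales - (500 - 20 * observed_price)
    # 2. Action: intervene by setting price to the counterfactual value.
    # 3. Prediction: re-run the model with the same u held fixed.
    return 500 - 20 * new_price + u

# We actually charged $10 and sold 320 units (so u = +20 for this unit).
print(counterfactual_sales(10, 320, 8))  # -> 360.0 units at the lower price
```

The key move is step 1: the counterfactual is about this particular unit, so the AI keeps everything it learned about the unit fixed and changes only the cause in question.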

Humans excel at mental time travel, which is how we use counterfactuals to plan our future actions. If a causal AI model can also mentally time travel, we would have a powerful tool to simulate and optimize the effects of public policy decisions. “Ultimately, knowing the ‘why’ behind complex problems helps us to understand how the world really operates and, in turn, to identify the right actions to achieve desired outcomes. We may yet find that an ounce of causal AI is worth a pound of prediction.”

Author: Dr. Bill Brantley is the President and Chief Learning Officer for BAS2A – an instructional design consultancy for state and local governments. He also teaches at the University of Louisville and the University of Maryland. All opinions are his own and do not reflect the views of his employers. You can reach him at https://www.linkedin.com/in/billbrantley/.

