Abstract

A high-stakes decision requires deep thought to understand the complex factors that keep a situation from worsening. Such decisions are made under high pressure, with incomplete information, and in limited time. This research applies Causal Artificial Intelligence to high-stakes decisions, aiming to encode causal assumptions grounded in human-like intelligence and thereby produce interpretable, argumentative knowledge. We develop a Causal Bayesian Network model based on causal science, using d-separation and do-operations to discover a causal graph aligned with cognitive understanding. Causal odds ratios measure how well the encoded causal assumptions fit real-world data, demonstrating the compatibility of the proposed causal model. Causal effect relationships in the model are verified with causal p-values and causal confidence intervals; the probability that they arise by random chance is below 1%. This shows that the causal model can encode cognitive understanding as precise, robust relationships. The model design allows software agents to imitate human intelligence by inferring potential knowledge, and to be employed in high-stakes decision applications.
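The causal odds ratio, confidence interval, and p-value named above are standard association measures computed from a 2×2 contingency table. As an illustrative sketch only (not the paper's implementation), the function below uses the Wald method with a normal approximation; all names are hypothetical:

```python
import math

def causal_odds_ratio(a, b, c, d):
    """Odds ratio, Wald 95% CI, and two-sided p-value from a 2x2 table:

                    outcome   no outcome
        exposed        a          b
        unexposed      c          d
    """
    if min(a, b, c, d) <= 0:
        raise ValueError("all four cell counts must be positive")
    or_ = (a * d) / (b * c)
    log_or = math.log(or_)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)       # SE of log(OR)
    ci = (math.exp(log_or - 1.96 * se),                 # 95% CI bounds
          math.exp(log_or + 1.96 * se))
    z = abs(log_or) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))  # normal approximation
    return or_, ci, p

# Example: 20/80 exposed vs. 10/90 unexposed gives OR = 2.25
print(causal_odds_ratio(20, 80, 10, 90))
```

An OR far from 1 with a CI excluding 1 and a small p-value is the kind of evidence the abstract describes for validating an encoded causal assumption against data.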

Highlights

  • Critical events are unexpected situations that severely affect citizens, infrastructure, and government

  • This research used Causal AI for high-stakes decision-making, utilizing causal science to encode human-like intelligence

  • Causal encoding based on d-separation and the do-operation was applied to model causal assumptions as Causal Bayesian Networks (CBNs), with the causal OR, causal p-value, and causal CI used to discover causal effects by measuring the common sense behind a graph
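The d-separation criterion mentioned in the highlights can be checked mechanically on a DAG. The sketch below is a generic moral-ancestral-graph test (not code from the paper; the DAG encoding and function names are assumptions): to test whether x and y are independent given z, restrict to ancestors of {x, y} ∪ z, moralize, delete z, and check connectivity.

```python
from itertools import combinations

def _ancestors(dag, nodes):
    """All nodes in `nodes` plus their ancestors; dag maps node -> list of parents."""
    seen, stack = set(nodes), list(nodes)
    while stack:
        for p in dag.get(stack.pop(), []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def d_separated(dag, x, y, z):
    """True iff x and y are d-separated given set z in the DAG."""
    keep = _ancestors(dag, {x, y} | set(z))
    # Build the undirected moral graph over the ancestral subgraph.
    edges = set()
    for child in keep:
        parents = [p for p in dag.get(child, []) if p in keep]
        edges.update(frozenset((p, child)) for p in parents)
        edges.update(frozenset(pair) for pair in combinations(parents, 2))  # "marry" co-parents
    # Remove conditioning nodes, then test whether x can still reach y.
    blocked, seen, frontier = set(z), {x}, [x]
    while frontier:
        n = frontier.pop()
        if n == y:
            return False  # an active path connects x and y
        for e in edges:
            if n in e:
                other = next(iter(e - {n}))
                if other not in seen and other not in blocked:
                    seen.add(other)
                    frontier.append(other)
    return True

# Chain A -> B -> C: conditioning on B blocks the path.
chain = {'B': ['A'], 'C': ['B']}
print(d_separated(chain, 'A', 'C', {'B'}))  # True
```

Note the collider behavior this test reproduces: in A → C ← B, A and B are d-separated marginally but become dependent once C is conditioned on.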

INTRODUCTION

Critical events are unexpected situations that severely affect citizens (e.g., by causing serious injury or death), infrastructure (e.g., via transportation damage or communications failure), and government (e.g., with economic crises or financial loss). Formosa [6] proposed an approach for traffic conflicts using proactive safety management strategies, while Anbarasan et al. [7] introduced a technique for high-stakes events during flood disasters. Both achieve high accuracy for better decision-making, but current deep learning focuses on detection and explanation performance rather than on supporting high-stakes decisions. Causal AI lets machine learning describe the cognitive reasons for predicted output based on human-like interpretations [9]. It aims to produce reasons for "why" and "how" events happen given current evidence, regardless of outcomes, and so synthesizes plausible arguments and interpretations that decision-makers can utilize. Critical event interpretation should take advantage of Causal AI-based machine learning to produce practical knowledge for high-stakes decisions. This requires causal knowledge produced by human-like intelligent agents, which will help interpret events that may critically influence the future.

HIGH-STAKES DECISION MAKING
BAYESIAN NETWORKS
CRITICAL THINKING
CAUSAL BAYESIAN NETWORKS FOR HIGH-STAKES DECISIONS
CAUSAL QUESTIONS BASED ON CONCLUSIONS FOR HIGH-STAKES DECISIONS
THE CAUSAL CONCEPT FOR HIGH-STAKES DECISION
CAUSAL MACHINE LEARNING
CAUSAL ENCODING FOR HIGH-STAKES DECISIONS
CASE STUDY
CAUSAL EFFECT MODELLING
DATA PREPROCESSING
A TWEET USING OUR APPROACH
EXPERIMENT SETUP
MEASUREMENT METRICS
DATASET
RESULTS
DISCUSSIONS
CONCLUSIONS
