Research tutorial

Evidential Reasoning and Learning

Date: 24 July 2022, 15:30

Lecturers: Federico Cerutti, Lance M. Kaplan

Location: Vienna, Austria, Messe Wien Exhibition and Congress Center, Lehar 1


To cite the content of this tutorial, please refer to: Federico Cerutti, Lance Kaplan, and Murat Sensoy, "Evidential Reasoning and Learning: a Survey", IJCAI 2022.

Short Description of the Tutorial

When collaborating with an AI system, we need to assess when to trust its recommendations. If we mistakenly trust it in regions where it is likely to err, catastrophic failures may occur; hence the need for Bayesian approaches to reasoning and learning that quantify the confidence, or epistemic uncertainty, in the probabilities of the queried outcome. Pure Bayesian methods, however, suffer from high computational costs, so the tutorial introduces efficient approximations based on updating hypotheses as further evidence is collected.

The tutorial gives PhD students and early-stage researchers a gentle introduction to evidential reasoning and learning, surveying current research outcomes and the open questions that remain unresolved.

Description of the Tutorial

The session starts by distinguishing between two major sources of uncertainty: aleatory uncertainty, tied to inherent randomness in a process, and epistemic uncertainty, tied to the model user's lack of knowledge and therefore reducible with further data. This distinction is then used to frame reasoning and learning problems where both kinds of uncertainty matter.
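The distinction above can be seen in a small numeric sketch. The setup is assumed for illustration: we observe a biased coin and track a Beta posterior over its bias. The coin's inherent randomness (aleatory uncertainty) remains no matter how much data we collect, while the spread of the posterior (epistemic uncertainty) shrinks as observations accumulate.

```python
# Illustrative sketch (assumed setup): epistemic uncertainty about a
# coin's bias, measured as the standard deviation of a Beta posterior.
import math

def beta_std(alpha, beta):
    """Standard deviation of a Beta(alpha, beta) distribution."""
    n = alpha + beta
    return math.sqrt(alpha * beta / (n * n * (n + 1)))

# Posterior after 6 heads / 4 tails vs. 600 / 400, from a uniform Beta(1, 1) prior.
small = beta_std(1 + 6, 1 + 4)      # few observations: wide posterior
large = beta_std(1 + 600, 1 + 400)  # many observations: narrow posterior

print(f"epistemic std, n=10:   {small:.3f}")
print(f"epistemic std, n=1000: {large:.3f}")
# The estimated bias stays near 0.6 (the aleatory randomness does not go
# away), while the posterior spread shrinks roughly as 1/sqrt(n).
```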

The main technical focus is on uncertain probabilities represented by beta or Dirichlet distributions. Unlike surveys that only cover epistemic uncertainty in deep learning, this tutorial addresses reasoning in the presence of uncertainty as well as learning from complete and partial data. It uses probabilistic circuits as a unifying framework, discusses how to quantify uncertainty over their parameters, and closes by examining how uncertain probabilities can be elicited from experts or learned from raw data.

Detailed Outline

A Primer in Bayesian Statistics

  • Fundamentals of statistics and Bayes.
  • Beta and Dirichlet distributions as uncertain probabilities.

Evidential Reasoning

  • From logic to probabilistic circuits.
  • Probabilistic circuits as a unifying method for probabilistic reasoning.
  • Probabilistic circuits with uncertain probabilities.

Evidential Parameter Learning

  • Learning with complete observations.
  • Learning with partial observations: preliminary proposals and discussions.
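For the complete-observation case, evidential parameter learning reduces to counting: under a Dirichlet prior, the posterior is obtained by adding observed counts to the prior pseudo-counts. The categories and data below are invented for illustration.

```python
# Sketch of evidential parameter learning with complete data: the
# Dirichlet posterior over a categorical parameter is the prior
# pseudo-counts plus the observed counts. Data are invented.
from collections import Counter

def dirichlet_posterior(prior, observations):
    """Add category counts to Dirichlet pseudo-counts."""
    counts = Counter(observations)
    return {k: prior[k] + counts.get(k, 0) for k in prior}

prior = {"red": 1.0, "green": 1.0, "blue": 1.0}      # uniform Dirichlet(1, 1, 1)
data = ["red", "red", "blue", "red", "green", "red"]
post = dirichlet_posterior(prior, data)
total = sum(post.values())
means = {k: v / total for k, v in post.items()}
print(post)   # {'red': 5.0, 'green': 2.0, 'blue': 2.0}
print(means)  # expected probabilities under the posterior
```

With partial observations no such closed form exists, which is why that bullet is flagged as preliminary proposals and discussions.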

Ascertaining Evidence from the Real World

  • Intelligence analysis and uncertainty.
  • Evidential Deep Learning.
  • Alternative proposals.
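The Evidential Deep Learning bullet can be sketched with the subjective-logic quantities used in that line of work (surveyed in the cited paper): a network outputs non-negative evidence per class, interpreted as a Dirichlet distribution, from which belief masses and an uncertainty mass follow in closed form. The evidence values below are made up.

```python
# Sketch of subjective-logic opinions in Evidential Deep Learning:
# per-class evidence e_k defines Dirichlet(e_k + 1); beliefs and the
# uncertainty mass follow in closed form. Evidence values are invented.

def edl_opinion(evidence):
    """Return (belief masses, uncertainty mass) from per-class evidence."""
    K = len(evidence)
    alphas = [e + 1.0 for e in evidence]   # Dirichlet parameters
    S = sum(alphas)                        # Dirichlet strength
    beliefs = [e / S for e in evidence]
    uncertainty = K / S                    # vacuous (uncommitted) mass
    return beliefs, uncertainty

# Confident prediction: plenty of evidence for class 0.
b_conf, u_conf = edl_opinion([18.0, 1.0, 1.0])
# No evidence at all: the opinion is fully vacuous.
b_vac, u_vac = edl_opinion([0.0, 0.0, 0.0])
print(u_conf, u_vac)  # uncertainty shrinks as total evidence grows
```

Belief masses and uncertainty sum to one, so the uncertainty mass directly measures how much the model refuses to commit, which is the quantity of interest when deciding whether to trust a recommendation.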

Summary and Conclusion