15 - 16 September 2021

Live Online Training (via Zoom)

Bayesian Updating (Diagnostic Theory-Based Evaluation)

The UK Evaluation Society is pleased to be working with CECAN Ltd to deliver this one-and-a-half-day Bayesian Updating Masterclass.


Tutor:
Dr Barbara Befani, CECAN Research Fellow, has been developing evaluation methods for 15 years. Her interests include 1) evaluation quality; 2) methodological appropriateness and the comparative strengths and weaknesses of different evaluation methods; 3) causal inference frameworks for impact evaluation; and 4) specific hybrid qualitative-quantitative methodologies, such as Qualitative Comparative Analysis (QCA) and Process Tracing (in particular its Bayesian formalisation, which she is extending to all forms of Theory-Based Evaluation).

Course Details:
This course covers the theory and practice of diagnostic evaluation and Bayesian Updating. It addresses the method's epistemological and mathematical basis, its appropriateness and its strengths and weaknesses in comparison with other methods, and its application steps. The method focuses on the empirical testing of propositions, including theories, causal statements and explanations; its application steps are addressed in three modules, each with dedicated group work / exercises:
1) The formulation of theories / propositions and their implications for empirical testing;
2) The design of data collection and the ranking of observations in terms of probative value and updating direction;
3) The estimation of probabilities for use in the Bayes formula and the updating of confidence in the existence or non-existence of the theories / statements (a worked sketch follows this list).
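
For illustration, here is a minimal sketch in Python of the updating step in point 3, using made-up numbers rather than figures from the course materials: confidence in a theory or claim rises or falls depending on how much more likely the observed evidence is if the theory holds than if it does not.

    # Minimal sketch of the Bayesian Updating step (illustrative numbers only).
    def update_confidence(prior, p_evidence_if_true, p_evidence_if_false):
        """Posterior confidence in a theory after observing one piece of evidence.

        prior               -- confidence in the theory before the observation
        p_evidence_if_true  -- probability of the observation if the theory holds ("sensitivity")
        p_evidence_if_false -- probability of the observation if the theory does not hold ("type I error")
        """
        numerator = p_evidence_if_true * prior
        return numerator / (numerator + p_evidence_if_false * (1 - prior))

    # A sceptical prior (0.5) and an observation far more likely under the
    # theory (0.8) than without it (0.1), i.e. high probative value:
    print(round(update_confidence(0.5, 0.8, 0.1), 2))  # 0.89

The ratio of the two evidence probabilities is what gives an observation its probative value: the further it is from 1, the more a single observation can shift confidence, in either direction.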

The course includes lectures and case illustrations with prepared examples, as well as more interactive discussion of evaluations that participants are familiar with.

Plan of the two days:
  Day 1 – Wed 15 Sep (Full Day: 09:30 – 17:00)
    09:30 – 11:00 Overview of Theory-Based Evaluation, Process Tracing, and other methods based on generative causality; Theoretical Introduction to BU / DE (Pt. 1)
    11:00 – 11:20 Break
    11:20 – 12:45 Theoretical Introduction to BU / DE (Pt. 2), presentation of its application steps
    12:45 – 13:45 Lunch
    13:45 – 15:10 Application Step One: Developing Testable Theories (with group work)
    15:10 – 15:30 Break
    15:30 – 17:00 Application Step Two: Designing data collection seeking conclusive tests (Pt. 1) (with group work)
  Day 2 – Thu 16 Sep (Half Day: 09:30 – 12:45)
    09:30 – 11:00 Application Step Two: Designing data collection seeking conclusive tests (Pt. 2) (with group work)
    11:00 – 11:20 Break
    11:20 – 12:45 Application Step Three: Assessing the evidence strength and updating confidence in theory (with group work)

Learning Outcomes

By the end of this course, participants will:

  • Gain a new perspective on Theory-Based Evaluation and what it takes to improve its credibility and reliability
  • Understand the connection between theory development and empirical observations
  • Learn to use the Bayes formula to update confidence in theories and claims
  • Gain an in-depth understanding of Process Tracing

Intended Audience:
Policy analysts, evaluation professionals, doctoral students, and academic researchers.

Pre-reading:
Befani, B. (2020). Quality of quality: A diagnostic approach to qualitative evaluation. Evaluation, 26(3), 333–349. https://doi.org/10.1177/1356389019898223
Befani, B. (2020). Diagnostic evaluation and Bayesian Updating: Practical solutions to common problems. Evaluation, 26(4), 499–515. https://doi.org/10.1177/1356389020958213
Befani, B., & D’Errico, S. (2020). Letting evidence speak for itself: Measuring confidence in mechanisms. In J. Schmitt (Ed.), Causal Mechanisms in Program Evaluation. New Directions for Evaluation, 167, 27–43. https://onlinelibrary.wiley.com/doi/full/10.1002/ev.20420

Fees

£330 for Non-Members of UK Evaluation Society
£270 for Members
£210 for Student Members

To apply for this course, please register here.