10 - 12 September 2024

Online (Zoom), 1300 - 1700 BST

Contribution Analysis (September 2024)

A pragmatic approach to operationalise theory-based evaluation


Course Summary

Contribution Analysis has emerged as a structured approach to addressing the real-world challenges of evaluating the impacts of complex interventions. It consists of a step-wise, iterative process of refining theories of change, using mixed-methods research designs to verify the critical and contested assumptions (‘causal hotspots’). The course explains the approach and introduces you to methods that are often used to assess the contribution of a programme and to reflect on the importance of the change that results from that contribution.



This is an intermediate course that targets relatively experienced independent evaluators, programme implementers, and evaluation commissioners who need to understand or design appropriate mixed-methods evaluations in conditions of complexity.


Learning Outcomes:

Participants will learn how to:

  • Recognise how contribution differs from attribution, and the implications for evaluation design.
  • Choose appropriate methods to select and verify ‘causal hotspots’ in the theory of change.
  • Critically assess the strengths and weaknesses of several methods in order to be able to combine them purposefully in a robust evaluation design.



Giel Ton is a Research Fellow at the Institute of Development Studies and the Centre for Development Impact. He specialises in the design of mixed-methods impact evaluations in private-sector development programmes. He promotes contribution analysis as an overarching approach to theory-based evaluation and a stepwise process to identify the hotspots where additional data collection and reflection are needed. He has a special interest in the effectiveness of programmes that aim to improve governance and coordination in agricultural value chains and empower smallholder farmers in collective action. He has published on the methodological challenges of impact evaluation of development interventions in the journal Evaluation and the IDS Bulletin. He has advanced skills in both qualitative and econometric research.

Drew Koleros is a Senior Researcher at Mathematica in the United States, with expertise in designing and delivering mixed-methods evaluations and programme monitoring, evaluation, and learning systems. He brings particular expertise in using theory-based approaches that integrate complexity concepts and systems thinking into programme and evaluation design processes. Drew has also supported multiple teams in designing strategy-level theories of change to inform their monitoring, evaluation, and learning initiatives. He has published on his work in designing theory-based evaluation approaches in the American Journal of Evaluation, the Canadian Journal of Program Evaluation, and in multiple practitioner settings.


Access Details:

This course is offered fully online. Access details will be provided after the booking window closes at 1000 BST on 9 September.

If you have any alternative support needs in order to participate effectively in this training, please let us know before booking, or as soon as possible after booking, by emailing our team at hello@evaluation.org.uk. We look forward to assisting you in whatever way we can.



Refunds will only be provided for participant cancellations received at least 14 days prior to the event. UKES reserves the right to cancel the event, in which case a full refund would be provided.



Members                      £300 + VAT
Student Members*       £250 + VAT

  • Members have priority booking through 17 June 2024.

Non-Members             £350 + VAT

  • Non-member booking opens on 18 June 2024.

All bookings close at 1000 BST on 9 September 2024.


*Student bursaries available for full-time students at UK institutions. Apply here: https://www.evaluation.org.uk/student-bursaries-for-ukes-training-courses/