About

UK Evaluation Society Annual Conference 2022

 

Re-imagining Evidence and Evaluation – Politics, Context, Challenges


Conference Theme

The practice of evaluation, and the broad, diverse evaluation community, have a role to play in informing responses to these challenges. We need to be adaptable and nimble. We need to navigate the political dimension of using evidence and evaluation in decision-making. The evidence we use needs to be responsive to the environments in which we work, building on new technologies and possibilities as they arise. Evaluative practice needs to be authentic and robust enough to stay relevant as we address the processes and challenges of rapid change. We need to see when change requires adjustment or adaptation, and when it is at the level of transformation.

The UK Evaluation Society Conference 2022 builds on some of these debates and discussions to explore the politics, contexts, capabilities, theories and data the evaluation community uses to respond to these challenges. It also interrogates what more we can do to inform and encourage evidence-based decision-making, in both local and global contexts, on these fast-moving, complex issues.

Welcome!

The UK Evaluation Society Annual Conference is a great place to meet and network with fellow evaluators and those working in the wider evaluation field, to keep updated, and to hear how others have navigated issues similar to yours! Bring your calling cards! Use the different spaces we have created to your advantage! Share your experience and contribute to shaping the future of the evaluation agenda!

HYBRID FORMAT of the 3-Day CONFERENCE

The hybrid format is a first for the Society and offers you a broader range of opportunities to engage with the Conference! We look forward to your feedback and, as ever, to ways to improve. A feedback form will be circulated at the end, and there will be various opportunities along the way to offer your thoughts.

Online Tuesday May 24th and Wednesday 25th

The pandemic has certainly helped to speed up online engagement across the evaluation community. At this 2022 Conference the Society is offering two days of online sessions (Tuesday May 24th and Wednesday May 25th). We hope you can attend these ‘live’ to gain the most benefit and join in the Q&A. Recordings of the presentations will be accessible to Delegates after the Conference, providing a ‘canon’ of information and experience for your reference. This kind of reach was not the norm before the pandemic, when your participation was limited to the sessions you could attend in person.

In-person Thursday May 26th One Birdcage Walk, Westminster, central London

The pandemic and attendant concerns have made it more difficult to meet in person and build the connections we thrive upon and indeed need in order to build our sense of community. We heard the desire to meet; hence the third day (Thursday 26th May) is an in-person day at a central London venue, One Birdcage Walk, Westminster. This is a spacious setting, just off Parliament Square, where Delegates can network with ease. The main sessions will also be live-streamed and recorded. However, we hope you will take the opportunity to come, meet and exchange face to face!

Some SESSIONS to look out for:

IGNITE  – 11:50 – 12:05 Tuesday 24th May
How do you disseminate evaluation learning?
Ignite sessions are participatory, snappy, fast, and fun! Those joining are posed a question, and have 3 minutes to provide their opinions and thoughts as we move around the virtual room. Any leftover time is used to dig deeper into comments others have shared that are of interest. This Ignite session is focused on learning from evaluation, with the key question posed being how you disseminate learning from your evaluation.

FISHBOWL discussion – 14:40 – 15:10 Wednesday 25th May
Experiences of facilitating the development of a theory of change.  This session is an open, discursive, facilitated conversation.
Join us to discuss this hot topic of facilitating theory of change and share your experiences, whether you have done this with large numbers of people, under tough time constraints, or where there is little shared vision between stakeholders, or indeed where it has gone smoothly and productively!

HOW DID I GET HERE? During breaks on Thursday 26th May

The career path of an evaluator is often varied and rich in different roles in which we grow our expertise. We have all had points when we have considered a career change, or, when newer to the field, wondered how to become the next leader in evaluation. How Did I Get Here? is a free-flow opportunity to speak with members of the UKES Council whose careers have taken interesting pathways, and to learn more about the choices they made and how they arrived at their current positions.

BIRDS of a FEATHER Lunch Break Thursday May 26th
‘How can Evaluation develop a ‘voice’ on national and/or global challenges?’
How would this be done? What would we seek to say?
‘Birds of a Feather’ brings us together to informally share thoughts on this key topic.

Keynote Speaker

Professor Ruth Boaden

Independent Adviser on Evaluation, Greater Manchester Health and Social Care Partnership
Honorary Professor, Alliance Manchester Business School

May 26th One Birdcage Walk Westminster, London

Also, we are delighted to welcome:

Baroness Helena Kennedy QC will be In Conversation with Professor Elliot Stern at the UK Evaluation Society Conference, May 26th, One Birdcage Walk, Westminster, London.


Further details on Round Table Exchange

Evaluative Practice Fit for the Future – is Evaluation Ready for the Data Revolution?

Thursday May 26th 0915 – 1030 BST. Moderator: Professor Murray Saunders

Introduction to UK Evaluation Society Round Table Exchange Series

The Round Table Exchange (RTE) at the Conference is the third in a series of exchanges offered by UKES; blogs from RTE 1, on the changing evaluation context, and RTE 2, on commissioning, resourcing and procuring evaluation, are available on the UKES website (www.evaluation.org.uk). Please contribute to the debate! RTEs are intended to provoke and encourage debate on important aspects of evaluative practice as we look to the future. The RTE series aims to provide an opportunity for individuals involved in evaluative practice from diverse standpoints to share and exchange their views with each other and with a broader virtual audience. RTEs also provide resources for onward debate through a joint blog published on the UKES website shortly after the event.

This RTE complements other aspects of the UKES Conference theme on re-imagining evidence. This focus is being emphasised because of the recent acceleration of interest, often prompted by new technologies, in the form and scope of data and how it may be used as evaluative evidence. This interest might centre on wholly new forms and scope of data and new opportunities for use, but it should not obscure perennial challenges which remain central to effective and principled evaluation.

Round Table Exchange at Conference – Is Evaluation Ready for the Data Revolution?

The UN Secretary General’s Independent Expert Advisory Group on a Data Revolution for Sustainable Development asserted in 2015 that,

“Better data and statistics will help governments track progress and make sure their decisions are evidence-based; they can also strengthen accountability. This is not just about governments. International agencies, CSOs and the private sector should be involved. A true data revolution would draw on existing and new sources of data to fully integrate statistics into decision making, promote open access to, and use of, data and ensure increased support for statistical systems.” (HLP Report, p. 23)

The strong inference we can take from this statement is that the ‘data revolution’ is mainly about statistics and the very large data sets available from public and private sources. This is enabled by new technologies in communication and computing, including social media and the mining of commercial online environments for descriptions of large-scale tendencies in viewing, spending/consumption, and economic and cultural behaviour. However, we might see this as a somewhat ‘reductive’ statement. The Advisory Group later offered a more expansive one, arguing that ‘most people are in broad agreement that the ‘data revolution’ refers to the transformative actions needed to respond to the demands of a complex development agenda, improvements in how data is produced and used; closing data gaps to prevent discrimination; building capacity and data literacy in “small data” and big data analytics; modernizing systems of data collection; liberating data to promote transparency and accountability; and developing new targets and indicators.’[1]

‘Data revolution’ has also been used in recent years to refer to the ever-expanding availability of new (big) data such as social media content, financial transactions data or satellite imagery data. In addition to the increasing availability of data, it also refers to the advances in data management, data analytics and computational capabilities which allow researchers and others to process and analyze these data (as well as other existing data) in new ways for a variety of purposes.

The ‘data revolution’ is about more than just size, although of course size matters! It is also about being clear about the differences between information, data and evidence, and about how we talk about the attribution of value. We need to understand this data environment, what exactly is changing, and how this dimension of evaluative practice might, in turn, change to be more effective in the future.

We should also be aware of other shifts in the firmament, informed not necessarily by technologies and techniques but by socio-political considerations and re-appraisals of what counts as authoritative evidence, the different forms it can take and how it is produced. The failure to enable, or even acknowledge, a more expansive approach to evidence and the data on which it is based has been the essence of the critique of the science and evidence used to justify action and inaction (e.g., on the use of masks) in early approaches to the Covid-19 pandemic[2].

Other issues also need to be included in our consideration of the ‘data revolution’; these centre on challenges associated with, for example, using remote sensing data, mobile phone data, and various internet data. Big data might arrive with us too varied, too unorganised, and ultimately too uncertain and unverifiable. The challenge may be how to integrate or synthesise key messages from this data chaos! One view suggests that data are often presented “as is” in evaluations, without any discussion of their validity or how they were verified. Where data are used, we need to include a discussion of how observations were made and whether measurements were reliable.

With very large data come powerful responsibilities, to (mis)quote Spider-man. Data science tends to focus too much on statistical manipulations of big data and deriving conclusions from mathematical calculations without a forensic examination of the intersection between the instrument deriving the data and the data sources themselves (this interface often remains opaque or hidden but might yield quite flawed data).  Fundamentally, data needs to help us to make ‘sense’ of a situation, not confuse or obfuscate.

The ‘data revolution’ is also notable in the domain of policy evaluation. While many evaluation functions and evaluation professionals have yet to explore this new opportunity space, others have started their journey into questioning, managing and analyzing (big) data and other forms of data within the framework of evaluative analysis. The Covid-19 pandemic, while restricting opportunities for empirical data collection in evaluation, has to some extent accelerated the pace of innovation in evaluation, including the use of new data and of data science techniques to process and analyze ‘old and new’ data.

This Round Table Exchange brings together evaluators, evaluation scholars and researchers to discuss the potential and limitations of integrating new approaches in converting data into evaluative resources. There is huge potential for using data science techniques and recasting our understanding of evidence in evaluation in terms of enhancing the authenticity, efficiency, accuracy, depth and/or breadth of data analysis. The discussion will explore how to make that potential a reality. Panelists will also identify any adjustments or changes they deem necessary to ensure this aspect of evaluative practice is fit for the future.

The discussion will focus on the following questions:

  • What are the (hidden) biases, challenges and ethical issues when applying these techniques?
  • What are the lessons learned in terms of successfully integrating data science into the way we attribute value and worth (evaluative practice)?
  • To what extent, and in what ways, is the integration of data science applications in evaluation likely to change the practice of evaluation?