UK Evaluation Society

Window Reflections – Part Two

Second blog covering delegates’ reflections from the UK Evaluation Society ‘Windows on Evaluation Matters’ special online event held between 23rd and 27th November 2020


Window #4: ‘The Covid-19 pandemic has spotlighted the use of evidence in Government decision-making: Does the public care about how evidence is presented and used?’

Presentation and panel discussion supported by the Campaign for Social Science with Kelly Beaver (Ipsos MORI), Dr David Halpern (Behavioural Insights Team), Tracey Brown OBE (Sense about Science), Dr Audrey MacDougall (Scottish Government) and Dr Siobhan Campbell (UK Department for Transport)

Reflections from Meera Craston (Director, Social Policy Evaluation & Advisory, Ipsos MORI)

The presentation by Kelly Beaver, Managing Director of Public Affairs at Ipsos MORI, highlighted a growing appetite amongst the British public for transparent, evidence-based decision-making on the part of Government. This shift in view has come about as a result of the spotlight placed on the evidence profession during the COVID-19 pandemic, which has required decisions to be made at pace and free from political bias. Taken alongside Michael Gove's Ditchley Foundation speech earlier in the year, this rising prominence of experts provides a real opportunity to firmly embed the routine use of transparent, high-quality evidence across Government policy making.

Capitalising effectively on this opportunity will require careful thought on the part of evidence professionals about how to achieve what I like to call the holy grail: the development of evidence that is tangibly used to inform decision-making. This requires the alignment of several factors: (1) the demonstration of rigour, integrity and transparency in the development and delivery of evidence-related work; (2) the building of trust and collaboration between evidence professionals and their clients, which, in my opinion, should rest on a broad sense of the term 'client' (from policy makers and analysts to local communities and end-beneficiaries, i.e. the public); and (3) explicit consideration of how evidence is presented, to ensure key messages are clear, accessible, easy to interpret and audience-appropriate.

Although we as an industry often achieve (1) and most of (2), it is (3) that is often missed, which reduces the impact of our work. This was pertinently portrayed in Kelly's example of the COVID-19 'R' number, which has been used as the key statistic of the pandemic yet interpreted by the public in a myriad of different ways as a result of poor presentation. This type of issue is relatively easy to correct and now needs to form part of the role of all evidence professionals, which should include not only the generation of evidence but also its effective curation.

Window #5: ‘In the Hotseat: Evaluation – Let’s Go Gardening! Leadership in the Evaluation Ecosystem’ with Dugan Fraser (Director, CLEAR Anglophone Africa)

Reflections from Matthew Baumann (Evaluation Consultant, Matthew Baumann Associates)

I enjoyed Dugan Fraser’s talk on leadership and what it means to him in his role as Director of the Centre for Learning on Evaluation and Results (CLEAR) in Anglophone Africa, based in Johannesburg. He spoke from the heart with personal reflections on evaluation leadership in the context of CLEAR’s remit of growing evaluation ecosystems in African countries. I also appreciated Dugan’s personal reflections on being a white male leading a black team in a country coming to terms with a mission of decolonisation.

I’m sure that in most contexts, development or otherwise, ‘systemic leadership’ is a useful model of leadership. Systemic leaders don’t attempt to ‘own’ or ‘command’ the systems they are engaged with, but rather facilitate the growth of the system – intervening here, supporting there, nurturing relationships, and helping others to find solutions to challenges. Dugan likened his practice of systemic leadership to that of a gardener: sometimes a gardener must act – plant, feed and prune to catalyse, promote, strengthen and shape growth. But at least as important is knowing when to step back and let the garden do its own growing, allowing plants and vegetation to co-evolve towards a new future.

I’d like to think that in the roles I’ve played in UK Central Government evaluation and elsewhere I’ve acted a little as a gardener. Dugan’s talk made me reflect on my time in DECC (now BEIS) – a place full of opportunity, with relatively fertile soil, ample feed and light, and mostly predictable temperatures. The setting was apt for catalysing, nurturing and supporting growth and, as we developed, we were increasingly able to step back and see a great deal of natural growth and new directions. But in fact there was a fair bit of fishing involved too…

In their excellent article “Teaching people to fish? Building the evaluation capability of public sector organizations”, McDonald, Rogers and Kefford (2003)[1] take the individually focused epigram “give someone a fish and they eat for a day; teach them to fish and they eat for a lifetime” and extend it to the systems level: “as well as individuals skilled in catching fish, we need the equipment to successfully fish, an effective distribution system, people who want to eat fish, and an entire fishing system that is sustainable.” The point is that we need to focus on working with the whole system, and to do so in a strategic and planned way – making sure all parts are evolving towards the desired system.

Whilst the fishing analogy speaks to the structure of the underlying evaluation system and the parts that need to be in place, I found the gardening analogy offered a complementary perspective on leadership style and how to engage personally with those parts of the system.

[1] McDonald, B., Rogers, P. and Kefford, B. (2003) ‘Teaching people to fish? Building the evaluation capability of public sector organizations’, Evaluation, 9(1), p. 9.

Continuing the learning

If you didn’t attend ‘Windows on Evaluation Matters’ but are interested in the topics covered, the UK Evaluation Society organises events throughout the year; you can keep up to date with these by viewing our events calendar and by following us on social media. Some events are members only, so it is often more cost-effective to become a member of the Society – join us today to find out more about member benefits.