Eight ways for large donor funds to maximise success through evaluative thinking

At Tetra Tech, we work every day to maximise the impact of the programmes we deliver. This is particularly true when it comes to our Monitoring, Evaluation, Research and Learning Practice, where our teams carefully examine what works – and what doesn’t – to support vulnerable communities around the globe and share what they have learned along the way.

Ahead of Britain’s flagship evaluation event next month – the UK Evaluation Society (UKES) Conference, taking place from 24-26 May 2022 – Fionn O’Sullivan, Principal at Tetra Tech, looks back at his presentation from last year’s conference, reflecting on how our evaluation of the £1.2 billion UK Prosperity Fund (2016-2021) helped build evaluative thinking into the Fund’s day-to-day operations.

Aid initiatives are more successful when reflection and evaluation are an integral part of their everyday operations, rather than being treated as an external exercise or a ‘necessary evil’.

Tetra Tech led the delivery of the Evaluation and Learning service for the UK Prosperity Fund, which operated from 2016 to 2021 and aimed to reduce poverty by supporting inclusive economic growth through 27 separate programmes in 46 countries. Over the life of the Fund, the service engaged more than 100 evaluators to deliver 122 evaluation products.

Evaluative thinking can be understood as an ongoing process of reflection in which everyone involved thinks critically about the evidence underpinning their actions. We aimed to promote it in the Prosperity Fund in the following ways:

1. Aligning annual evaluations with annual programme cycles.

This broke with the traditional UK Aid practice of mid-term and final evaluations. It proved valuable where programmes were moving forward at pace, helping managers understand what was working and what needed adjustment. Annual evaluations proved less useful, however, where programme delivery was slow.

2. Evaluators and programme teams jointly identifying learning needs for each evaluation cycle.

This ensured evaluations were closely focused on programme teams’ needs and strengthened their engagement with evaluators. While this worked very well at the programme level, the diverse focus of the individual programme evaluations made evaluating the portfolio as a whole difficult.

3. Giving each programme 25 days of flexible support each year.

These days were provided by senior evaluators familiar with the UK’s Foreign, Commonwealth and Development Office (FCDO) and with each programme’s specific country and sector context. This proved valuable in helping programme teams investigate specific issues. Initial concerns that this would compromise the independence of evaluators did not materialise in practice.

4. Building ‘learning touchpoints’ into each evaluation cycle.

These provided up to five formal points for evaluators and programme teams to consider issues, evidence and findings. This worked very well and underlined that learning (and acting on it) was a key aim.

5. Packaging findings in different formats.

While we produced traditional, detailed evaluation reports with the evidence to back up specific findings, we also developed a suite of other products – including summaries, case studies, presentations and videos – to communicate key findings. This helped address the needs of different users but required significant resources (such as professional copywriters) and took time to get right.

6. Using an online platform – PFLearning – to host events and communities of practice and provide guidance and resources.

This had over 600 users and was cited as an example of good practice in UK Aid by the Independent Commission for Aid Impact (ICAI). It was costly to build and run but proved worthwhile given the size of the Fund and the spread of its programmes across the globe.

7. Supporting Communities of Practice.

These brought programme team staff together with thematic specialists through events, discussion and online bulletin boards. The Prosperity Fund Gender and Inclusion Community of Practice had 64 members and helped improve the performance of the Fund in this area.

8. An adaptive Evaluation and Learning Service.

Our service offer had to adapt considerably over time in response to significant changes in the UK aid budget as well as the COVID-19 pandemic. This adaptability helped deliver value for money but required investment in project management systems and significant internal communication with our dispersed team of evaluators.