All UK aid-funded programmes have committed to contributing to gender equality and social inclusion. Monitoring, evaluation and learning (MEL) professionals can play a key role in ensuring that programmes deliver benefits to women and other excluded groups. In this blog, Liisa Kytola, Senior Consultant in our Monitoring, Evaluation, Research and Learning practice, highlights ways in which MEL can drive accountability and improvements to gender and inclusion (G&I).
Liisa derives her insights from two large MEL contracts that Tetra Tech led: the independent evaluation and learning contract for the UK Prosperity Fund (2016-2021), which promoted inclusive economic growth around the world, and the MEL contract for the West Africa Conflict, Stability and Security Fund (CSSF, 2018-2022), which provided development and security support to West African countries at risk of conflict or instability.
Asking the right questions to measure progress on gender and inclusion
To assess progress on G&I on the Prosperity Fund, we ensured that our evaluations asked targeted G&I questions. Each year, our evaluators and programme teams discussed which evaluation questions to prioritise so that we focused on programme concerns and learning needs. A set of ‘non-negotiable’ G&I-focused evaluation questions meant that we reviewed and reflected on progress on G&I annually.
We developed a bespoke Prosperity Fund G&I scorecard that provided a framework for analysis in annual programme evaluations. The scorecard assessed programmes on G&I across seven dimensions, using a traffic light scheme. These dimensions covered the issues and processes necessary for a gender-responsive programme, from analysis and design to team expertise and monitoring. The focus shifted with each evaluation cycle, moving from design and set-up to implementation and results.
On CSSF, we integrated relevant G&I questions across all third-party monitoring assignments and evaluations. We also conducted gender analyses to help inform programme and project design.
Using Theory of Change and Result Framework development to increase emphasis on gender and inclusion
On CSSF West Africa, we used Theory of Change and Result Framework development discussions to encourage projects to consider the G&I dimensions of their work, as well as any additional analysis and data they needed to collect to measure progress. As a result, the Theory of Change set out specific outcomes on G&I and strategies to achieve these. We also included G&I-related indicators across programme and project-level Result Frameworks. These processes set firm G&I commitments against which programmes and projects could be held to account.
Reporting and communicating findings to promote accountability to G&I
We reported the findings and recommendations from our G&I scorecard assessments on the Prosperity Fund back to stakeholders in programme and portfolio-level reports. To provide a portfolio overview of progress and gaps, we synthesised the findings of all programme assessments in thematic G&I evaluation reports. These were useful in communicating progress and issues across the portfolio to fund-level managers and advisers.
On CSSF, the MEL team provided a ‘challenge function’ during the quarterly reporting process. This included reviewing the G&I evidence presented in progress reports and making recommendations to improve delivery and reporting.
Facilitating reflection and use of evaluation evidence
Although the Prosperity Fund G&I scorecard was initially developed as an evaluation tool, some programme teams started using it as a self-assessment tool during annual reviews. We supported programme teams in using the scorecard and, in doing so, built their capacity on G&I concepts.
Finally, we wanted to ensure that our evaluations informed programme and project adaptation. To achieve this, we organised learning ‘touch points’ with programme teams after our evaluations. We validated the G&I scorecard assessment findings and recommendations and facilitated practical discussions on the steps that programmes needed to take to strengthen their work. We also presented our findings to portfolio-level social development advisers. This helped advisers identify where action was needed and provide targeted support to programmes.