Monitoring and evaluation (M&E) has become integral to the work of education service providers, government, and donor agencies. Support for implementing M&E is high; the issue is no longer promoting M&E, but promoting quality M&E.
What makes for quality M&E?
Quality M&E occurs at various levels:
- Drafting terms of reference or commissioning documents;
- Clarity of evaluation questions and alignment of questions to evaluation purpose;
- Clarificatory workshops that describe programme inputs/outputs/outcomes and indicators;
- Evaluation design and methodological considerations: Careful thought must be given to whether the design of the evaluation fits its purpose; evaluation questions, design, and methodology must be aligned;
- Type of evaluation design: The design can be experimental, quasi-experimental, or non-experimental. Decide which design to use, as well as when, why, and how. These choices deserve robust debate;
- Methods of data collection: What data will be collected, and how and when will it be collected?
- Sampling and instrument development; and
- Data collection, analysis and write-up.
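The design and analysis steps above can be sketched in miniature. The snippet below simulates outcome data for a simple randomised controlled trial (one of the experimental designs mentioned) and estimates the treatment effect with a Welch's t statistic. All sample sizes, means, and effect sizes here are illustrative assumptions, not drawn from any real evaluation.

```python
import random
import statistics

def simulate_rct(n_per_arm=200, control_mean=50.0, effect=3.0, sd=10.0, seed=1):
    """Simulate test scores for a two-arm RCT with a known treatment effect.
    All parameters are hypothetical, for illustration only."""
    rng = random.Random(seed)
    control = [rng.gauss(control_mean, sd) for _ in range(n_per_arm)]
    treatment = [rng.gauss(control_mean + effect, sd) for _ in range(n_per_arm)]
    return control, treatment

def welch_t(a, b):
    """Welch's t statistic for the difference in means of two samples
    (does not assume equal variances)."""
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    standard_error = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (mean_b - mean_a) / standard_error

control, treatment = simulate_rct()
estimated_effect = statistics.fmean(treatment) - statistics.fmean(control)
print(f"estimated effect: {estimated_effect:.2f}")
print(f"t statistic: {welch_t(control, treatment):.2f}")
```

In a real evaluation the sample size per arm would be chosen through a power calculation before data collection, and analysis would typically adjust for baseline covariates; the point of the sketch is only that design, sampling, and analysis choices are linked.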
Utilisation of Findings and Recommendations
- Improving the utilisation of M&E findings is essential: it gives meaning to the notion of a learning organisation.
- The strength of an evaluation lies in whether its findings and recommendations are used, and this depends on the structure of the report. A concise report is more likely to be read and acted on; a long version is advisable only for researchers who need a higher level of detail. Prepare different versions for different audiences.
- Be clear about what lessons, recommendations, and findings can inform project designs and project implementation.
Recommended resources and toolkits
- Evidence-based development: How can practitioners use research, evidence and existing data better in planning and implementing development projects? Working with a framework can itself generate evidence. Rigour in understanding and documenting motivations and implementation will help CSI practitioners apply the lessons they’ve learned in future projects.
- Improving the quality of evaluations: a systematic and objective engagement with the design, implementation and results of projects/programmes. Provides a seven-step outline for creating an M&E process for donors and funders. Trialogue Workshop with Fatima Adam of the Zenex Foundation (May 2016).
- Quality Evaluations in Education Interventions: Dr Fatima Adam, Zenex Foundation, provides a six-step outline for designing an evaluation, then discusses the strengths and weaknesses of the RCT design.
- Adapting M&E in a Pandemic: Monitoring and Evaluation (M&E) practices have had to shift their focus during the pandemic. Gail Campbell, CEO of the Zenex Foundation, explains how her organisation made adaptations in the face of Covid-19, and how M&E can deliver greater impact in the education sector.
- Decision Framework for the Design and Implementation of School Interventions: The Evaluation Research Agency advocates the Knowledge-to-Action Cycle, a conceptual and heuristic framework for translating knowledge in a particular field into useful and actionable knowledge. It combines a knowledge-creation process model with a planned-action cycle. Knowledge creation consists of three phases: (1) knowledge inquiry; (2) knowledge synthesis; and (3) the construction of knowledge tools and products.
- Evaluative Thinking: While monitoring and evaluation (M&E) is a growing practice in social development, it is still a largely siloed process, data from which is not always used to inform programmatic improvements for greater overall impact. Dr Fatima Adam, programme director of research and evaluations at Zenex Foundation, explains that evaluative thinking is about embedding an organisational culture of holistic and responsive approaches to work. Read ‘Evaluative thinking’ published in the Trialogue Business in Society Handbook.
- Opinion: Why evaluation is key to rebuilding better and achieving the 2030 Agenda: In a broad sense, M&E can generate and use evaluative evidence to ensure more effective policy-making, particularly when it comes to national development strategies.