By Stephen Taylor and Tshegofatso Thulare
The Department of Basic Education (DBE) has commissioned a number of evaluations in recent years. In the process, we have learned a great deal about how to improve policy and programme design, as well as about how to conduct better evaluations in the education sector. Notable evaluations include an impact evaluation of the introduction of the Grade R programme (2013); implementation evaluations of the Funza Lushaka Bursary Programme (2016), the National School Nutrition Programme (2016) and the National Curriculum Statement Grades R to 12: Focusing on the Curriculum and Assessment Policy Statements (2017); and the Early Grade Reading Study (EGRS). These were all commissioned in partnership with the Department of Planning, Monitoring and Evaluation (DPME).
The support of the DPME’s National Evaluation System (NES), as well as that of numerous partners including UNICEF, the Zenex Foundation, USAID, Anglo American and the International Initiative for Impact Evaluation, has been invaluable.
If the findings of an evaluation are to be accepted and used to improve practice, it is crucial to gain the support and involvement of key departmental personnel (senior officials and those directly responsible for the programme being evaluated) from an early stage of designing and commissioning the evaluation. One advantage of our role as commissioners of evaluations is that we have been able to facilitate access to key programme officials and sources of data, as well as to communicate findings to the relevant decision-making forums such as Cabinet, the Council of Education Ministers, and the Heads of Education Departments Committee (HEDCOM). Ensuring that evaluation findings actually influence policy and programmes is never a seamless process, but we have found that the requirement for the Director-General to sign a management response as well as an improvement plan has provided a useful mechanism for following up on evaluation results. Further benefits include fostering an evaluative culture in the sector, strengthened capacity through exposure to evaluation methodologies and practices, and a greater appreciation of the value of good data and evidence.
Things haven’t always gone smoothly. Developing appropriate Terms of Reference requires capacity and coordination; procurement processes weren’t always responsive to the need to complete evaluations on time; limited budgets sometimes restricted the scope of an evaluation; the pool of high-quality service providers proved to be rather narrow; and certain findings were not always well received. Much work remains to be done to expand the supply of evaluators and organisations to meet the scale of government projects and broad-based black economic empowerment (B-BBEE) imperatives.
Overall, the benefits to the sector clearly outweigh the obstacles we have experienced; now we need to capitalise on these gains. We would highlight three priorities for the next phase. Firstly, government needs to lead by institutionalising the relatively young National Evaluation System. Secondly, funding mechanisms must be created to ensure that rigorous evaluation becomes a routine component of large-budget programmes. Thirdly, we need to recruit and train a wider pool of young evaluators.
-Tshegofatso Thulare is a Researcher and Project Manager for the EGRS
-Stephen Taylor is Director of Research, Monitoring and Evaluation at the Department of Basic Education
Source: Lessons and Reflections about Implementing M&E in South Africa: An anthology of articles by Zenex staff, M&E experts and NGOs. http://www.zenexfoundation.