South Africa's National Evaluation System: Department of Performance, Monitoring & Evaluation update

Standing Committee on Appropriations

28 January 2014
Chairperson: Mr E Sogoni (ANC)

Meeting Summary

The Department of Performance, Monitoring and Evaluation briefed the Committee on the progress of South Africa’s National Evaluation System (NES). During the 2011/12 financial year, 13% of departments were doing evaluations, rising to 19% during 2012/13. It was anticipated that more than 20% of departments would be doing evaluations during 2013/14. Within two years the whole system had been established, with 38 evaluations completed, underway or about to start.

There were seven evaluations during 2012/13. One evaluation, of the National School Nutrition Programme, was stopped and reallocated to 2014/15. In 2013/14 there were a total of 15 evaluations (Cabinet agreed to drop one) and 15 would be done in 2014/15. The Early Childhood Development evaluation was completed in June last year and had been made available on the DPME website. Four other evaluations, from the Departments of Trade and Industry, Basic Education and Rural Development, had final reports approved; these would be presented to Cabinet by February/March and should be available to Parliament in April. A further 18 evaluations had been underway from 2012/13 and 2013/14, including one evaluation not under the National Evaluation Plan. Of these, three would be completed within a few weeks and 15 were underway. Most of the 15 Terms of Reference for 2014/15 had been developed and procurement had started on some. The aim was for most evaluations to be underway by April 2014.

A key challenge that the Department faced was that departments often did not make use of evaluations. To ensure the use of evaluations, participating departments were encouraged to own the evaluation concept and the process, learning was encouraged, and a cross-government evaluation technical working group had been created. Priority interventions which needed to be evaluated included large interventions (over R500 million), interventions linked to Outcomes 12-14 and interventions of significant public interest, such as key front-line services.

Evaluations were done to:
• Improve policy or programme performance. This aimed to provide feedback to programme managers
• Improve accountability. Where was public spending going? Was this spending making a difference?
• Improve decision making. Should the intervention be continued? Should the implementation process be changed?
• Generate knowledge about what worked and what did not with regards to public policy, programme, function or organisation.

The advantages of evaluations being in the National Evaluation System included Cabinet approval, political support from Cabinet and the Department of Performance, Monitoring and Evaluation, co-funding from the Department, and exposure to the national evaluation system’s guidelines, standards, steering committees and training.

The current progress made on programme planning and design evaluation could potentially have a big impact and build capacity in departments.

The Department of Performance, Monitoring and Evaluation said Parliamentary Committees could request departments to present the evaluation results or improvement plans to them, as well as progress reports related to the improvement plans. Committees could also suggest priority areas within departments for evaluation or request departments to evaluate specific policies or programmes.

A call would go out in March for proposals for evaluations for 2015/16 to 2017/18. The closing date for submissions was 30 June.

Members expressed concern over departments’ poor attitude towards evaluations. Why were there procurement delays? Where did the problems lie that kept the system from moving forward? Which departments were not performing and ignoring evaluation reports?

They wanted to know if there had been follow-ups with the departments after evaluations had been done, and why punitive action had been sacrificed for the sake of learning and education.
 

Meeting report

Update on South Africa’s National Evaluation System
Dr Ian Goldman, Head: Evaluation and Research, Department of Performance, Monitoring and Evaluation (DPME), briefed the Committee on the progress of South Africa’s National Evaluation System (NES). Evaluations sought to answer questions such as the impact of grade R on subsequent learner performance, how the Business Process Services Programme has been working, whether the Land Recapitalisation and Development Programme had developed beneficiaries and whether the Comprehensive Rural Development Programme had achieved its goals.

He said 13% of departments were doing evaluations in the 2011/12 financial year and 19% during 2012/13. He anticipated that the latest results would reveal that more than 20% of departments were doing evaluations.

Of the total 37 evaluations under the NES and one evaluation not under the NES, five were completed, three would be finished in the “next few weeks”, 15 were underway and 15 terms of reference (TORs) were being developed. He said departments were using evaluation results to inform planning, policy-making and budgeting.

Progress with National Evaluation Plan evaluations
The 2012/13 National Evaluation Plan (NEP) was approved in June 2012, the 2013/14 NEP in November 2012 and the 2014/15 NEP in November 2013. There were seven evaluations during 2012/13. One evaluation, of the National School Nutrition Programme, was stopped and reallocated to 2014/15. In 2013/14 there were a total of 15 evaluations (Cabinet agreed to drop one) and 15 in 2014/15.

The Early Childhood Development evaluation was completed in June last year and had been made available on the DPME website. Four other evaluations, from the Departments of Trade and Industry, Basic Education and Rural Development (the Recapitalisation and Development Programme and the Comprehensive Rural Development Programme), had final reports approved; these would be presented to Cabinet by February/March and should be available to Parliament in April. A further 18 evaluations had been underway from 2012/13 and 2013/14, including one evaluation not under the NEP. Of these, three would be completed within a few weeks and 15 were underway. Most of the 15 TORs for 2014/15 had been developed and procurement had started on some. The aim was that most of the evaluations should be underway by April 2014.

Reasons for delays
Dr Goldman said that while some evaluations were straightforward, others were taking longer than planned. This happened because some departments, such as Human Settlements, took more than 12 months to procure, while the DPME procured within two months. Other reasons for delays were a lack of data, and departments that accepted evaluations but delayed getting them to cluster and Cabinet.

Challenges
He said the key challenge they faced was that completed evaluations were often not used by departments, resulting in money being wasted. The following measures were suggested to ensure the use of evaluations:
• Departments were encouraged to own the evaluation concept and the process. This could only happen if they requested the evaluation; it must not be imposed on them;
• The focus must be on learning rather than punitive action, otherwise departments would just “game” the system. People should be punished not because they made a mistake, but because they did not learn from their mistakes;
• Evaluations should be selected by a cross-government evaluation technical working group based on their scale or their strategic or innovative appeal;
• Evaluations must be believed – seen as credible;
• There should be follow-ups to ensure improvement plans were implemented.

Why evaluate?
Dr Goldman said evaluations were done to:
• Improve policy or programme performance. This aimed to provide feedback to programme managers
• Improve accountability. Where was public spending going? Was this spending making a difference?
• Improve decision making. Should the intervention be continued? Should the implementation process be changed?
• Generate knowledge about what worked and what did not with regards to public policy, programme, function or organisation.

Priority interventions which needed to be evaluated were:
• Large interventions (e.g. over R500 million) or interventions that covered a large proportion of the population which have not had major evaluations for five years.
• Interventions linked to Outcomes 12-14
• Interventions of strategic importance which needed to be successful.
• Innovative interventions
• Interventions of significant public interest, such as key front-line services

Advantages of evaluations being in the NEP
Evaluations that were in the NEP would be forwarded to and approved by Cabinet (with improvement plans). Departments would also benefit from political support from Cabinet and DPME, as well as assistance with emerging problems. There would also be co-funding available from DPME or, if necessary, DPME would assist with the sourcing of donor funding. Another benefit was exposure to the national evaluation system’s guidelines, standards, steering committees and training. All evaluations would be published on the DPME website.

Progress with the system
This included 12 guidelines and templates, ranging from TORs to improvement plans, plus six drafts which would be finalised in February. There was significant progress in planning implementation programmes and design evaluation, as well as improvement in standards for evaluations and competences. Four courses had been developed, including courses for Directors-General and Deputy Directors-General on how to use evidence. One more course had been developed and would be piloted by March. Over 600 government staff had been trained so far. Study tours had also been organised for Committee members to Canada, the US, Kenya and Uganda. An evaluation panel of 42 organisations had been developed, which simplified the procurement process. An evaluation repository – 70 quality-assessed evaluations – was created on the DPME website. Gauteng and the Western Cape had developed provincial evaluation plans, while departmental evaluation plans (for, among others, the Department of Trade and Industry) and a municipal evaluation plan for Tshwane had also been developed.

Use of evaluations by Parliament
The repository would provide 70 evaluations which could be used as a source of evidence. Evaluations would be presented to Portfolio Committees in stages, and once a final report had been approved, departments would be given one month to provide a management response to findings and recommendations. Once management responses had been received, departments needed to develop improvement plans.

The use of evaluations also presented Committees with an opportunity to interrogate what departments were doing. The next evaluations would be presented to Portfolio Committees by March/April 2014. In the meantime, Committees could request departments to brief them on progress with evaluations, their results, and the development and implementation of improvement plans based on the results.

Committees could also suggest priority areas within departments for evaluation or request departments to evaluate specific policies or programmes.

A call would go out in March for proposals for evaluations for 2015/16 to 2017/18. The closing date for submissions was 30 June.

Discussion
Mr J Gelderbloem (ANC) asked if the presentation was also made to Cabinet.

Dr Goldman replied that the Department did present to Cabinet, but that the presentation was much shorter. They had received a very positive response.

Mr Gelderbloem asked what the Department’s relationship was with the School of Government.

Mr Ian Phillips, Director-General: DPME, replied that officials from the School of Government had already approached them regarding the content of their courses. The Department had advised them to include evaluation and monitoring issues and the use of evidence in their courses. He was therefore optimistic that monitoring and evaluation would be a key part of the curricula delivered by the School of Government.

Mr Gelderbloem asked if there had been follow-ups with the departments after evaluations had been done.

Mr Goldman replied that they would request a six-month progress report.

Mr Gelderbloem asked if the Department had been cooperating with the South African Local Government Association (SALGA).

Mr Goldman replied that it was important to do evaluations on local government, but the Department did not have the time to focus on local government at the moment. This year the Department’s focus would be on getting quality right; the year after, it would focus on extending its reach, which would include approaching the metropolitan municipalities to do evaluation plans. It would also be helpful if local government approached the Department for evaluations.

Mr M Swart (DA) said that in order for the Committee to approve budgets, they needed to know which departments were not performing and ignoring evaluation reports. There could be no improvement if that happened and the relevant Portfolio Committee also did nothing about it. Would it be possible to get a summary of those evaluations that had been done, so that the Committee could be informed of what the Portfolio Committee had done and what the results of those evaluations were?

Mr Goldman replied that the evaluation update sent to the Standing Committee on Appropriations every two months would enable the Committee to see at what stage the different evaluations were. The Department was thinking of producing summarised reports of those evaluations (1, 5 and 25 page reports) which were readable and could be used effectively.

Mr Swart referred to the slide on page 9 of the presentation regarding the evaluation of the Recapitalisation and Development Programme by the Department of Rural Development. He asked if that evaluation also covered the Accelerated Schools Infrastructure Delivery Initiative (ASIDI) under the Department of Basic Education, in order to ascertain whether that initiative was working or not.

Ms Nolwazi Gasa, DPME Deputy-Director General: Outcomes Monitoring and Evaluation Branch, replied that the ASIDI initiative was not covered under the recapitalisation evaluation.

Mr Goldman replied that the recapitalisation evaluation was about the land reform programmes that were failing.

Ms R Mashigo (ANC) asked for more clarity regarding the procurement delays. Why did this happen?

Mr Goldman replied that some departments were very bureaucratic and that it sometimes took weeks to get something as simple as a signature. He felt that an evaluation of supply chain processes across government was needed.

Ms Mashigo asked whether the evaluation process took more than a year to be finalised.

Mr Goldman replied that, all in all, it took about two years from the time the Department received a call to the time the evaluation had gone to cluster.

Mr L Ramatlakane (COPE) congratulated the Department on its well-delivered presentation, but was worried that interventions were still needed to “get there”. He asked where the problem lay that kept the system from moving forward.

Mr Goldman replied that there were still a number of departments that have not submitted proposals.

Mr Ramatlakane said he did not agree with the Department’s stance to forego punitive action for the sake of learning and education. Public service required enforcement to implement directives.

Mr Goldman replied that the issue of learning was meant to set boundaries and to let departments play their roles. People must not be scared of making mistakes, but they have to learn from their mistakes.

Mr Ramatlakane bemoaned the fact that evaluations were not linked to organisations. How could this be navigated?

Mr Goldman replied that the Department did not cover organisations as it could not do everything. But he felt that it was very important to evaluate organisations – the question was not if, but when.

Mr Sogoni was worried that Parliament was not initiating evaluations and that departments seemed to do oversight and not the Committees. He recommended that every Member of Parliament should attend the workshops in order to understand why these evaluations needed to be done.

Mr Sogoni asked where the people who do evaluations got their information from, especially regarding the Early Childhood Development evaluation. Who did they speak to? Who gave them feedback?

Mr Goldman replied that in most cases they would go out and talk to the beneficiaries.

Mr Sogoni asked how often independent evaluations were done.

Mr Goldman replied that the guideline for programmes on page 24 gave a good indication of how often evaluations should be done, but departments should also think for themselves about how often an evaluation should be done.

Mr Sogoni asked if there was a relationship between last year’s matric results and the Early Childhood Development evaluation done on the impact of grade R on subsequent learner performance.

Mr Goldman replied that the Early Childhood Development evaluation linked Grade R to learner performance, but not matric results.

Mr Sogoni asked what the department’s concerns were regarding capacity and staff complement. He wondered if outsourcing was the answer.

Mr Goldman replied that each evaluation took between 20 and 30 days of staff time to ensure that it worked. Evaluations should probably be done independently, but the Department still needed staff to monitor the evaluations. He said they were trying to get six interns on board to build capacity.

The meeting was adjourned.
 
