Department of Performance Monitoring and Evaluation on its Third Quarter Performance for 2013/14

Standing Committee on Appropriations

18 February 2014
Chairperson: Mr E Sogoni (ANC)
Meeting Summary

The Department of Performance Monitoring and Evaluation (DPME) was asked how participation by government departments in the Management Performance Assessment Tool (MPAT) could be ensured and made a requirement.  Participation was voluntary at present, but the Department had received a good response from all national and provincial departments in the last and current cycle. 

The DPME was told that its report needed more detail so that the Committee, which was tasked with oversight on issues in the Department, could understand its successes, its challenges and the reasons for them. The Department responded that its report was somewhat of an accounting report and that in future, a narrative would be provided on progress in relation to its programmes.

Asked about progress in relation to the Presidential hotline -- and also what had been done about its challenges -- the Department reported that there had been improvements and the latest resolution rate was 94%.  It produced reports based on the data received from the hotline.  Concerns had been directed to specific departments where the issues and trends could be discussed.

The Committee asked if the Municipal Assessment Tool (MAT) had been adopted by the South African Local Government Association (SALGA) as a tool that needed the participation of all municipalities. It was suggested that a common understanding needed to be developed on what was being evaluated. The Department responded that the issue of evaluation should be discussed at another time, but in short, the Department would evaluate programmes, activities, targets and designs.  It would be important to educate new incoming Members of the Committee on the task of performance monitoring and evaluation, so that they could understand what each component meant and was responsible for.

The Department was asked whether it had done an analysis on the functionality and performance of its monitoring and evaluation units in the Premier’s Office. The Department said it had been a number of years since such an analysis took place, and this needed to be corrected.

In the Department’s report it had been highlighted that only seven departments had submitted their Annual Performance Plans, and this was questioned. The Department said its report referred only to the third quarter and, as the year progressed, more reports would be obtained from departments. 

Meeting report

Briefing by Department of Performance Monitoring and Evaluation

Dr Sean Phillips, Director-General, Department of Performance Monitoring and Evaluation (DPME), said that in Programme 1: Administration, 12 out of the 18 quarter targets were achieved, five targets were partially achieved and one target was not achieved.

Targets achieved:

·         77% of activities in the communication plan were achieved, against the target of 75% (10 of 13 activities).

·         A vacancy rate of 7% was recorded, against the target of 10%.  195 posts were filled, out of 209 posts funded, as at 31 December 2013.

·         87% of workplace skills plan targets were achieved, against the target of 60%.

·         There was a 91% achievement of the systems standards, against the target of 50%.

 

The DPME partially achieved a number of targets.  Its Risk Policy and Strategy was reviewed, and aligned with the changes in the regulatory and operational environment. The Risk Management Committee requested that more work be done on the risk policy and strategy document.  This was completed and would be submitted to the Committee on 21 February 2014 for approval.

The Department targeted to achieve 80% of its internal audit projects, and 70% were completed by the end of the quarter.

The Department partially achieved the mid-term performance assessment of staff. It had set a 100% target, and 96% of staff submitted on time. Three staff members submitted late due to absence or leave, and four staff members did not submit because of, for example, disputes over performance assessments or maternity leave.

The Department targeted to achieve 100% approval of its required policies and plans by the Director-General by October 2013, and 38% -- three out of eight policies -- were approved. Five policies and plans were in draft form and would be approved in the fourth quarter. It was targeted that 60% of activities in the annual Management Performance Assessment Tool (MPAT) improvement plan would be achieved. Most of the activities of the MPAT improvement plan were being completed and would be achieved by the end of the financial year.

For Programme 2: Outcomes Monitoring and Evaluation for the quarter, nine targets were achieved out of 16, three targets were partially achieved and four targets were not achieved. The Department prepared 51 briefing notes on Cabinet memoranda against the target of 30, and 13 briefing notes and reports on executive monitoring and evaluation initiatives were prepared, against the target of 10.  The National Evaluation Plan for 2014/15 was approved by Cabinet on 4 December 2013. The Department partially achieved completion of a guideline for evaluation, and this would be completed by the end of the financial year. The five municipal self-assessments were not achieved. The Department targeted to select three provincial programmes for evaluation; work had been done with Limpopo, the Eastern Cape and Mpumalanga, but was ongoing.

The Department did not achieve the target to submit a Cabinet memorandum for approval of the Municipal Assessment Tool by the end of October, as the pilot phase to redefine the tool had taken longer than expected, and the memorandum would be submitted after the elections. The draft report on the Municipal Assessment had also taken longer than expected. The Department had worked with the Department of Cooperative Governance and Traditional Affairs, National Treasury and others on the draft report, as these stakeholders had to approve the content of the tool before it was rolled out. The review report for the Twenty Year Review, finalised in conjunction with the President, would be printed and published in the fourth quarter.  The Department targeted to hold a quarterly data forum meeting and produce a quarterly report, but this was not achieved.

In Programme 3: Monitoring and Evaluation Systems, five targets were achieved out of six, and one target was partially achieved. The Department held forums with national departments and provinces on monitoring and evaluation, to collaborate with them and to promote good monitoring and evaluation practices. The DPME also held capacity development activities and learning events. The Department aimed to hold five sectoral workshops to align reporting requirements, but only one workshop was held.

In Programme 4: Public Service Oversight, 16 targets were achieved out of 19, and three targets were partially achieved.  

Targets Achieved:

·         All 155 national and provincial departments had their heads of department (HODs) sign off the MPAT self-assessments.

·         Two MPAT workshops were convened.  The target was over-achieved in the second quarter, as during the provincial and national launch, learning workshops were convened for the Eastern Cape, Free State, Limpopo, Mpumalanga, Northern Cape and Western Cape.

·         Two monitoring reports on indicators of the quality of service delivery were submitted to the Forum of South African Directors-General (FOSAD) against the target of one.

·         Three FOSAD deliverables targeted for improvements were achieved.

·         Front Service Delivery Monitoring (FSDM) implementation tools were updated and presented at the Provincial Monitoring and Evaluation (M&E) workshop held on 7 November 2013.

·         19 sites were monitored, reports completed and quality assured. The target was 45 for the quarter, but the targets had been exceeded in Q1 (37) and Q2  (67).

·         The draft annual report for monitoring visits between Jan 2013 and Dec 2013, was presented to the DG for comments by 31 December 2013.

·         Draft sector reports were done for the SA Social Security Agency (SASSA), the Departments of Cooperative Governance and Traditional Affairs, Justice, Home Affairs, Transport, Education, and Health, and the SA Police Service (SAPS).

·         Two case studies were produced and published on FSDM practices for the quarter.

·         Draft baseline case studies at pilot sites were conducted in SAPS, Departments of Health and Social Development (DOH and DSD) and SASSA facilities in Msinga and Maluti-a-Phofung. Desktop research was conducted and discussions held with facility management, staff and community groups.

·         Two learning publications were produced and placed on the website -- a paper for the SA Monitoring and Evaluation Association (SAMEA) Conference, and an article in the Public Service Manager.

·         Hotline case resolution performance reports were submitted to FOSAD  in November 2013.

·         A quarterly progress report on weaker performing Hotline departments and provinces was produced and signed off in November 2013.

·         Four Presidential Hotline case studies were produced for the quarter.

·         Service delivery complaints trends reports were produced by 30 November 2013.  Two research reports were completed, on education and on land and housing-related complaints.

·         Hotline Satisfaction Survey outcomes for Q3 were produced.

The Department conducts follow-up visits to sites where frontline service delivery monitoring has been done; 69 monitoring visits were targeted and 63 were achieved. It was targeted that a report should be submitted to the Governance and Administration cluster by 30 November 2013, but the submission was late. A quarterly progress report on the implementation of the improvement plan was supposed to be submitted by the end of December.  That deadline was missed, but the improvement plan was signed off on 8 January.

The Department was confident that it would achieve at least 85% of its targets by end of the financial year.

A comparison graph, showing the DPME’s expenditure for 2013/14 by month, from June to January, indicated that its performance was better than the previous year, and that there would not be a big spike in March expenditure this year, as was the case for the 2012/13 financial year.  For 2013/14, the Department was on track to come within the 2% guideline of the total budget from National Treasury -- between 98% and 102% of the budget by end of financial year.

Following changes to its structure and its recruitment plan, the Department had 225 approved posts at the end of the third quarter, of which 209 were funded. By December 2013, 195 posts had been filled. The vacancy rate of the Department was around 6% to 7%.  There would always be vacancies, as people left when they were promoted or appointed elsewhere. The DPME was on track with its recruitment processes.

Monthly expenditure monitoring meetings had improved expenditure planning and forecasting. Measures had been put in place to reallocate savings to priority spending areas -- for example, evaluations, municipal performance assessments and information and communication technology (ICT) equipment.

Discussion

Mr M Swart (DA) said it was well known that he was retiring soon, and this would be the last time he would see the Department. It had been a pleasure to work with the Department, as it had been open and honest and had done a good job.

The Chairperson said that Mr Swart was part of the team, so he could not speak for himself.

The Chairperson said that the report needed more detail.  The Committee needed to know exactly what was happening in the Department, its successes and challenges, and the reasons for the challenges. The report should be understandable to the Committee, as the Committee had to do oversight on the issues in the Department.

Dr Phillips said the report was somewhat of an accounting report.  In future, reports would offer a narrative on progress in relation to all its programmes.

The Chairperson noted the MPAT improvement plans and progress mentioned by the Director-General concerning the Auditor General’s report.  He also wanted a report on the departments that were not doing well on their assessment, to understand progress based on the improvement plans.

He asked what progress there had been with the presidential hotline, and what were the challenges?  There had been challenges in the Eastern Cape, and he asked what was done about the challenges and the response from the parties concerning the MPAT improvement plans.          

Dr Phillips said that there had been progress in the last quarter regarding the presidential hotline, and the latest resolution rate was 94%. The Department was also proceeding with all the improvement initiatives arising from the hotline satisfaction surveys, the engagements with sectors and the production of sector plans.  For example, the Department would produce a report for the Department of Education (DoE) based on all the hotline cases related to education, so that the DoE could look at the trends and what key issues were being raised.

The Chairperson asked how one ensured that people participated and saw the importance of participating in the MPAT process, so that it became a requirement for departments, whether provincial or national, to participate. Two workshops were supposed to be convened, and the Free State withdrew because it was voluntary. How does one make it a requirement?

Dr Phillips noted that the Department assessed the quality of administration and management in national departments, provinces and municipalities.  National departments and provinces had one tool, but municipalities had a separate tool, as there was a difference in the legal frameworks.  National departments and provinces were assessed by the MPAT tool.  It was not legally compulsory, but in the last cycle and in the current one, all 156 national and provincial departments had participated. He noted that Gauteng had participated in the national workshop, and for that reason a workshop was not convened in Gauteng.

The Chairperson said it would have been good to hear whether the Municipal Assessment Tool (MAT) had been adopted by the South African Local Government Association (SALGA) as a tool that needed to be used, so that all municipalities realised that they had to participate. These tools were meant to assist governance, so participation should be encouraged.

Dr Phillips said that the municipal assessments had started later than the national and provincial assessments, and the Department had been working with SALGA as one of the stakeholders to ensure that it agreed with the content of the MAT. The Department had good cooperation from the municipalities where the testing and piloting of the MAT was being done. Other municipalities had asked to be included in the piloting phase, still on a voluntary basis.

The Chairperson said that a common understanding needed to be developed on what the evaluation was for. It was important for Parliament and the Department that there should be one vision on evaluation.

Mr Obed Bapela, Deputy Minister, Department of Performance Monitoring and Evaluation, said that the issue of what was being evaluated should be discussed at another point.   The Department would evaluate programmes, activities, targets and designs.  These issues would be grouped and evaluated.  He said that the country wanted the Department to evaluate the performance of an individual, which would be the Minister, because of the performance agreement that was signed with the President. The DPME looked at the performance of departments to monitor and evaluate what they said and were committed to do, for each programme in their strategic plans.

Mr G Snell (ANC) said Members were happy with the report. Had the Department done an analysis to see how the monitoring and evaluation units in the Premier's Office were structured, to give effect to the work they were tasked to perform?

He noted that if the counterparts of the DPME that had been established at the Provincial level were not as dedicated as the national department, then the impact that was visible nationally would not be the same in the provinces. How were these sister departments assessed on their establishment and functioning?

Dr Phillips noted that between five and ten years ago, research was done on the monitoring and evaluation units in the provinces, and a report was produced. The Department has a regular forum with the monitoring and evaluation units in the Premier’s Office.   On the Department’s website, a guideline was provided on the roles that monitoring and evaluation units should play in the provinces, which had been developed with these units. The guidelines helped the units in the Provinces motivate for the capacity they needed. He noted that Mr Clement Madale, Director: Office Management and Content Support, had worked with the Department of Public Service and Administration (DPSA) to produce a guideline structure for monitoring and evaluation units in the Premier’s Office. It had been said that an assessment on the functionality, performance and work of the monitoring and evaluation units had not taken place.  This should take place.

The Chairperson said that the DPME was supposed to ensure that all Annual Performance Plans (APPs) were aligned to the National Development Plan (NDP) in order to deliver and implement government priorities.

Mr Bapela said that the Department, through Cabinet, was to prepare the APP instrument going forward. The plan was that the NDP would be integrated across government and then in all departments, starting with the medium-term strategic framework.  The next administration would look at it, and adopt or amend it.  The Department had put in the work so that the next administration would not have to go through long planning processes.

The Chairperson noted that the report highlighted that seven departments had submitted APPs and others had not. He asked for information on this issue.

Dr Phillips said that the seven departments noted in the report referred only to the current quarter.  More comments had been made on the APPs of other departments, and the number of reports for the whole year would be much higher.

Mr Bapela thanked the Committee for commending the Department on the work it had done. He assured the Committee that by the end of the financial year, the 85% target would have been achieved.  The Chairperson had looked at the entire year, rather than at the third quarter, which was what the Committee was being briefed on.

He noted that there had been voluntary participation by a number of the departments, but this did not mean that everything was excellent. The Department had been looking at its internal capacity to assist in ensuring that the mandate was achieved. The Department was also looking at the reliability of data, and conducted quality assurances to verify information. In government, data had been identified as a challenge, and he hoped the next administration would deal with that concern.

The Chairperson said he thought the Department would have commented on his point of monitoring and evaluating as a country. He had commented that maybe a common approach on what exactly was being evaluated should be discussed. He was interested in the reports of the presidential hotline regarding queries of the Department, and would have liked to have seen one or two cases.

Mr Bapela suggested to the Director-General that there may be a need to educate new Members of the Committee on the task of performance monitoring and evaluation -- what each component stood for, and what each component did.  It would be key to broadening the understanding of Members of these issues.    

The Chairperson said that the Committee had a good relationship with the Department.

The Committee adopted the minutes of its meetings of 29 January, 4 February and 5 February 2014.

The meeting was adjourned.