Department of Performance Monitoring and Evaluation 2013 Strategic & Annual Performance Plans

Standing Committee on Appropriations

23 April 2013
Chairperson: Mr E Sogoni (ANC)
Meeting Summary

The Department of Performance Monitoring and Evaluation (DPME) presented its strategic and annual performance plans for 2013/14. During the 2010/11 year, DPME was part of the Presidency vote, focusing on government-wide monitoring and evaluation frameworks, and had put in place performance and delivery agreements related to the twelve government priorities. In the 2011/12 financial year, DPME had its own budget, and its focus widened to include monitoring of the implementation of the delivery agreements and other initiatives. In April 2012, the DPME submitted a revised Strategic Plan, but there had been no further substantial changes to the mandate, so it was not necessary to update that further. An overview of the programmes was given, covering monitoring and evaluation of national priorities, management performance monitoring and evaluation, and frontline service delivery monitoring. The four programmes were summarised, and key priorities for each were described. In Programme 1: Administration, these included the service delivery improvement programme and government-wide M&E IT guidelines and support. Key priorities for Programme 2: Outcomes Monitoring and Evaluation included monitoring of the implementation of the delivery agreements for the twelve outcomes, evaluation of major programmes, review of government performance, local government performance monitoring tools and reporting, improved data quality for outcomes, and capacity building for evaluations. Programme 3: Monitoring and Evaluation Systems focused on capacity building across the whole of government, and on development indicators. Programme 4: Public Sector Oversight included the priorities of management performance assessment, frontline service delivery monitoring, citizen-based monitoring, and the Presidential Hotline performance improvement programme.

The MTEF budget allocation was provided, with a comparison to the previous year. There had been reductions in the baseline, following a Cabinet decision in 2012, but additional funding for salary increases. The budget allocations for compensation (with the number of funded posts), for goods and services, and for capital assets were all provided. DPME would not be able to fill as many posts as it had hoped, which would reduce the amount of work it could do. It was aiming to spend R12 million on independent evaluators, R5 million on surveys and training providers, R1.7 million on co-sourced internal audit expertise, and R15.7 million on computer services. It would need significant capital expenditure for its own ICT system, but other line items had been reduced.

Members asked a number of questions around the use of consultants, the filling of posts, the vacancy rate, and the internal audit capacity, asking whether the DPME should not rather be using its own audit staff. SITA's management of the Presidential Hotline, custodianship for monitoring and evaluation across government, and the specific work done with the Department of Education were questioned. Members sought further clarity on some ambitious targets and timeframes. They enquired what was being done about the monitoring of municipalities. They noted that it was important for the DPME to have the necessary capacity itself to meet its targets.

Meeting report

Department of Performance Monitoring and Evaluation: Strategic plan and Annual Performance Plan 2013/14
Dr Sean Phillips, Director General, Department of Performance Monitoring and Evaluation, set out the background for this department (DPME or the Department), noting that during 2010/11, Performance Monitoring and Evaluation (PME) was part of the Presidency vote, focusing on the government-wide monitoring and evaluation (M&E) framework, and putting in place performance and delivery agreements related to the twelve priority outcomes of Government. In 2011/12, the DPME had its own budget for the first time, and its focus widened to include monitoring of the implementation of the delivery agreements. It also undertook additional initiatives: management performance assessment of national and provincial departments; the National Evaluation Policy Framework (NEPF), national and provincial evaluation plans, and evaluations; the frontline service delivery monitoring programme; the Presidential Hotline; and citizen-based monitoring. In April 2012, the DPME submitted a revised Strategic Plan, which took into account the increased emphasis and focus on evaluation (after Cabinet approved the NEPF on 23 November 2011), with further improvements to the previous plan, based on comments from internal auditors and the Auditor-General, including refinements of the DPME outputs, indicators and targets. The revised plan further clarified the DPME's mandate, including the additional mandate of managing the Presidential Hotline. Under the guidance of the revised Strategic Plan, the DPME developed the 2013/14 Annual Performance Plan (APP). Since April 2012, there had been no further substantial changes to the DPME's mandate, and therefore the DPME did not submit a revised strategic plan in 2013.

Mr Phillips outlined that the various programmes included M&E of national priorities, management performance M&E, M&E of frontline service delivery, and custodianship for M&E across government. M&E of national priorities focused on the outcomes and impacts of government work in priority areas, plans for the twelve priority outcomes (delivery agreements), monitoring progress against the plans, and evaluation to inform improvements to programmes, policies and plans. Management performance M&E focused on moderated self-assessment and the quality of management practices in individual departments, and aimed to drive a process of continuous improvement of management practices, leading to improved government performance and service delivery. M&E of frontline service delivery focused on monitoring the experiences of citizens who were obtaining services, the Presidential Hotline, unannounced visits, requests for improvements, and citizen-based monitoring. Custodianship for M&E across government focused on developing the capacity of national and provincial departments and municipalities to carry out M&E themselves. Further, it aimed to develop a management culture of continuous improvement based on M&E, and to address problems with data quality, information management, and the national evaluation system.

Mr Phillips noted the four main programmes of the DPME. Programme 1: Administration provided strategic leadership and management, as well as administrative support, covering HR, financial management and IT services. Programme 2: Outcomes Monitoring and Evaluation (OME) covered co-ordination and management of the outcomes-oriented performance monitoring and evaluation system. Programme 3: M&E System Coordination and Support focused on the creation of policy platforms for the government-wide M&E system, on M&E capacity building, and on the provision of data support to the Department and its clients. Programme 4: Public Sector Oversight focused on performance monitoring of individual institutions, Frontline Service Delivery (FSD), and management of the Hotline.

Mr Phillips said that each programme had its own key priorities, outputs, targets and intended outcomes, which he described. The outputs of Programme 1 included service delivery improvement programmes and government-wide M&E IT guidelines and support. The target was to integrate areas of weakness identified through the Management Performance Assessment Tool (MPAT) and audit findings into the Service Delivery Improvement Plan (SDIP), through quarterly monitoring. It was aiming for a clean audit report and improved service delivery. The outputs of Programme 2 included monitoring of the implementation of the delivery agreements for the twelve government outcomes, evaluations of major programmes, local government performance monitoring, capacity building for evaluation, and review of government performance. The targets included monitoring the implementation of outcomes and producing quarterly reports and briefing notes for Cabinet, evaluating all major programmes identified in the evaluation plan, producing the 20-year review, and translating the National Development Plan (NDP) into the 2014-2019 Medium Term Strategic Framework (MTSF) and delivery agreements. Programme 2 aimed to achieve better programmes, which in turn achieved better impacts. The MTSF would be completed by April 2014, providing the first five-year building block of the NDP.

Programme 3 outputs included whole-of-government M&E capacity building and development indicators. The targets were to conduct a survey of various elements of M&E systems in national and provincial departments, and to develop two new or revised guidelines, with M&E forum meetings held on a quarterly basis to coordinate key stakeholders. The intention was to develop and publish development indicators by March 2014. The intended outcomes included improving M&E capacity and systems in national and provincial government departments, improving coordination of M&E practice, and informing the public about trends in key areas of development.

Programme 4 had outputs of management performance assessment, frontline service delivery monitoring (FSDM), citizen-based monitoring and the Presidential Hotline performance improvement programme. The targets were to conduct annual MPAT assessments and to have at least 60% of national departments' and 80% of provincial departments' MPAT scores signed off by their Heads of Department (HoDs) by the end of the third quarter. It was hoped that 50% of service delivery sites which had been visited twice would show an improvement in score for at least two of the eight assessment areas by the end of March 2014, and would develop and implement improvement plans in line with the timeframes. Citizen-based pilot studies should be done in three departments, with six-monthly progress reports. The intended outcomes were improved management practice, improved services to citizens, and an improved rate of resolution of citizens' concerns (see attached documents for more details).

Mr Phillips summarised the MTEF budget allocation for the years covering 2012 to 2015. There was additional funding for salary increases, but there were also reductions in the baseline, in line with the Cabinet decision in 2012. The DPME had requested an increased budget in order to increase its capacity to monitor municipalities. However, due to the rising deficit and the ongoing economic crisis, Cabinet had decided to decrease the budget by 1%, 2% and 3% respectively over the MTEF. Cabinet had also decided that there should be limits on increases in personnel budgets, which meant that the DPME's personnel budget was capped and could not be allocated above the baseline.

Mr Phillips provided a more detailed overview, by line item. DPME reported that in the 2012/13 financial year the Department had 197 funded posts, of which 173 were filled, leaving 24 (12%) vacant. The vacancy rate was attributed to normal staff turnover of 8% (14 people), the two to three months taken to complete compulsory pre-employment screening, delays in the Department of Public Service and Administration (DPSA) Directive on implementation of Public Service Bargaining Council Resolution 1 of 2012, relating to salary levels 9/10 and 11/12, and the fact that three posts created to capacitate the HoD Assessment Unit were delayed until approval of policy by the Minister of Public Service and Administration.

Mr Phillips said that all but two of the 208 funded posts for 2013/14 had been advertised, and were in various stages of being filled. The remaining two were only funded after December 2013, and would be advertised in July 2013. DPME had initially hoped, during the 2012 planning cycle, to increase to 236 posts, but had only been able to fund an increase from 197 posts in 2012/13 to 214. That would have an impact on the number of municipal assessments it could carry out and on the number of evaluations conducted or supported.

Mr Phillips then outlined the budget allocations for goods and services, from 2012 to 2015. The main expenditure related to professional services: R12 million for independent evaluators, whose skills varied depending on evaluation topics, and M&E capacity building, which included R5 million for conducting surveys and for training providers. R1.7 million was budgeted for co-sourced internal audit expertise, as it was not efficient for a small department to recruit on a full-time basis. Other expenditure items included computer services of R15.7 million, and travel and subsistence. The State Information Technology Agency (SITA) managed the Presidential Hotline (see attached document for more details). DPME required significant capital expenditure in 2012/13 and 2013/14 to establish its own ICT system, which explained the increased expenditure. However, it was set to reduce expenditure over the MTEF, in relation to maintenance of the ICT system, replacement of equipment, and improvements to other systems.

Discussion
Mr M Swart (DA) asked if it was not possible to use a system of peer review once the programmes were implemented.

Mr S Van Dyk (DA) noted that it was important for the DPME to know the ins and outs of all the departments.

Mr Van Dyk wanted some clarity on the overview of the DPME's background. He asked whether the DPME was satisfied with the way it exercised monitoring and evaluation, and what the end results were of the monitoring and evaluation done so far.

Mr G Snell (ANC) noted that, through legislation, the Committee already knew that things had been established and started.

Mr L Ramatlakane (COPE) noted that the economic crisis was well understood. He wanted more clarity on custodianship for M&E, on SITA's management of the Presidential Hotline, the local government pilot, and the filling of posts.

Mr Ramatlakane asked why there was a delay in the DPSA Directive on implementation of PSCBC Resolution 1 of 2012, and asked for a timeframe for the filling of the vacant posts.

Mr Ramatlakane asked how many internal auditors the DPME employed, and how much it spent on consultants.

The Chairperson questioned when the internal and external audits were planned, and why the DPME did not have its own capacity.

Mr J Gelderblom (ANC) also asked for more information on monitoring of local government and municipalities.

The Chairperson noted that it was important for the DPME to have the necessary capacity itself to meet its targets. There had to be a good balance between policy and implementation. He noted that the DPME had requested an increased budget.

The Chairperson asked exactly how, and to what extent, it was envisaged that M&E was going to help departments to improve, and whether there was any reference to the development of management standards.

The Chairperson thought that some of the targets might be too ambitious, and wanted an overview of the 2013/14 budget and MTEF estimates as indicated on page 11 of the Annual Performance Plan (APP).

Mr Ramatlakane wanted clarity on the estimated performance targets for 2013/14, as indicated in the APP.

The Chairperson asked if there was an alignment of the APP to the service delivery agreement.

Mr N Singh (IFP) asked how much monitoring and evaluation had been done in the education sector. He asked about the respective roles of the DPME and other departments.

Ms L Mashigo (ANC) wanted to know if the issue of libraries in schools across the country was the responsibility of DPME.

Ms Mashigo asked what influence M&E had on the review of the budget.

Mr Phillips replied that the work of the DPME had always been guided by the current legal framework, such as the Public Finance Management Act (PFMA) and the Public Service Act (PSA). However, the legal framework was a complex issue, and it would be changed if there was a need, with more focus on that framework. There were a number of systems in place that held government departments accountable, and the DPME had been doing more than producing information. The DPME's monitoring reports went to Cabinet. Assessments had been done on municipalities, top management, and individuals (self-assessment). Indicators for service delivery had been put in place and assessed. The results of the local government assessments would be made available publicly.

The DPME had been working with the DPSA and National Treasury, and workshops had been organised with other stakeholders. Through all its programmes, the DPME was aware of the dilemma of monitoring and evaluation, and it had emphasised that all good managers should continue doing their own monitoring and evaluation for their departments. In fact, the ultimate goal of the DPME was to help departments to conduct their own monitoring and evaluation. The DPME had been selective and strategic in its monitoring and evaluation, focusing on key areas of each government department. The key challenge had been the quality of service delivery on the ground.

In relation to targets, he noted that the DPME needed more capacity to meet its targets, which was why it had requested budget increases. In principle, the DPME tried to use its own internal staff, but believed that in many cases it was more cost-effective to use consultants. In regard to the vacancies, he noted that all currently-funded posts had been advertised, and others would be filled soon. More details were in the APP.

Ms Kaarjal Soorju, Director: Human Resource Management, DPME, replied that six posts had been filled, and two more were in the process of being filled.

Mr Phillips noted that SITA had managed the Presidential Hotline for some time, and the aim was to improve SITA's management of the hotline. He noted that the DPME evaluation system was fairly new, and it had initiated 15 evaluation programmes. According to the National Evaluation Plan, every department was allowed to evaluate itself, and was also allowed to use external evaluators. All departments therefore had an M&E unit, but the problem lay with the top management of each department. Much of the management assessment was done in collaboration with government departments.

The DPME had engaged with the Auditor-General to try to obtain a clean audit report. The DPME had one full-time internal auditor. It had done cost comparisons on whether it was more effective to spend on consultants in this unit, rather than employing two or three full-time internal auditors.

Mr Phillips explained that the President had been doing monitoring and evaluation of his Ministers.

Mr Phillips noted that it was not the responsibility of the DPME to do monitoring and evaluation on schools, as this responsibility vested in the Department of Basic Education. However, the DPME had visited some schools to check whether teachers came to school on time.

In concluding, the Chairperson noted that some of the information was included in the APP, and the Committee would engage with the DPME further.

The meeting was adjourned.