ATC131022: Budgetary Review and Recommendation Report of the Standing Committee on Appropriations on the Department of Performance Monitoring and Evaluation in the Presidency, 22 October 2013

Standing Committee on Appropriations

Budgetary Review and Recommendation Report of the Standing Committee on Appropriations on the Department of Performance Monitoring and Evaluation in the Presidency, dated 22 October 2013

The Standing Committee on Appropriations, having assessed the service delivery performance of the Department of Performance Monitoring and Evaluation (the Department), reports as follows:

1. INTRODUCTION

1.1. The Role of the Committee

In terms of section 4(4) of the Money Bills Amendment Procedure and Related Matters Act, No. 9 of 2009, the mandate of the Standing Committee on Appropriations (the Committee) is to “consider and report on the following:

·         spending issues;

·         amendments to the Division of Revenue Bill, the Appropriation Bill, Supplementary Appropriation Bills and Adjustment Appropriation Bill;

·         recommendations of the Financial and Fiscal Commission, including those referred to in the Intergovernmental Fiscal Relations Act, 1997 (Act No. 97 of 1997);

·         reports on actual expenditure published by the National Treasury; and

·         any other related matter set out in the [above-mentioned] Act”.

In addition to the above, the Committee has, in terms of National Assembly Rule 199(b), been tasked with overseeing the activities of the Department of Performance Monitoring and Evaluation, as well as the National Youth Development Agency, as from 1 November 2011.

1.2. The Mandate of the Department

The Department was established in January 2010 in line with section 85(2)(c) of the Constitution. From 1 April 2011 the Department was assigned Vote 6 after being separated from Vote 1: The Presidency. The mandate of the Department of Performance Monitoring and Evaluation (the Department) is derived from section 85(2)(c) of the Constitution of the Republic of South Africa (the Constitution), which states that the President exercises executive authority, together with the other members of the Cabinet, by coordinating the functions of state departments and administrations.

The Department has the following mandate:

·         facilitate the development of plans or delivery agreements for the cross-cutting priorities or outcomes of government, and monitor and evaluate the implementation of these plans;

·         monitor the performance of individual national and provincial government departments and municipalities;

·         monitor frontline service delivery;

·         manage the Presidential Hotline;

·         carry out evaluations of major and strategic government programmes;

·         promote good monitoring and evaluation (M&E) practices in government; and

·         provide support to delivery institutions to address backlogs in delivery.

1.3. Purpose of the Budgetary Review and Recommendation Report (BRRR)

Section 5 of the Money Bills Amendment Procedure and Related Matters Act, No. 9 of 2009 (the Act), states that the National Assembly, through its Committees, must annually assess the performance of each national department with reference to the following:

·         The medium term estimates of expenditure of each national department, its strategic priorities and measurable objectives, as tabled in the National Assembly with the national budget;

·         Prevailing strategic plans;

·         The expenditure report relating to such department published by the National Treasury in terms of section 32 of the Public Finance Management Act, No. 1 of 1999 (PFMA), as amended;

·         The financial statements and annual report of such department;

·         The report of the Committee on Public Accounts relating to the department; and

·         Any other information requested by or presented to a House or Parliament.

Committees must submit the Budgetary Review and Recommendation Report (BRRR) annually to the National Assembly. The BRRR assesses the effectiveness and efficiency of a department’s use and forward allocation of available resources, and may include recommendations on the use of resources in the medium term.

Committees must submit the BRRR after the adoption of the budget and before the adoption of the reports on the Medium Term Budget Policy Statement (MTBPS) by the respective Houses in November of each year.

1.4. Methodology

The Committee has scrutinised and interrogated all available documents as outlined in Section 5 of the Act. The Committee has assessed the performance of the Department in the 2012/13 financial year, as well as performance in the first quarter and second quarter of the 2013/14 financial year. Key issues emanating from various engagements between the Committee and the Department were also considered.

2. OVERVIEW OF THE POLICY FOCUS AREAS

Monitoring and evaluation of key priority outcomes is aimed at increasing the strategic focus of government and implementing the constitutional imperative for cooperative governance. This is done through the development of Ministerial Performance Agreements and interdepartmental and intergovernmental Delivery Agreements, as well as regular monitoring of the implementation of the Delivery Agreements.

Cabinet approved the National Evaluation Policy Framework in November 2011. The focus of the Framework is to improve the performance of government, institutional accountability and decision-making. In its 2012/13 Annual Performance Plan (APP), the Department of Performance Monitoring and Evaluation (the Department) indicated that for the 2012/13 financial year and over the Medium Term Expenditure Framework (MTEF) period there would be an increased focus on evaluations. A National Evaluation Plan was developed and approved by Cabinet in December 2012.

The four strategic goals outlined in the Department’s 2011-2015 Strategic Plan remain the same, i.e. there has been no policy shift in the 2012/13 APP and 2013/14 APP. The four strategic goals of the Department as contained in its Strategic Plan 2011-2015 are as follows:

·         An efficient and effective department that complies with legislation, policy and good corporate governance principles;

·         To advance the strategic agenda of government through the development and implementation of the delivery agreements for the outcomes, monitoring and reporting on progress and evaluating impact;

·         To promote Monitoring and Evaluation practice through a coordinated policy platform, quality capacity building and credible data systems; and

·         To monitor the quality of management practices in departments and the quality of front line service delivery.

New priority programmes in the areas of local government and service delivery assessments were identified for roll-out in 2013/14 and will be carried out through the Municipal Assessment Tool programme and the Citizen-Based Monitoring programme. For the 2013/14 financial year the Department carried out organisational streamlining, reallocation and integration in order to improve the provision of its services. In particular, some functions of the Monitoring and Evaluation Data Support sub-programme in Programme 3 were shifted to Programme 2: Outcomes Monitoring and Evaluation, in order to improve accountability and the integration of functions supporting the attainment of outcomes, while the Geographic Information System function was moved to Programme 1: Administration. In its 2013/14 First Quarter Performance Report, the Department indicated that Programme 3 will be abolished with effect from 1 April 2014.

The priorities of the Department are in line with the objectives of the National Development Plan (NDP), which advocates for a Government that is accountable and transparent. The Department has been working with the National Planning Commission (NPC) on developing an implementation plan for the NDP. The Department indicated that the main focus for the 2013/14 financial year will be on integrating the NDP into the work of Government. The Department has also started drafting the Medium Term Strategic Framework (MTSF) for the new electoral cycle.

The 12 priority outcomes of Government are as follows:

·         Improved quality of education;

·         A long and healthy life for all South Africans;

·         All people in South Africa are and feel safe;

·         Decent employment through inclusive economic growth;

·         A skilled and capable workforce to support an inclusive growth path;

·         An efficient, competitive and responsive economic infrastructure network;

·         Vibrant, equitable and sustainable rural communities with food security for all;

·         Sustainable human settlement and improved quality of household life;

·         A responsive, accountable, effective and efficient local government system;

·         Environmental assets and natural resources that are well protected and continually enhanced;

·         Create a better South Africa and contribute to a better and safer Africa and World; and

·         An efficient, effective and development-oriented public service and an empowered, fair and inclusive citizenship.

3. SUMMARY OF PREVIOUS KEY COMMITTEE RECOMMENDATIONS

3.1. 2012 BRRR Recommendations

The Committee, having assessed and reviewed the performance of the Department during the 2012 Budgetary Review and Recommendation process, made the following recommendations:

(a)     The Department engages all relevant stakeholders and investigates best practices in respect of legislation that focuses on results-based outcomes, namely a Results Bill, in order to deliver its services effectively and efficiently.

(b)     The Department fills all funded vacant positions and secures funding for any other positions that remain unfunded.

(c)     The Department expedites the process of establishing its own internal Information Technology infrastructure capacity.

(d)     The Department expedites the process of ensuring the availability of credible and reliable performance data and information across government and the publishing thereof on their website with an emphasis on municipalities especially those that lack even the basic data systems.

(e)     The Department ensures that it fully utilises the resources allocated to it in order to ensure that its financial performance is consistent with its quarterly expenditure projections.

3.2. 2013/14 Budget Vote Recommendations

The Committee, having considered the Annual Performance Plan and the Budget of the Department of Performance Monitoring and Evaluation for the 2013/14 financial year, made the following recommendations:

(a)     That the Department provide measurable and achievable performance targets.

(b)     That the Department consider utilising shared services for its internal audit function.

(c)     That local-level structures such as ward committees, community development workers, community police forums and other community-based structures be integrated within the government-wide Monitoring and Evaluation Policy Framework.

3.3. Oversight Visit to Presidential Hotline May 2013

The Committee, having undertaken a site inspection of the Presidential Hotline and engaged with the Department, made the following recommendations:

(a)     That the Department ensures that all capacity challenges (i.e. vacancies and system limitations) related to the Presidential Hotline’s Service Delivery Level Agreement with the State Information Technology Agency (SITA) are addressed so as to improve resolution outcomes.

(b)     That the Department improves the categorisation and structure of Presidential Hotline cases captured in order to enhance the usefulness of the data.

4. OVERVIEW AND ASSESSMENT OF FINANCIAL PERFORMANCE

Since its inception as a budget vote (Vote 6) in April 2011, the Department has been implementing its mandate through four programmes. The Presidential Hotline was transferred to the Department from the Presidency with effect from 1 October 2011, in line with the Department’s overall mandate of government-wide performance monitoring and evaluation. This section provides an overview of spending trends for the period 2009/10 to 2015/16, an assessment of financial performance for 2012/13, and financial performance for the first and second quarters of 2013/14.

4.1. Overview of Vote Allocations and Spending Trends (2009/10 to 2015/16)

Table 1: Overview of Expenditure Performance (R million)

| Programmes                            | Audited 2009/10 | Audited 2010/11 | Audited 2011/12 | Main 2012/13 | Adjusted 2012/13 | Audited Actual 2012/13 | MTEF 2013/14 | MTEF 2014/15 | MTEF 2015/16 |
| Administration                        | -    | -    | 33.1 | 59.8  | 59.8  | 53.8  | 56.9  | 55.9  | 55.5  |
| Outcomes Monitoring and Evaluation    | -    | -    | 22.6 | 37.5  | 48.5  | 48.6  | 61.2  | 69.9  | 77.3  |
| M&E Systems Coordination and Support  | 10.4 | 40.5 | 9.4  | 18.9  | 13.3  | 12.6  | 17.3  | 17.5  | 17.9  |
| Public Sector Oversight               | 3.0  | 6.8  | 30.5 | 57.8  | 52.4  | 45.1  | 57.3  | 58.8  | 60.0  |
| Total                                 | 13.4 | 47.3 | 95.6 | 174.2 | 174.1 | 160.2 | 192.7 | 202.0 | 210.7 |
| Current Payments                      | 11.8 | 40.2 | 89.8 | 160.6 | 158.6 | 149.9 | 183.3 | 195.3 | 205.3 |
| Compensation of Employees             | 6.2  | 26.1 | 54.4 | 93.1  | 89.0  | 83.0  | 108.5 | 114.9 | 118.1 |
| Goods and Services                    | 5.6  | 14.1 | 35.3 | 67.5  | 69.6  | 66.9  | 75.4  | 80.3  | 87.2  |
| Payments for Capital Assets           | 1.6  | 7.1  | 5.8  | 13.5  | 15.5  | 10.0  | 8.9   | 6.7   | 5.4   |
| Buildings and other fixed structures  | -    | -    | 0.1  | -     | -     | -     | -     | -     | -     |
| Machinery and Equipment               | 1.6  | 7.1  | 4.7  | 11.0  | 10.3  | 8.2   | 6.1   | 4.9   | 3.9   |
| Software and other intangible assets  | -    | -    | 1.1  | 2.5   | 5.2   | 1.8   | 2.9   | 1.9   | 1.6   |
| Total                                 | 13.4 | 47.3 | 95.6 | 174.2 | 174.1 | 160.2 | 192.7 | 202.0 | 210.7 |

Source: DPME and National Treasury

Expenditure increased from R13.4 million in 2009/10 to R95.6 million in 2011/12. Until 31 March 2011, the Department operated as a chief directorate within the Policy Coordination and Advisory Services unit in the Presidency. The increased allocation of R174.1 million in 2012/13 was intended to enable the Department to deliver the full scope of outputs contained in its mandate and strategic objectives. The budget rises from the audited final expenditure of R160.2 million in 2012/13 to an allocation of R192.7 million in 2013/14, and increases moderately over the medium term to reach R210.7 million in 2015/16. The increase in allocations from 2012/13 is largely due to the carrying out of evaluation projects as the Department embeds the outcomes approach, and to increases in goods and services to support institutional performance assessments and the Frontline Service Delivery programme.

4.2. Financial Performance 2012/13

In the 2012/13 financial year the Department was allocated an adjusted budget of R174.1 million and spent R160.2 million, or 92.2 per cent of its allocated budget. Under-expenditure increased from R3.3 million (3.4 per cent) in 2011/12 to R13.9 million (7.8 per cent) in 2012/13.

Table 2: Appropriations Statement for 2012/13 (R’000)

| Programme                             | Final Appropriation 2012/13 | Actual Expenditure 2012/13 | Virements | Variance | Spending as % of Final Appropriation | Final Appropriation 2011/12 | Actual Expenditure 2011/12 |
| Administration                        | 59 840  | 53 821  | -       | 6 019  | 90 | 35 021 | 33 087 |
| Outcomes Monitoring and Evaluation    | 52 589  | 48 643  | 4 050   | 3 946  | 92 | 29 511 | 28 927 |
| M&E Systems Coordination and Support  | 13 378  | 12 607  | 8       | 771    | 94 | 3 091  | 3 034  |
| Public Sector Oversight               | 48 352  | 45 165  | (4 058) | 3 187  | 93 | 28 597 | 27 793 |
| Total                                 | 174 159 | 160 236 | -       | 13 923 | 92 | 96 220 | 92 841 |
| Compensation of Employees             | 85 827  | 83 009  | -       | 2 818  | 97 | 52 983 | 51 673 |
| Goods and Services                    | 72 528  | 66 902  | -       | 5 626  | 92 | 35 598 | 35 325 |
| Payments for Capital Assets           | 15 585  | 10 106  | -       | 5 479  | 65 | 7 591  | 5 813  |
| Total                                 | 174 159 | 160 236 | -       | 13 923 | 92 | 96 220 | 92 841 |

Source: DPME

Programme 1 was allocated a final appropriation of R59.8 million and spent R53.8 million, resulting in under-expenditure of R6.0 million. This was due to delays in finalising the installation of the Department’s Information and Communications Technology infrastructure.

Programme 2 was allocated a final appropriation of R52.6 million but spent only R48.6 million, resulting in under-expenditure of R3.9 million. This was because the 20-year review project and the research project on concurrent norms and standards were not finalised as planned by the end of March 2013.

Programme 3 was allocated a final appropriation of R13.3 million and spent R12.6 million, resulting in under-expenditure of R0.8 million. Programme 4 was allocated a final appropriation of R48.3 million and spent R45.1 million, a saving of R3.2 million. The saving was a result of lower than expected running costs for the State Information Technology Agency (SITA) call centre and the toll-free number costs of the Presidential Hotline.

For the 2012/13 financial year, expenditure on compensation of employees amounted to R83.0 million against a final appropriation of R85.8 million, an under-spend of R2.8 million. The reasons for under-spending on personnel included: challenges in finding suitable candidates for some specialist positions; delays in implementing the Bargaining Council Resolution regarding salary levels 9/10 and 11/12, which had been budgeted for; delays in obtaining pre-employment security screening results, which delayed some appointments; and delays in implementing the Heads of Department (HoD) assessment programme, as the Department was still awaiting the issuing of the new Performance Management and Development System (PMDS) policy for HoDs by the Department of Public Service and Administration.

Goods and services spent R66.9 million out of a budgeted allocation of R72.5 million, an under-spend of R5.6 million. The under-expenditure was due to lower than expected SITA costs for additional call centre operators and services, and lower actual toll-free number costs. Furthermore, the Department had committed R5.1 million for the 20-year review and the norms and standards project, which was not spent as the request for the project was only signed off after the beginning of the 2012/13 financial year.

With regard to payments for capital assets, machinery and equipment spent R8.2 million from an allocated budget of R11.5 million, under-spending by R3.3 million, whilst software and other intangible assets spent only R1.8 million out of R4.0 million, under-spending by R2.1 million. The under-expenditure resulted from the delayed agreement with SITA on the Information and Communications Technology (ICT) infrastructure plan. The migration to the ICT system was completed by 31 March 2013, and the Department indicated that the migration to its own telephone system was to be completed by the end of June 2013. In addition, under-spending on ICT projects was due to delays in the delivery of equipment and in the finalisation of the Programme of Action, Management Performance Assessment Tool (MPAT) and Frontline Service Delivery (FSD) systems.

With regard to virements and shifts for 2012/13, a virement of R4.05 million was transferred from current payments in Programme 4 to current payments in Programme 2. Funds realised in Programme 4 under the Presidential Hotline (SITA), owing to lower than expected call costs, were moved to goods and services under Programme 2 to fund the 20-year review and the research on norms and standards for concurrent functions. In addition, funds shifted from compensation of employees were moved to the ICT unit for additional software licences.

The Department’s 2012/13 Annual Report shows that irregular expenditure of R455 000 was incurred under the finance lease expenditure item and was automatically condoned through Treasury Practice Note 5 of 2006/07. Fruitless and wasteful expenditure amounting to R65 000 was incurred on travel-related cancellations and no-shows. Of this, R37 000 was condoned as it resulted from unforeseeable and unavoidable circumstances, and R4 000 was recovered from officials. The remainder was still under investigation at the time of the report but has since been condoned or recovered.

In terms of in-year expenditure performance for 2012/13, the Department had spent 13.4 per cent of its budget by the end of the first quarter, 33.2 per cent by the end of the second quarter, 55.2 per cent by the end of the third quarter and 92.2 per cent at the end of the financial year. For the first three quarters the Department was behind projected spending by an average variance of 39 per cent, indicating that quarterly spending performance deviated significantly from budget projections and that a large portion of the budget was only spent in the last quarter of the year.

The Department indicated that the increase in expenditure towards the end of the 2012/13 financial year was largely due to delayed expenditure on IT equipment, evaluations and research projects. Engagements with SITA on the planning of the Department’s IT infrastructure were only finalised in June 2012, and the procurement of IT equipment could only commence thereafter. The National Evaluation Policy Framework was approved in November 2011, but the 2012/13 National Evaluation Plan was only approved in June 2012; evaluations therefore largely took place towards the end of the financial year. In addition, the Department was requested to carry out the 20-year review and the norms and standards research project after the beginning of the 2012/13 financial year, and expenditure on these two projects likewise largely took place towards the end of the year.

4.3. Financial Performance for First and Second Quarter - 2013/14

For the 2013/14 financial year the Department received a budget allocation of R192.7 million. The main expenditure items for 2013/14 include the roll-out of evaluations, monitoring and evaluation capacity building, bolstering the Internal Audit unit, computer services, and travel and subsistence.

Table 3.1: 2013/14 First Quarter Expenditure (R’000)

| Programme                             | Main Budget | Planned Expenditure | Actual Expenditure | Variance |
| Administration                        | 56 915  | 14 009 | 10 846 | 3 163  |
| Outcomes Monitoring and Evaluation    | 61 231  | 15 668 | 11 707 | 3 961  |
| M&E Systems Coordination and Support  | 17 251  | 4 412  | 2 558  | 1 854  |
| Public Sector Oversight               | 57 348  | 14 763 | 11 206 | 3 557  |
| Total                                 | 192 745 | 48 852 | 36 317 | 12 535 |

| Economic Classification               | Main Budget | Planned Expenditure | Actual Expenditure | Variance |
| Current Payments                      | 183 835 | 46 280 | 35 004 | 11 276 |
| Compensation of Employees             | 108 472 | 26 267 | 23 614 | 2 653  |
| Goods and Services                    | 75 363  | 20 013 | 11 390 | 8 623  |
| Transfers                             | -       | -      | 23     | (23)   |
| Payments for Capital Assets           | 8 910   | 2 572  | 1 290  | 1 282  |
| Total                                 | 192 745 | 48 852 | 36 317 | 12 535 |

Source: National Treasury

The Department spent R36.3 million, or 18.8 per cent of its budget, in the first quarter of the 2013/14 financial year against projected spending of R48.8 million. This resulted in lagged expenditure, or a variance, of R12.5 million or 26 per cent, a significant deviation between budget projections and actual expenditure. All programmes under-spent relative to projections.

The Administration programme spent R10.8 million or 19 per cent of its funded allocation of R56.9 million. The main cost drivers within Programme 1: Administration are compensation of employees, payments to SITA for computer services, and expenditure on capital assets. The under-expenditure can be attributed to outstanding invoices and delays in the delivery of the ICT equipment procured under this programme.

Programme 2: Outcomes Monitoring and Evaluation spent R11.7 million or 19.1 per cent in the first quarter of 2013/14. The main cost drivers are compensation of employees and evaluations. The under-expenditure in this programme was due to unfilled funded vacancies, though the Department indicated that all vacant posts had been advertised.

Programme 3: M&E Systems Coordination and Support spent R2.5 million or 14.5 per cent in the first quarter of the financial year. A large portion of expenditure in this programme was on compensation of employees and on goods and services (business and advisory consultancy services, and travel and subsistence). Furthermore, the evaluations system project did not commence as scheduled, and the organisational restructuring process delayed expenditure on goods and services.

Programme 4: Public Sector Oversight spent R11.2 million or 19.5 per cent in the first quarter of 2013/14. The main expenditure driver within the programme is the SITA costs for the Presidential Hotline. The under-expenditure in this programme was due to outstanding invoices for the SITA Hotline services.

In terms of economic classification, expenditure on current payments was R35.0 million or 19 per cent as at the end of June 2013; R23.6 million or 21.8 per cent had been spent on compensation of employees; R11.4 million or 15.1 per cent on goods and services; and R1.3 million or 14.5 per cent on payments for capital assets.

In its quarterly performance report submission, the Department reported that the request for the roll-over of R8.4 million which was mainly for the 20 Year Review and Norms and Standards project was not approved in the 2013/14 budget roll-over process. The Department stated that it will reallocate funds within the current budget to fund the projects.

Table 3.2: 2013/14 Second Quarter Expenditure (R’000)

| Programme                             | Main Budget | Planned Expenditure | Actual Expenditure | Variance |
| Administration                        | 56 915  | 28 981  | 26 371 | 2 610  |
| Outcomes Monitoring and Evaluation    | 61 231  | 32 119  | 27 360 | 4 759  |
| M&E Systems Coordination and Support  | 17 251  | 9 807   | 4 338  | 5 469  |
| Public Sector Oversight               | 57 348  | 29 265  | 26 673 | 2 592  |
| Total                                 | 192 745 | 100 172 | 84 742 | 15 430 |

| Economic Classification               | Main Budget | Planned Expenditure | Actual Expenditure | Variance |
| Current Payments                      | 183 835 | 95 660  | 81 018 | 14 642 |
| Compensation of Employees             | 108 472 | 52 612  | 51 052 | 1 560  |
| Goods and Services                    | 75 363  | 43 048  | 29 966 | 13 082 |
| Transfers                             | -       | -       | 66     | (66)   |
| Payments for Capital Assets           | 8 910   | 4 512   | 3 658  | 854    |
| Total                                 | 192 745 | 100 172 | 84 742 | 15 430 |

Source: National Treasury

For the period until September 2013, the Department had spent R84.7 million or 44 per cent of its 2013/14 main appropriation of R192.7 million. This represents a variance of R15.4 million against the first six months’ budget projection of R100.2 million.

A contributing factor to the variance between actual expenditure and projections is the organisational restructuring process currently underway within the Department, which will result in the abolition of Programme 3: M&E Systems Coordination and Support. The functions of the programme are to be integrated into other programmes within the Vote. Additional contributing factors were delays in the relocation of the Department to new offices, as well as vacancies resulting from resignations.

The Department reported that it continues to experience challenges in attracting suitably qualified and experienced candidates, especially for specialist positions, which often results in prolonged recruitment processes. In other instances, vacancies arise when advertised positions are filled by successful internal candidates, which creates capacity voids in the successful candidate’s previous area of responsibility. In addition, the Department experiences delays in obtaining pre-employment security screening results for new appointments.

4.4. 2013 MTEF Budget Allocations

The budget for Programme 1 is set to decrease over the 2014 MTEF from R56.9 million in 2013/14 to R55.9 million in 2014/15 and R55.5 million in 2015/16. The budget for Programme 2 is projected to increase over the medium term from R61.2 million in 2013/14 to R69.9 million in 2014/15 and R77.3 million in 2015/16. The budget for Programme 3 is estimated to increase marginally from R17.3 million in 2013/14 to R17.5 million in 2014/15 and R17.9 million in 2015/16. The funding allocation for Programme 4 shows a minimal increase over the medium term, from R57.3 million in 2013/14 to R60.0 million in 2015/16.

Table 4: Budget Share per Policy Programme

| Policy Priority Programme                          | 2013/14 | 2014/15 | 2015/16 |
| Administration                                     | 57.9%   | 61.3%   | 60.6%   |
| Outcomes support                                   | 40.9%   | 41.3%   | 43.6%   |
| Local Government                                   | 5.1%    | 10.4%   | 11.6%   |
| Evaluation and Research                            | 25.6%   | 25.3%   | 25.9%   |
| Management Performance Assessment Tool             | 16.2%   | 16.9%   | 18.7%   |
| Frontline Service Delivery Monitoring and Hotline  | 37.3%   | 38.8%   | 42.0%   |
| M&E Policy / Capacity Building                     | 8.8%    | 8.2%    | 8.2%    |

Source: DPME

The largest share of the Department’s budget allocation goes towards administration, at 57.9 per cent in 2013/14, rising to 60.6 per cent in 2015/16. The Department’s budget places priority emphasis on Outcomes Support, Frontline Service Delivery Monitoring, the Presidential Hotline, and Evaluations and Research. The budget for the Local Government (Municipal Assessment Tool) programme increases significantly over the medium term.

The Department acknowledged that funding new or expanding priorities within a budget allocation that increases only moderately over the medium term will require significant internal reprioritisation of funding. In its submission the Department indicated that priority will be given to evaluations and government-wide management performance assessments. The Department has secured donor funding to support a number of programmes, including evaluations, the piloting of the Municipal Assessment Tool, and M&E training and guidelines. The Department followed a zero-based budgeting approach in preparing funding allocations for the 2014 MTEF.

5. OVERVIEW AND ASSESSMENT OF SERVICE DELIVERY PERFORMANCE

In 2010/11, the Department facilitated the conclusion of 12 delivery agreements based on the performance agreements signed between the President and Cabinet Ministers in April 2010. A new Programme of Action was developed and implemented to support ongoing reporting on, and monitoring of, progress against the output data and targets contained in the 12 delivery agreements. This section provides an assessment of service delivery performance in 2012/13 and for the first quarter of 2013/14, and highlights some of the key projects completed by the Department.

5.1. Service Delivery Performance for 2012/13

In its 2012/13 Annual Report, the Department reported that the outcomes system has now been institutionalised with quarterly reporting to Cabinet on progress with implementing the delivery agreements. Many departments have adopted the new approach of focusing on measurable results and impacts. However, there remain challenges in areas such as planning, where departments’ programmes are not yet sufficiently rigorous in terms of measuring baseline data and clearly explaining how a programme will achieve its intended objectives. There is not yet enough measurement of outcomes and impacts, and some departments do not yet have the necessary information management systems in place to do this. The overall performance against targets for the Department is shown in the table below:

Table 5: 2012/13 Performance against Targets

| Programme                             | Annual Targets | Achieved | Partially Achieved | Not Achieved |
| Administration                        | 33   | 27  | 4   | 2  |
| Outcomes Monitoring and Evaluation    | 14   | 13  | 1   | 0  |
| M&E Systems Coordination and Support  | 14   | 11  | 3   | 0  |
| Public Sector Oversight               | 18   | 17  | 1   | 0  |
| Total number of targets               | 79   | 68  | 9   | 2  |
| Percentage                            | 100% | 86% | 11% | 3% |

Source: DPME

The Department achieved 68 of its 79 planned targets for 2012/13, or 86 per cent. This is an improvement on its 2011/12 performance, when only 63 per cent of planned targets were achieved. According to the 2012/13 Annual Report, the Department’s Service Delivery Improvement Plan for 2012/13 was successfully implemented. The service standards contained in the Plan include the provision of Management Performance Assessment Tool reports within 20 days of the finalisation of the assessment, and the preparation of quarterly reports on the implementation of outcomes within 30 days of the end of the quarter.

5.1.1. Programme Performance

Programme 1: Administration

Programme 1: Administration provides for strategic management and the administrative support to the Director-General and the Department. The programme objective is to ensure that the Department has effective strategic leadership, administration and management, and to ensure that it complies with all relevant legislative prescripts.

In terms of performance against planned targets, Programme 1 had 33 planned targets, of which 27 were achieved, 4 were partially achieved and 2 were not achieved.

Key highlights from targets achieved include the approval of the risk register, the approval of a three-year rolling Internal Audit plan by the Audit Committee, the approval of all remaining internal policies as required by regulatory frameworks and collective agreements, and the development of the Frontline Service Delivery monitoring information management tool. The Department reported that the Head of Internal Audit has been appointed and that the unit will coordinate implementation of the three-year rolling audit plan.

There were 4 targets that were partially achieved. The risk assessment process was not finalised for 2 programmes because the risk management committee did not meet as regularly as planned; the Department intends to address this by aligning the timing of risk management committee meetings with that of other management meetings. In the 2012/13 APP, the Department had targeted a score of at least 3 on every Management Performance Assessment Tool (MPAT) key performance area, but it scored less than 3 in 6 of the 30 standards and indicated that an action plan is in place to improve its MPAT scores. The Department had targeted a fully functional Help Desk Service Application in the third quarter of 2012/13, but this was only partially achieved: while the system process had been mapped, it was decided to incorporate it into a broader project to develop a system for corporate services. The Departmental Projects Dashboard application was to have been completed by the end of the third quarter of 2012/13; while intranet sites have been deployed and are operational, one component is still incomplete.

There were two targets that were not achieved. The first relates to the Security Plan, which was to have been drafted and approved in the second quarter of 2012/13. The Security Plan forms part of the Information Communication Technology (ICT) governance infrastructure that should meet the DPSA's ICT governance requirements. The Plan is currently only in draft form, still to be approved, and will be completed in 2013/14 as part of the ICT governance implementation plan. The second relates to the Development Indicators application, which was to be developed and used by the Data Systems unit, the unit responsible for the Development Indicators publication. This target forms part of the effective business applications output and has been deferred to the 2013/14 financial year.

Programme 2: Outcomes Monitoring and Evaluation

Programme 2: Outcome Monitoring and Evaluation provides for the coordination of Government’s strategic agenda through the development of performance agreements between the President and Ministers, facilitation of the development of plans or delivery agreements for priority outcomes, and monitoring and evaluation of the implementation of the delivery agreements.

With regard to performance against planned targets, Programme 2 had 14 planned targets, of which 13 were achieved and 1 was partially achieved. This represents an achievement rate of 93 per cent, an improvement on the 60 per cent attained in 2011/12.

Targets achieved include the completion of the annual review of Delivery Agreements, with changes effected where necessary; the successful submission of quarterly reports per outcome to Cabinet; completed briefing notes or reports on the executive's monitoring and evaluation initiatives; and the production and publication of 12 guidelines supporting evaluations. The Department also completed an improvement plan for the Early Childhood Development (ECD) evaluation. For the 20 Year Review project, the Department has put governance structures in place, and 22 research papers had been received by the end of the financial year.

The Department had planned to have 10 evaluation reports approved by the evaluation steering committees; however, only one evaluation report (ECD) was approved. Subsequently, two evaluations were approved by the steering committee in May 2013 and six more were envisaged for approval in 2013/14. One evaluation, on the National School Nutrition Programme, has been withdrawn. The Department indicated that the evaluation process has proven more complex than anticipated and that working with departments and obtaining agreement on the various stages of the evaluations has taken longer than expected. The Committee noted with concern that the 2012/13 APP had targeted 20 evaluation reports for each year of the MTEF, though this was later revised in the 2013/14 APP to 15 evaluations per year.

Programme 3: Monitoring and Evaluation (M&E) Systems Co-ordination and Support

Programme 3: M&E Systems Co-ordination and Support is responsible for coordinating and supporting an integrated government-wide performance monitoring and evaluation system through policy development and capacity building. In addition, the purpose is to improve data access and coverage, data quality and data analysis across government.

In terms of performance against planned targets, Programme 3 had 14 planned targets, of which 11 were achieved and three were partially achieved.

Targets achieved include the establishment of a baseline of M&E systems from the first MPAT assessment report, with a technical support programme put in place and further engagements held with those departments scoring below 2 in their MPAT scores. A diagnostic assessment of human resource capacity for M&E across government was conducted and an improvement plan developed, and a plan for an assessment of M&E systems across government was implemented. In terms of promoting M&E policies and guidelines across government, the Department reported that five guidelines were completed and approved, and that four national and four provincial M&E forum meetings were held.

In terms of partially achieved targets, the Department had planned to submit the Draft Results Bill to Cabinet for approval by March 2013, but consultations on the Draft Bill with other centre-of-government departments did not result in agreement; the Department indicated that there were differing views among these key departments on the need for a Results Bill. In addition, the revised Government-Wide Monitoring and Evaluation (GWM&E) framework was to be completed for Cabinet approval by March 2013. Although the framework was completed, the Department indicated that it was not ready for submission to Cabinet, as it requires extensive development through a series of iterations as well as further wide consultation with key stakeholders. The Department stated that the Results Bill and GWM&E framework processes have now been combined into the development of one overarching discussion document. Lastly, the Development Indicators report was produced but not published by the end of the financial year, as Cabinet requested some additional work; the 2012 Development Indicators report was subsequently published in 2013/14.

Programme 4: Public Sector Oversight (PSO)

This Programme is responsible for Institutional Performance Monitoring and implementation of the Frontline Service Delivery Model. The programme is responsible for the implementation of management performance assessments, assessments of departments’ strategic plans and Annual Performance Plans (APPs) to determine their alignment with the prioritised outcomes, and monitoring of the implementation of key indicators of public sector performance on behalf of the Forum of South African Directors-General (FOSAD). Furthermore, it is responsible for designing and implementing hands-on service delivery monitoring activities with Offices of the Premier and for setting up and supporting the implementation of citizens-based service delivery monitoring systems, including the Presidential Hotline.

In terms of performance against planned targets, Programme 4 had 18 planned targets, of which 17 were achieved and one was partially achieved.

Targets achieved include the successful completion of Management Performance Assessment Tool (MPAT) assessments and the updating of the MPAT tool. All 156 national and provincial departments completed their MPAT assessments before the end of the financial year. The 2012 MPAT results show that, comparing the MPAT self-assessments with the moderated assessments, 101 of the 156 departments improved on at least one of the standards in each of the 4 Key Performance Areas. All 15 departments whose APPs were submitted on time were assessed.

In terms of the Frontline Service Delivery Monitoring (FSDM) framework, the FSDM implementation tools were reviewed and published, and the Department finalised the national visits schedule and conducted 215 site visits. Furthermore, the Department produced FSDM findings reports for eight sector departments. There were 27 sites monitored for improvements, and 65 per cent of service delivery sites visited at least twice showed an improvement in scores in at least two of the seven assessment areas. With regard to the Presidential Hotline, the Department targeted an 80 per cent national and provincial case resolution rate for the 2012/13 financial year and exceeded this by attaining a 90.2 per cent resolution rate.

With regard to the partially achieved target, the Citizen-Based Monitoring Framework was to be signed off by Cabinet by March 2013. While the framework was signed off by the Minister and tabled at the Governance and Administration (G&A) Cluster, the Cluster requested that various recommendations be incorporated before presentation to Cabinet. The framework has since been approved by Cabinet.

5.1.2. Human Resource Management

In 2012/13 the Department placed emphasis on ensuring that at least 90 per cent of all funded posts were filled. At the end of the 2012/13 financial year the Department had 197 funded posts, of which 173 were filled, a vacancy rate of 12 per cent. This was an improvement on the 2011/12 vacancy rate of 30 per cent.
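The 12 per cent figure follows directly from the funded and filled post counts. As a quick illustrative check (not part of the Annual Report):

```python
# Vacancy rate from funded vs filled posts, 2012/13 (illustrative check only).
funded = 197
filled = 173

vacancy_rate = 100 * (funded - filled) / funded  # 24 vacant posts out of 197
print(round(vacancy_rate))  # 12 (per cent)
```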

The Department reported that the reasons for not achieving the 10 per cent vacancy rate target include delays in completing compulsory pre-employment screening; delays in the DPSA directive on implementing Public Service Co-ordinating Bargaining Council Resolution 1 of 2012, relating to salary levels 9-10 and 11-12; and difficulties in recruiting officials with specialised skills owing to competitiveness in the labour market. The Outcomes Monitoring and Evaluation programme had the highest vacancy rate at 22 per cent, followed by the Public Sector Oversight programme at 10 per cent.

The employment equity statistics at the end of the 2012/13 financial year were: people with disabilities, 2 per cent; Africans, 79 per cent; and women at Senior Management Service (SMS) level, 44 per cent. The Department indicated that it experienced difficulties in recruiting people with disabilities at SMS level. It reported that it is currently focusing on recruiting more women and Africans into senior management positions to ensure that set targets are achieved, as well as on increasing the number of people with disabilities in the Department, particularly at SMS level.

5.1.3. Report of the Auditor General of South Africa (AGSA)

The Department received a clean audit opinion. This is a financially unqualified opinion with no findings. An unqualified opinion is expressed when the Auditor-General concludes that the financial statements are presented fairly (in all material respects) in accordance with the applicable financial reporting framework. This is an improvement on the 2011/12 audit outcome which was expressed as unqualified with findings on pre-determined objectives and compliance.

Predetermined Objectives

Although the AGSA had no material findings concerning predetermined objectives, the following matter was highlighted in the 2012/13 AG’s report. In the 2012/13 audit cycle, material adjustments in the annual performance report were identified during the audit and all adjustments were corrected by management. The root cause of these adjustments was a lack of a proper record management system and verification of supporting documents. The AGSA recommended that information supporting the quarterly reports should be centrally filed and reviewed by the programme manager for completeness, accuracy and validity; and that internal audit should audit the quarterly performance report against supporting documents to confirm the completeness, accuracy and validity.

Information Technology (IT) Controls

The AGSA did not identify any deficiencies in internal controls that it considered sufficiently significant. However, the following matters were highlighted with regard to IT controls, where approved controls were lacking:

·         User access controls (policies, procedures, guidelines) to mitigate the risk of unauthorised access to the information systems.

·         A change management policy to mitigate the risk of unauthorised changes being implemented in the production environment of the Programme of Action (PoA) systems.

·         A recovery plan to mitigate the risk of The Presidency not being able to fully recover should a disaster occur at its current location.

The root cause of the lack of approvals and documentation on IT controls was that the Department had been sharing and utilising The Presidency's IT environment while still setting up its own IT system. The AGSA recommended that the Department finalise the establishment of its own IT system and that IT policies and procedures be developed, approved and implemented. In response, the Department reported that it had migrated to its own IT systems by the end of March 2013, had updated its user access policies, and was implementing its own policies and disaster recovery system.

5.2. The Department’s 2012 Management Performance Assessment (MPAT) Results

The Department was assessed in the 2012 Management Performance Assessment Tool process. The Department attained at least 80 per cent compliance with the standards in 3 of the 4 Key Performance Areas (KPAs) contained in the MPAT assessments. The following results were obtained per KPA:

·         KPA 1: Strategic Management - 66 per cent compliance;

·         KPA 2: Governance and Accountability - 89 per cent compliance;

·         KPA 3: Human Resource and Systems Management - 80 per cent compliance; and

·         KPA 4: Financial Management - 86 per cent compliance.

The MPAT is scored in terms of four levels of management practice, with level 4 defined as full compliance with regulations and prescripts while doing things smartly. The low score in Strategic Management was due to scores of less than 3 in the areas of Strategic Planning and Annual Performance Plans. In the Governance and Accountability KPA, the Department scored less than 3 in Risk Management and in the Assessment of Risk Management Arrangements. In the Human Resource (HR) and Systems Management KPA, it scored less than 3 in HR Strategy and Planning, HR Practices and Administration, HR Development Planning and the Management of Diversity. In the Financial Management KPA, it scored less than 3 in Supply Chain Management and Logistics Management.

A department which scored at level 1 or 2 for a particular management area is non-compliant with the minimum legal prescripts in that management area, and is performing poorly in terms of its management practices in that management area. The above results suggest that the Department’s improvement plan on the MPAT should focus primarily on addressing weaknesses identified in the Strategic Management and Human Resource Management Key Performance Areas.

5.3. Service Delivery Performance for 2013/14 and 2014 MTEF Proposals

The focus of the Department in 2013/14 remains that of assisting Government to achieve the set outcomes in its five key priorities by monitoring the implementation of the delivery agreements, implementing management performance assessments for departments, FSDM and citizen-based monitoring, and improving government's use of M&E systems. In the first quarter of 2013/14, the Department's performance was as follows:

Table 6: 2013/14 First Quarter Performance against Targets

Programme                             | 1st Quarter Targets | Actual Achieved | Partially Achieved | Not Achieved
Administration                        | 17                  | 10              | 5                  | 2
Outcomes Monitoring and Evaluation    | 16                  | 10              | 6                  | 0
M&E Systems Coordination and Support  | 5                   | 5               | 0                  | 0
Public Sector Oversight               | 18                  | 15              | 1                  | 2
Total number of Targets               | 56                  | 40              | 12                 | 4
Percentage                            | 100%                | 84%             | 13%                | 3%

Source: DPME

5.3.1. Programme Performance

Programme 1: Administration

The key priority areas for this programme during the 2013/14 financial year include the Service Delivery Improvement Programme (SDIP) and the Government-Wide Monitoring and Evaluation Guideline and Support.

In terms of performance against planned targets in the first quarter, Programme 1 had 17 planned targets, of which 10 were achieved, 5 were partially achieved and 2 were not achieved.

Targets partially achieved include: the attainment of 9 out of a targeted 13 activities in the communications plan, as some activities were postponed to the second quarter; the presentation of the annual audit plan to the Audit Committee, which only took place in the second quarter; and the vacancy rate, which was not below the targeted 10 per cent.

The two targets not achieved were the review of the policy and procedure on planning, to align it with the electronic reporting system, and expenditure against first quarter drawings, which was below the targeted 90 to 100 per cent. The first was delayed because the reporting system was still in the testing phase; with regard to the second, the Department indicated that more focus has been placed on monitoring of the budget by the departmental budget committee.

Programme 2: Outcomes Monitoring and Evaluation

The 2013/14 evaluation plan was approved in November 2012. In 2013/14 the Department, in conjunction with relevant departments, will conduct 15 evaluations of identified strategic priority programmes related to the 12 outcomes. The intended outcome of the Department's evaluations programme is to improve the quality of government's service delivery programmes for improved outcome performance. Ten municipal assessments will be piloted and completed in the 2013/14 financial year. The Department envisages that these assessments will lead to improved delivery of local government services to citizens.

Programme 2 had 16 targets set for the end of the first quarter, of which 10 were achieved and 6 partially achieved. The 6 partially achieved targets were as follows:

·         engagement with two provinces on the content of the MPAT and implementation plan was delayed;

·         the concept for the Provincial Evaluation Plan was to be approved in at least 3 provinces, but the Free State province was still working on its plan;

·         guidelines drafted to support evaluations across government were produced, but only signed off in the second quarter;

·         two national briefings/trainings on concept notes were held, but fewer than the targeted 70 people were trained;

·         only two of a targeted three evaluations were approved, owing to delays with some departments; and

·         the quarterly progress report for all outcomes took longer than the targeted one month after the PoA reporting week, as preparation of the data took longer than expected owing to inconsistencies that had been detected.

Programme 3: M&E Systems Co-ordination and Support

The key priorities for this programme for the 2013/14 financial year are building M&E capacity across the whole of government and formulating the Development Indicators. The intended outcomes of these key priorities are improved M&E capacity and systems in national and provincial government departments, and improved coordination of M&E practice through national and provincial M&E forums and M&E Learning Networks. The publication of the Development Indicators is intended to inform the public about trends in key areas of development. The Department reported that all five targets for the first quarter were achieved.

The Department's target of analysing the reporting requirements of government was completed; this forms part of developing a proposed framework for integrating and aligning reporting requirements across government. One national M&E Forum meeting and one Provincial Forum meeting were held as part of coordinating key stakeholders in the M&E sector. Furthermore, two M&E learning networks were held in the first quarter, and the Department is targeting at least one M&E learning network or workshop per quarter in 2013/14.

Programme 4: Public Sector Oversight (PSO)

Key priorities for Programme 4 for the 2013/14 financial year include conducting annual MPAT assessments and having at least 60 per cent of national departments' and 80 per cent of provincial departments' MPAT scores signed off by the Heads of Department (HoDs) by the end of the third quarter. Citizen-Based Monitoring (CBM) will be piloted in three departments, with six-monthly progress reports signed off by Programme Managers. In 2013/14, eight sector reports on FSDM findings will be submitted to the eight national sector departments.

With regard to performance in the first quarter, 18 targets were set, of which 15 were achieved, 1 was partially achieved and 2 were not achieved. DPME reported that two targets relating to the Heads of Department (HoDs) assessments, which depended on the Department of Public Service and Administration (DPSA) issuing a new PMDS policy for HoDs, were excluded from the first quarter targets. The partially achieved target relates to the completion of only one case study out of a planned three on FSDM best practice; the approval of the case study plan and the completion of the three case studies were delayed by the demands of supporting site visits. The Department reported that the two outstanding case studies will be completed in the second quarter.

The first target not achieved was the CBM knowledge-sharing workshop, which was rescheduled from 30 June 2013 to 30 September 2013 as consultations on the CBM pilot sites took longer than anticipated. The second was the service delivery complaints trends report, which will only be produced in the second quarter owing to delays in securing specialist research capacity.

5.3.2. 2014 MTEF Service Delivery Proposals

The Department indicated that, given the constrained fiscal environment in South Africa, priority will be given to increasing capacity in Evaluations and to extending management performance and service delivery assessments to local government. In addition, Internal Audit capacity will be increased from 2014/15, and the CBM programme will be established and rolled out.

For the 2014 MTEF, the Department reported that the focus areas it expects to achieve include translating the Medium Term Strategic Framework into new performance agreements and delivery agreements from 2014; monitoring progress against delivery agreements; developing and piloting the Municipal Assessment Tool (25 municipalities); monitoring the performance of individual national and provincial government departments (MPAT); developing the Citizen-Based Monitoring programme; managing the Presidential Hotline (handling current call levels); and promoting good M&E practices in government. Focus areas that it may only partially achieve are monitoring delivery agreements on the additional two outcomes identified in the NDP, carrying out evaluations of major and strategic government programmes, and monitoring frontline service delivery. Focus areas that may not be achieved are the roll-out of the CBM programme in all nine provinces and the roll-out of the MAT in all municipalities.

The Department will be focusing on developing the Medium Term Strategic Framework (MTSF) for the 2014-2019 period as the basis of new delivery agreements. The MTSF will be a five year implementation plan for the NDP covering the 14 priority outcome areas identified in the NDP. The Department indicated that in developing the MTSF it will seek to keep chapters for each outcome short and strategic; and allocate responsibilities to individual departments as far as possible.

5.4. Other Service Delivery Performance Engagements

5.4.1. Oversight Visit to Gauteng 02-03 May 2013

The Committee undertook an oversight visit to the Gauteng Province from 2 to 3 May 2013. The purpose of the visit with specific reference to the DPME was to conduct a site inspection of the Presidential Hotline in order to ascertain the effectiveness of the call centre, and engage the Department on the processes followed to ensure that citizens’ concerns and queries are resolved by the responsible departments.

The Department provided a breakdown of the Hotline statistics and trends per department and province. It was reported that the top service delivery complaints received by the Hotline are as follows:

·         National departments: employment, housing, policing matters, social benefits, and information about how Government works.

·         Provinces and municipalities: water, electricity, housing, and employment.

The Committee made the following findings following deliberations with the Department on the site inspection of the Presidential Hotline call centre and back office:

·         There are funded vacancies within the State Information Technology Agency (SITA) that could impact negatively on the Service Level Agreement between SITA and the Department, in terms of the capacity to develop and modernise the system for the Presidential Hotline.

·         There is a need for significant improvement within departments and provinces in terms of the management, monitoring and resolution of cases referred by the Department in terms of the Presidential Hotline.

·         The Presidential Hotline case statistics are not detailed enough in terms of demographic and geographic information. There is a need to improve the categorisation and structure of cases captured.

·         There is a need for quality assurance in the resolution of Presidential Hotline cases. According to a survey conducted amongst those citizens who had submitted cases, about 34 per cent of the participants were not satisfied with the resolution process or outcomes of their cases.

5.4.2. Payment of Suppliers within 30 Days

In the 2009/10 financial year, DPME issued a communiqué to all departments requesting them to ensure that they comply with Treasury Regulation 8.2.3, which requires suppliers to be paid within 30 days. DPME, together with the National Treasury, has been monitoring the payment of suppliers within 30 days.

The Committee views the non-payment of suppliers by government departments as having grave and adverse effects on the economy and as going against the pro-poor policies and development objectives of government. The Small, Medium and Micro Enterprise (SMME) sector is the most affected, with many such businesses experiencing severe financial constraints; some resort to retrenching employees to keep their businesses liquid, while others resort to more drastic measures such as liquidation. Consequently, the Committee received a briefing from, and engaged with, the Department of Performance Monitoring and Evaluation (DPME) on the monitoring of payment to suppliers within 30 days for 2012.

Following the briefing, the Committee invited those departments showing little improvement in paying suppliers within 30 days to brief it on measures taken to ensure payment within the regulated timeframes. In engaging with these departments, the Committee indicated that they need to be proactive in eliminating commonly cited challenges, such as the absence of tax clearances and incomplete or inaccurate supplier details, which should be resolved during the procurement process. Furthermore, departments need to enhance communication with suppliers to ensure that all requirements for timeous payment are met.

5.4.3. Development Indicators 2012

In the 2012/13 APP, the Department had targeted publishing the 2012 Development Indicators by March 2013. However, while the report was produced, it was not published by the end of the 2012/13 financial year as Cabinet requested some additional work. The report was published in August 2013, and the Committee received a briefing on it in the same month.

Overall, the Department reported that trends in the data indicate that poverty levels are declining, but that the challenges of unemployment and inequality remain, while the gains in life expectancy should have positive socio-economic implications beyond health.

5.4.4. Municipal Assessment Tool (MAT)

For the 2013/14 financial year, the Department has targeted to carry out and complete ten municipal assessments. In the medium term, 10 improvement plans will be developed subsequent to the ten municipal assessments and 25 additional municipal assessments will be completed. Consequently, the Department was invited to brief the Committee on progress on municipal assessments on 10 September 2013.

The key basis for the MAT is the continued weakness in the institutional performance of municipalities, characterised by poor financial and administrative management, weak technical and planning capacity, and poor leadership and governance. The Department stated that there is a clear link and correlation between management practices and workplace capabilities on the one hand and the quality of service delivery and productivity on the other.

The assessment is meant to provide a municipality with an understanding of the current state of management in each of the MAT categories, against which the municipality can benchmark itself and identify actions to improve its management and workplace capabilities. The pilot phase for implementation of the MAT runs from September 2013 to March 2014, and 11 municipalities agreed to be part of the pilot.

The Department reported that the participating municipalities in pilot phase are the City of Tshwane, Msunduzi, Buffalo City, Cape Town, Moretele Local Municipality (LM), Moses Kotane LM, Rustenburg LM, Naledi LM, Lekwa-Teemane LM, Dr Ruth Segomotsi Mompati District Municipality (DM) and Bojanala Platinum DM.

5.4.5. Progress on Outcomes Report

One of the key priorities for Programme 2 for the 2013/14 financial year is monitoring the implementation of the delivery agreements for the 12 outcomes and producing quarterly reports and briefing notes for Cabinet. To ensure transparency and accountability, the Department is to continuously update the Programme of Action and produce 4 quarterly progress reports for all outcomes, which are to be made available to the public. The Committee was briefed on the progress on the outcomes in September 2013; the briefing contained data up to the end of the fourth quarter of 2012/13.

The Department indicated that there has been good progress towards many of the targets. Some of the 2013/14 targets have already been achieved, while others are unlikely to be met. Furthermore, the Department reported that some delivery agreements are too long and detailed, with too many indicators, and that the information systems of departments needed to be improved.

5.4.6. Management Performance Assessment Tool

The objectives of the MPAT are to establish a baseline of the management performance of departments and to provide the leadership of departments with useful information to inform improvements in the quality of management practices.

The MPAT entails assessment against 31 management standards in 17 management areas, with standards based on legislation and regulations. The MPAT is a joint initiative between the Department and the Offices of the Premier. While audits focus on compliance only, the MPAT focuses on getting managers to work more effectively, and it covers a broader range of management areas than general audits. The MPAT covers four Key Performance Areas (KPAs), namely Strategic Management, Governance and Accountability, Human Resource Management and Financial Management.

In the Strategic Management KPA, 76 per cent of departments scored at Level 3 or Level 4, meeting the legal and regulatory requirements for Strategic Management. With regard to the standards related to Governance and Accountability (G&A), 60 per cent of departments scored below Level 3. Performance in the Human Resource Management KPA was very weak, with only 25 per cent of departments scoring at Level 3 or Level 4. In the Financial Management KPA, 54 per cent of departments scored at Level 3 or Level 4.

The results of the 2012/13 assessments show that, whilst some departments made positive improvements, there has not yet been sufficient improvement in the level of compliance with regulatory frameworks and policies. The results point to weaknesses in human resource management in particular. It is, however, also worth noting that at least some departments are operating at Level 4. The Department also found that performance against the HR-related standards was important for achieving results, enabling departments to meet more than 80 per cent of their performance targets. For 2013/14 the Department targets that at least 40 per cent of departments will improve on the previous year's MPAT scores.

6. COMMENTS ON FINANCIAL AND SERVICE DELIVERY PERFORMANCE

The Committee welcomes the overall improvement in the performance of the Department, with performance against planned targets rising from 63 per cent in 2011/12 to 84 per cent in 2012/13. The Committee was concerned about the finalisation of the Security Plan, which forms part of the ICT governance framework, and views the approval and implementation of the Department's IT environment and internal policies as critical.

The Committee pointed out that the Annual Performance Plan (APP) targets relating to evaluations to be conducted need to be more realistic, given that the Department had reported that the targets had been too ambitious. This reservation is based on the time it takes to conclude each evaluation.

The Committee pointed to the significant increases in spending on advisory services and on travel and subsistence, and views the need for the Department to prioritise the use of internal capacity as critical to ensuring that an internal skills base remains in the long term to carry out the M&E function. The Department indicated that it does seek to balance the use of internal and external skilled specialists in carrying out its mandate, and in particular evaluations. The increase in travel costs was largely due to the FSDM programme, with 215 site visits undertaken in 2011/12.

The Committee welcomes the donor funding but was concerned about the accountability mechanisms for funding received. The Department indicated that all funding is subject to the provisions of the PFMA, is audited, and is fully disclosed in accountability documents such as the annual report. The Committee noted the low scores obtained by the Department in some areas of its MPAT assessments, in particular Human Resource Management, and pointed out that these need to be addressed without delay.

The Committee noted that the Department's expenditure consistently lags its projected spending, resulting in a significant share of the budget being spent only in the fourth quarter of the financial year. The Committee views spending that is aligned to projections as critical to ensuring that there is no fiscal dumping. In-year expenditure should be aligned with planned performance, thus ensuring the achievement of deliverables that meet the requisite quality standards. The Department reported that monthly budget and expenditure meetings have been introduced to ensure more accurate budgeting and closer expenditure monitoring in the medium term.

With reference to evaluations, the Committee expressed concerns about the timing and structuring of evaluations and the seeming delays in the progress thereof. The Department indicated that delays were usually experienced as a result of the lengthy process of obtaining buy-in from programme managers, administrative and political offices of the programmes concerned. DPME viewed such buy-in as integral in ensuring that evaluation results are utilised to improve programme implementation.

With reference to the 2012 Development Indicators, the Committee sought clarity on the data used, data sources and the quality control thereof. The Department indicated that they sourced data from government administrative databases, the Census 2011 data, and research done by local and international institutions. Furthermore, the Department indicated that there were concerns around data reliability given that there is no central consolidation point in terms of quality control of information from departments.

The Committee pointed out the need for the data in the Development Indicators to consider the quality of services delivered, such as the quality of housing structures, and to detail what constitutes services such as access to water. The Committee was concerned that indicators such as access to water are often misleading, as water infrastructure may be in place while no water actually flows due to functionality challenges. The Department reported that the quality of services was not taken into account in the current report, and that it would look into redefining "access to water" to refer to water flowing through functional infrastructure, and not solely to the existence of water infrastructure.

With regards to the MPAT 2012 report, the Committee welcomed the results of the MPAT moderated assessments of the quality of management practices in all 156 national and provincial departments in the 2012/13 financial year. Furthermore, the Committee welcomed the fact that moderated assessment results were positively received by departments with no cases of disagreement between parties on final scores. The Committee views MPAT results as an important oversight tool to be utilised by Parliamentary Portfolio Committees. The existence of efficient management practices within departments is critical for the effective implementation of strategic management, governance and accountability, human resource management and financial management.

With regard to the MAT, the Committee indicated that the MAT needed to be distinct from other systems and interventions usually provided to municipalities. The Department indicated that there is currently no similar assessment tool within the local government sector and that most interventions usually occur after the fact, for example when a municipality is in financial difficulty. The MAT's distinctive feature is its assessment of municipal performance in relation to management practices and basic administration standards. The Department indicated that getting management practices and administrative factors right was critical, as this would ensure that a municipality produces quality outputs and improves service delivery. The Municipal Assessment Tool has received wide support from across all sectors. In addition, the Department indicated that it would recommend that MAT scores be linked to the performance agreements of municipal management.

With regard to the progress report on outcomes, the Committee welcomed the positive progress on outcomes, though it expressed concerns regarding the reliability of the data sources. The Committee enquired whether there were any data quality assurance mechanisms in place to ensure the credibility of information reported by departments. The Department indicated that it used multiple data sources, which included departments and other independent agencies. The Department reported that the aim was to avoid burdening departments with onerous reporting requirements and that, as a result, its focus is on getting departments to put data monitoring and collection systems in place. The Committee noted, and was supportive of, instances showing that where the national department was closely involved with provinces in terms of concurrent functions, the integration yielded improvements in outcomes.

7. COMMITTEE FINDINGS AND OBSERVATIONS

The Standing Committee on Appropriations having assessed the performance of the Department of Performance Monitoring and Evaluation, made the following findings and observations:

7.1. The Committee welcomes the positive improvement in the financial and non-financial performance of the Department, in particular the attainment of a clean audit opinion and the significant improvement in performance against targets to above 80 per cent.

7.2. The Department scored below Level 3 on six standards in the Management Performance Assessment Tool (MPAT) moderated assessments. The Committee views exemplary MPAT scores from the Department as critical for the continued government-wide success of the programme.

7.3. For the first three quarters of the 2012/13 financial year, the Department was behind projected spending by an average variance of 39 per cent. In comparison, the variance between actual spending and projections for 2013/14 was 26 per cent in the first quarter and 15 per cent in the second quarter, indicating an improvement in quarterly expenditure performance.

7.4. The Committee notes that the number of evaluations targeted by the Department has been revised downwards from 20 to 15 evaluations for each year of the 2014 MTEF; however, the Committee is of the view that the target is still not realistic.

7.5. Of the 27 service delivery sites monitored for improvements, 65 per cent of those visited at least twice showed an improvement in scores in at least two of the seven assessment areas.

7.6. The Department has combined the process of developing a Results Bill with the revision of the Government-Wide Monitoring and Evaluation Framework by developing one overarching discussion document.

7.7. The Committee welcomes the introduction of the Municipal Assessment Tool that will contribute towards improving management practices in local government. The Committee noted the resource constraints highlighted by the Department that may impede:

7.7.1 The rollout of the Municipal Assessment Tool in all municipalities.

7.7.2 The monitoring of outcomes progress for the two additional outcomes.

7.7.3 The rollout of the Citizen-Based Monitoring programme in all nine provinces.

7.8. The Committee welcomes the publication of the 2012 Development Indicators. The Committee notes that there is no set timeframe for the release of the indicators and that there is no central consolidation point in terms of the quality of data.

7.9. The information management systems needed to produce the required data for outcome indicators are not fully in place in many departments, and the data required for the Department to effectively complete the outcomes progress reports was sometimes not available or not of the required standard.

7.10. The Committee notes that only 15 Annual Performance Plans were submitted to the Department by departments for assessment in 2012/13. The Committee views APP assessments as critical given the findings by the Auditor-General on the usefulness and reliability of departments' performance targets.

8. SUMMARY OF REPORTING REQUESTS

Reporting Matter: 2012/13 Audit Outcomes Action Plan addressing both the Auditor-General and Audit Committee findings
Action Required: Written Report/Committee briefing
Timeframe: Fourth Quarter 2013/14

Reporting Matter: Management Performance Assessment Tool Internal Improvement Plan
Action Required: Written Report/Committee briefing
Timeframe: Fourth Quarter 2013/14

9. RECOMMENDATIONS

The Standing Committee on Appropriations having assessed and reviewed the performance of the Department of Performance Monitoring and Evaluation in the 2013 Budget Review and Recommendation Process makes the following recommendations:

9.1. That the Minister in the Presidency for Performance Monitoring and Evaluation, as well as Administration, ensures the following:

9.1.1 That the Department of Performance Monitoring and Evaluation considers putting mechanisms in place to ensure that all national departments submit their Annual Performance Plans for assessment on time.

9.1.2 That the Department of Performance Monitoring and Evaluation considers developing and implementing systems to quality-control and consolidate information from government departments on outcomes indicators and the indicators contained in the annual Development Indicator Report and Outcomes Progress Reports.

9.1.3 That the Department of Performance Monitoring and Evaluation improves the systems and mechanisms for its in-year budget management to ensure that its quarterly financial performance is consistent with its quarterly expenditure projections.

9.2. That the Minister in the Presidency for Performance Monitoring and Evaluation as well as Administration, and the Minister of Public Service and Administration, ensure the following:

9.2.1 That the Departments of Performance Monitoring and Evaluation and Public Service and Administration consider developing systems and implementing mechanisms to improve the Management Performance Assessment Tool scores of government departments in the area of Human Resource Management.

9.2.2 That the Department of Public Service and Administration expedites the development of the Performance Management Development System policy for Heads of Department.

Report to be considered.
