Frontline service delivery site visits; Management Performance Assessment tool (MPAT) implementation: DPME outcomes and progress report

Standing Committee on Appropriations

19 March 2013
Chairperson: Mr E Sogoni (ANC)

Meeting Summary

The Department of Performance Monitoring and Evaluation (DPME) briefed Members on the findings of its National Frontline Service Delivery Monitoring Programme (FSDM) visits from 1 April to 31 December 2012. The DPME noted that the FSDM was just about to start with the Citizen-Based Monitoring (CBM) programme. It explained the background to the FSDM Programme, gave summary findings for 2012, indicated which areas were rated priority for intervention, and reported on key common challenges, positive findings, and the FSDM Programme for 2013. It noted that the DPME had identified 81 facilities that needed rigorous supervision over 2011/12. The DPME would produce a mid-year report of its findings by the end of October, and in March 2014 it would present its third annual report.

Members engaged intently with the DPME on the apparent confusion over the accountability of facility managers. Members were also concerned about the staffing of the DPME, and asked how its visits were to remain unannounced when the DPME developed and confirmed a visit schedule with the Office of the Premier in every province.

An IFP Member asked whether, during unannounced visits, the DPME was using department personnel or consultants. Did departments not have self-monitoring tools? How did the DPME reconcile performance bonuses with the lax attitude of some civil servants? An ANC Member was concerned about capacity in the provinces. He was glad that the Departments of Justice and Constitutional Development and Transport had been identified as needing to delegate facility-level work to facility managers and allocate budgets for that purpose, as that would sort out many of the complaints. Another ANC Member asked how the DPME chose what to monitor in the facilities visited and what criteria it used to monitor. Another ANC Member wanted clarity on linking performance bonuses with the attitudes of civil servants.

The DPME reported on its Management Performance Assessment Tool (MPAT), on the results of the assessment process, and on progress with the Performance Assessments Review for Heads of Departments (HoDs). The DPME had adapted the tools seen in other countries and contextualised them for South Africa. There was a high level of correlation between MPAT assessments and the Auditor-General of South Africa (AGSA)’s findings. The advantage of MPAT assessments was that they served as an early warning. The 2011/12 report was based only on self-assessment results. The MPAT methodology was based on self-assessment, followed by external peer moderation. The DPME had identified 31 core management standards. MPAT ratings were explained. The main aim was to encourage departments to learn from each other without the need for extra budgets for expensive consultants. It was not good enough just to set the policy; the DPME needed to go out to help departments comply. The DPME explained revisions to the Head of Department (HoD) performance review system in favour of a technical HoD panel assessing peers, a model applied in other countries.

Members were especially perturbed that the DPME was apparently bypassing the Standing Committee on Appropriations, which was really its portfolio committee, and apparently assuming that its oversight reporting was intended more for the national Cabinet than for Parliament.
 

Meeting report

 

Introduction
The Chairperson apologised to the Department of Performance Monitoring and Evaluation (DPME) for a misunderstanding over the Director-General (DG)’s apology. It had indeed been received, but not forwarded to him.

National Frontline Service Delivery Monitoring Programme (FSDM) site visits: outcomes
Ms Bernadette Leon, FSDM Head, began the update on the FSDM, reported on the work that had been done, and briefed Members on some of the findings that the DPME had uncovered through the site visits from 1 April to 31 December 2012. (See DPME National Frontline Service Delivery Monitoring Programme Monitoring Visit Findings 1 Apr/31 Dec 2012 document)

There were three programmes within the FSDM programme and Ms Leon listed them. The FSDM was just about to start with the Citizen-Based Monitoring (CBM) programme. The briefing would focus only on the unannounced monitoring visits, as the programmes were quite detailed. (See document for more details)

Background to the FSDM Programme
The information in the presentation came from the second annual report on the findings and covered new monitoring visits conducted from April 2012 to December 2012. The programme remained a joint programme between the DPME and all Offices of the Premier (OoP). The FSDM was now active in all nine provinces. The FSDM monitors' tools were a set of questions targeted at citizens and staff, with the questionnaire scored; every facility received a summary report that the DPME produced. The FSDM had visited 215 new facilities in total, and the sector and provincial totals were outlined in the presentation.

Summary Findings 2012:
The findings were very high level for the 215 facilities. The eight areas assessed were scored per colour code (see document); services that were relatively good were location and access, safety, and dignified treatment. Those that performed poorly were visibility and signage, queue management and waiting times, and complaints and compliments systems. Cleanliness and comfort was the fourth area needing intervention.

The usefulness of the different representation was, taking complaints and compliments management as an example, that it gave an indication of what users scored; for example, about 40% of the people interviewed scored complaints management as very poor. That representation gave a good idea of the user scores, the staff scores, and the monitor scores. Interestingly, users scored complaints management as quite poor whereas staff scored it as reasonably acceptable. That gave the DPME an indication of whether officials and the people using those facilities viewed the situation in the same way.

2012 Findings: What Areas were Rated Priority for Intervention
People felt that, in the courts, the complaints and compliments systems were the first priority areas for intervention. The DPME provided that information to sector departments because it was important for them to get a view of what was important to the people using the facilities, because what departments planned as their priority interventions and what was important to the people using the facility were not necessarily the same.

Visibility and signage scored poorly for the reasons given (see document). Years ago it was decided that, for the protection of the brand, government facilities were to have standardised design specifications for signage. It had emerged that this policy might have unintended consequences in facilities, in that there were many delays and costs associated with signage and design specifications. One would find in some facilities that managers would say that there were no signs giving directions to the rest rooms because they were waiting for them from head office; whereas, with good management and leadership, managers could simply print from their computers a sign directing people to where the rest rooms could be found. But when the DPME presented that finding there was nervousness, especially from Government Communication and Information Systems (GCIS), which felt that protection of the brand and design specifications for signage were very important. The DPME suggested to GCIS that it should perhaps do a cost-benefit analysis to see whether it was still a reasonable policy to implement.

The second finding, on queue management systems, was that managers seldom walked the floor, although that was an expectation in an intensive quality service delivery improvement programme: one needed to be on the floor watching the operations of the facility to make adjustments on a daily basis and ensure the facility worked well. That behaviour in facilities management was very rare. The South African Social Security Agency (SASSA) and Home Affairs had started implementing that style of management, insisting that managers be very active on the floor, and one could immediately see the positive impact that had on the management of those facilities. Those were low-cost solutions, as they required no additional budget, just the different style of management the DPME was recommending.

DPME had been saying to departments that complaints boxes and registers should be installed in facilities and facilities had done that, but years of unresponsiveness from the state had meant that citizens had lost trust in that system. So DPME was suggesting that a public display of received complaints be implemented together with the management’s response.

The DPME also believed that the Departments of Justice and Constitutional Development and Home Affairs should consider delegating the management of functions like maintenance and cleanliness to facility level, providing facility managers with the required budgets, and then holding them accountable for day-to-day maintenance and cleanliness. That initiative had started with the Department of Health’s appointing Chief Operations Officers for hospitals, with allocated budgets and delegations to manage facilities. The DPME hoped that approach would in future be rolled out in the courts system and the Department of Home Affairs.

The positive findings for dignified treatment might have been a result of government interventions to address people in their local languages; that may also have contributed to the overall good result of the summary findings.

Key Common Challenges:
Accountability for acting on the findings was still generally very poor. Since there were really no consequences for not acting on a finding, the DPME was working very intensively with departments on the findings. But within departments themselves, there were no clear accountability guidelines for responding to the kinds of findings that the programme was revealing.

At the tabling of its first report, the DPME had noted that unclear responsibility and accountability for facilities management was a significant factor contributing to some of the conditions that the DPME saw in facilities. There was confusion about facilities maintenance, facilities management and contract management because of the layers of involvement of the national and regional departments of Public Works.

Where standards were starting to be introduced, as in the South African Social Security Agency (SASSA), the Department of Home Affairs, and Department of Health facilities, the difference could immediately be seen.

Positive findings:
The South African Police Service (SAPS) was very interested in the findings and DPME had a good working relationship with SAPS.

The DPME emphasised that during the first year of the FSDM it did not have very good co-operation from the Department of Transport, because its national office had to be convinced that it had a leadership role in drivers' licensing centres even though it had delegated the management of these centres to provinces and municipalities. The Department had eventually realised that if transport licensing centres failed, the Minister of Transport was seen to be responsible, even though there were layers of agency agreements. The national Department had now set up a small team, so that when the DPME went back for feedback sessions, that team would ask to accompany the DPME so that it could gain experience of what that type of monitoring entailed and could take responsibility for the findings. The DPME was confident that would make a difference now that the Department had developed that capacity.

The DPME had also signed similar agreements with the Department of Justice and Constitutional Development, the Department of Home Affairs, SASSA, and the SAPS, so that even though the visits were still unannounced, their monitoring teams accompanied the DPME when it went back to do the feedback visits to discuss the findings, so that they could take responsibility for them. The DPME thought that a positive thing. The DPME always maintained to the sector departments that the more monitoring the departments did themselves, the less the DPME would have to do in the future. Because the departmental monitoring approach did not have much of a footprint in facilities, the DPME felt it was necessary to go itself.

FSDM Programme for 2013:
The DPME did upfront planning for the whole year with the nine provinces. It was important to do that because it had a defined team that needed to be everywhere in the country doing certain things; if it did not know where it had to be within the nine provinces, it could not complete its work. It was quite rigorous in insisting that planning for the year be confirmed in advance by the OoPs, and as soon as that schedule was completed it would be made available to the Committee, so that the Committee could select some visits on which it would like to accompany the DPME.

From April the DPME was to start implementing the Annual Visits Schedule (AVS) in the nine provinces, and it had set a target of 160 new monitoring visits for the year. The DPME's year ran from April to December; if it did not stop in December it would not be able to analyse, consolidate and produce the very detailed report which every sector department needed, together with nine additional very detailed provincial reports. Provinces often wanted to do large numbers of visits, but the DPME was struggling to demonstrate to them that more did not mean better. The DPME preferred depth in its visits and wanted to demonstrate what good quality monitoring meant; for reports to be of a very high quality it had to do proper analysis and make sure that it went back within two months to discuss the findings.

OoPs had teams established to implement the FSDM programme; certain provinces had larger teams to do this, whereas others had smaller ones and relied on community development worker supervisors to do the visits with the DPME.

If the report was not rigorous enough, then that visit would not be included in the new report, since that would mean that the findings had not been discussed or the information was not rigorous enough.

The DPME held an announced feedback meeting within two months after the first unannounced visit to new facilities, but that had to be scheduled within the AVS. This year the DPME wanted to visit the departments ahead of time at least once a quarter.

The DPME wanted to focus on improvements monitoring, but wanted to make sure that the first findings were acted upon. If the DPME kept doing only new visits, facility managers would realise that no one ever came back to monitor whether they had acted on the findings. This had often happened over the years: monitors would visit and no one ever came back to check on previous findings. So the DPME wanted to show from this year that it came back often, sometimes announced and sometimes unannounced, to have discussions with the teams about how they were doing and whether they were facing any challenges on which they possibly needed assistance, so as to keep some pressure on them.

The DPME had identified 81 facilities that needed rigorous supervision over 2011/12, and to those facilities the DPME wanted to return often. The DPME had set a target in its annual plan for improvement. (See document)

Reporting on the 2013 findings
The DPME would produce a mid-year report of its findings by the end of October. Again in March 2014 it would be presenting its third annual report.

From 2014 the DPME wanted to provide Cabinet with detailed sector reports; for example, on all the police stations that the DPME had monitored, so that Cabinet would be able to access the full details of those visits. After Cabinet had seen that report, the DPME would produce a public high-level summary of it.

Ms Leon said that the Citizen-Based Monitoring (CBM) Framework now being initiated would not be presented, but the DPME had summarised in the remainder of the presentation the draft framework that was to go to Cabinet. It was a Framework for Strengthening Citizen Participation in Monitoring Government Departments. The programme would be piloted with SAPS, the Department of Health and SASSA, as the DGs of those Departments had committed themselves to be pilots, and the DPME would be working with them, starting in 2013. In March 2014 the DPME would be able to table its first findings from that pilot to the Committee.

The Chairperson asked Ms Leon to tell the Committee how DPME conceptualised the Citizens Based Monitoring Framework.

Ms Leon explained that the DPME was doing that work as part of its mandate to promote good Monitoring and Evaluation (M&E) practices in government. The DPME had conducted a consultation process in the writing of the framework; it had received comments from 12 departments and statutory bodies, had held 13 one-on-one engagements with civil society organisations, and had received 205 very detailed comments. The level of interest and engagement from State departments had impressed it. The framework established a very high-level set of principles. The DPME believed that monitoring was something very sensitive and required a certain level of maturity in M&E, and felt it was important to work with departments that voluntarily committed to be part of that particular pilot. A few departments had done that through letters of commitment to be part of the pilot, namely SAPS, the Department of Health and the Department of Social Services, but those Departments would choose the piloting sites. The DPME hoped other departments would come on board, but it did not want to force departments to do that programme if they were not ready.

Ms Leon said that she had not intended to go into detail about CBM. Her intention was to alert the Committee that the framework had been completed and that it would be going to Cabinet in the near future, but what DPME did have were the beginnings of a pilot, starting in April.

Discussion
The Chairperson asked whether Members had questions.

Mr M Swart (DA) asked how the DPME found the co-operation from the Department of Public Works, and to what extent that Department was guilty of performing its tasks poorly, since the DPME had mentioned its finding that there was confusion about responsibility for maintenance in facilities management.

Mr N Singh (IFP) asked to what extent the DPME was using best practice to influence other departments and offices in other areas to try to do as well as others. When Members had accompanied the Standing Committee on Public Accounts (SCOPA) on site visits, they had found certain facilities performing very well while others were terrible despite having the same kind of resources; so it was not about what technology was available but about the people operating it. Secondly, during unannounced visits, was the DPME using department personnel or consultants? Did departments not have self-monitoring tools? Could they not inculcate the culture of not having to wait for big brother to come and monitor? How did the DPME reconcile performance bonuses with the kind of lax attitude found amongst some of the state's civil servants?

Mr J Gelderblom (ANC) was concerned about capacity in the provinces. Since the DPME planned to conduct 160 new visits for the year, he asked whether co-operation levels were similar across provinces during visits. Also, since Premiers' Offices had capacity to monitor, did the DPME provide training for them, or did the DPME experience problems there? He wanted more clarity on that issue. He was glad that the Departments of Justice and Constitutional Development and Transport had been identified as needing to delegate facility-level work to facility managers and allocate budgets for that purpose, as that would sort out many of the complaints; therefore he wanted clarity on the link between those Departments and the DPME.

The DPME replied that, in terms of improvements and the link between itself and the various provincial departments and OoPs, there was a norm being adopted by the whole country that had been established in the Free State by regional managers: if there was a finding about a particular site X, that finding then belonged to the whole region.

Whatever was available in the Office of the Premier was being used, without using consultants. Sometimes not enough funds were made available for monitoring, but the DPME had committed to assisting those OoPs so that they could go all out and implement those M&E programmes.

OoPs' monitoring capacities were underdeveloped and the DPME was developing those capacities to perform proper M&E, not only for the FSDM but throughout the general collaborative work they were doing together. The other benefit of the FSDM for OoPs working with the DPME was that they were being taught that type of M&E so they could apply it to infrastructure projects and others. By being part of that programme, OoPs were learning the value of on-site monitoring and how it gave some dipstick evidence of what was going on in the field. They often complained that they struggled to get government support in creating vacancies for such work, now that they had a better understanding of what their M&E role should be. What the DPME did with the FSDM was to profile the work of the OoPs to encourage them to remain involved in monitoring. A team was in place within the DPME to find funds so that there was a proper team to implement the FSDM. Certain provinces had difficulty budgeting for the current programme, as the FSDM was an operationally intensive programme.

The monitoring team was the DPME and staff from the Office of the Premier and there were no consultants for that work.

Ms A Mfulo (ANC) wanted clarity on the finding that accountability for facilities management, including maintenance and contract management, remained a challenge and required policy intervention; what did that mean? She also noted that even after training there was a need for intervention by the DPME on attitudinal change in civil servants to overcome the challenges in service delivery. She also wanted to know whether the DPME had considered successful innovations from other departments visited. As part of monitoring, was the DPME also checking on universal accessibility of facilities? She asked if the DPME could make sure that universal accessibility was part of its monitoring tasks.

Mr Mughivela Rambado, Director: Presidential Hotline and National Frontline Service Delivery Monitoring, circulated a short video clip to the Committee Members about the attitude of civil servants in state facilities. He was deliberately circulating the clip because what it showed was happening in an urban facility and not a rural one. It was not an issue of budget, nor of management, but of a change of mindset; therefore what the DPME was saying was that it had to work on changing the attitudes of colleagues on the ground. It needed to brainwash them to be able to see what needed to happen, without waiting for a big brother to come and tell them what was needed in their own house. Accessibility and location would be separated within the questionnaire. If the first findings revealed a lack of accessibility, the DPME would leave the report with the facility manager and, when the DPME came back for a feedback visit, one would find that those things that the facility management could improve had been improved.

Ms L Yengeni (ANC) asked how the DPME chose what to monitor in the facilities visited and what criteria it used to monitor. Why was the DPME not taking into account the five priorities identified by the President in his State of the Nation Address (SONA)? Given that the Eastern Cape had not been visited but was having serious problems in the education sector, she asked who was responsible for monitoring, whether the departments were focusing on the priorities from the SONA, and whether the M&E unit located in the Presidency was not monitoring that.

The DPME replied that it did not choose the sites; there was collaboration between the DPME and the OoPs. Some issues had been raised through the Presidential Hotline and Premiers' imbizos, but the DPME did not report on those programmes because they did not make use of the FSDM tool; whenever a Premier or an MEC went out with their team, the DPME made sure that they implemented the tool so that they could report on it systematically. The provinces nonetheless reported in their own way, but what the DPME and the OoPs had done jointly was that the report to the national Cabinet was being used for provincial Cabinets as well.

The sectors were chosen because the monitoring was of facilities and not departments; so the DPME was monitoring the working of school facilities rather than the Eastern Cape provincial department of basic education. The origin of the programme was the commitment in Outcome 12 of the delivery agreement, in which government committed to improving the quality of service delivery in a specific set of facilities, including SASSA, SAPS and health facilities and courts. The DPME had added education facilities and municipal customer care centres.

The Eastern Cape leg of the monitoring started late, as was the case in the Western Cape and KZN. For the sites visited, the DPME collaborated with the OoP to choose the sites, as it had done with the 2012 AVS. It had gone on a road show with the OoPs and had given them enough latitude to say which facilities needed urgent intervention by the FSDM programme; the OoPs did the identification themselves.

Ms Yengeni thought that she understood the DPME's mandate, but was still insistent that before each and every department received a budget it must acknowledge that part of its objectives was to improve the facilities the DPME was talking about. If the DPME went to monitor at a police station, the staff and complainants would be the ones giving information about the quality of service they were giving and receiving. All the work that the DPME was doing was work that each department had budgeted for and had been staffed for; therefore there was no difference between the DPME's mandate and departmental monitoring mandates.

Ms Mfulo suggested that, in terms of monitoring, the DPME should also look into the allocated budgets for monitoring and find out through unannounced visits whether they were being used properly. The DPME must also check in the OoPs what that staff was doing, because she had personal experience of municipal facilities that had been faulty for the past three years. The DPME should align those unannounced visits with the given budgets, and should go and check that what the department had applied for was happening. The DPME would find discrepancies in that regard.

Ms R Mashigo (ANC)’s concern was still the link between sector departments and the DPME. She asked what the DPME did after finding that there were no improvements in those particular departments. She wanted clarity on linking performance bonuses with the attitudes of civil servants, as mentioned by Mr Singh. She wanted to know what impact the DPME's findings had on sector finances. Was there any collaboration between public institutions; for example, did departments share success stories on service delivery? User impressions should be prioritised in FSDM implementation.

The DPME replied again that, in terms of improvements and the link between it and the various provincial departments and OoPs, there was a norm being adopted by the whole country that had been established in the Free State by regional managers: if there was a finding about a particular site X, that finding then belonged to the whole region. Those concerned had started taking responsibility by going to check and analyse.

The Chairperson noted that the presentation had a lot of red areas and findings according to the departmental score (see document). Location and accessibility were well placed. Maybe the DPME visited many facilities, whilst the Committee visited its particular areas, but he was inclined to agree with Ms Mashigo that some of the findings did not correlate with the Committee’s personal experiences of those facilities which were well scored in that report, whether in urban or rural areas.

DPME commented that SASSA had committed itself and was moving things and changing as quickly as was possible. There was a need for managers that were able to change things and use their own initiative.

The Chairperson also agreed with Ms Mfulo that policy change was required in facilities management so that people knew their responsibilities. Otherwise, the next thing the state would have to do would be to employ consultants to perform those duties. The Chairperson also wanted to know whether that report included the provinces or whether they reported somewhere else, and how the DPME got feedback on their performance.

The DPME replied that the provinces reported in their own way, but what DPME and the OoP had done jointly was that the report to national Cabinet was being used for provincial Cabinets as well.

The Chairperson said that those visits were supposed to be unannounced but the DPME was sending a programme for confirmation in advance. So were they still unannounced? The schedule gave the Ministries an advantage in knowing what to fix by which particular date, as a bad report would reflect badly on the management.

The DPME replied that, whether visits were unannounced or announced, the DPME collaborated with the Offices of the Premier in planning and agreeing on the dates and facilities that needed to be visited for feedback, but when the DPME sent the AVS to the OoP it did not announce the dates on which it would be coming to a particular facility. That was the DPME's own internal agreement, and it found that collaboration and agreement very workable.

The Chairperson was also worried about what the DPME meant about the CBM framework. If he lost his ID document he would go to the Department of Home Affairs immediately and want a new one. How did the DPME intend to conduct that monitoring participation?

Ms Mfulo asked how the DPME aligned municipality monitoring with service delivery. The DPME was working with established public institutions on the CBM, but what about community based organisations and non-governmental organisations (NGOs)?

The DPME replied that it had started small, but was thinking big. It was looking at about eight areas that it needed to assess, and when municipal customer care centres were visited there were issues of service delivery being raised; but the DPME said that there were relevant sectors that would deal with those issues in detail.

Ms Mashigo wanted clarity on the progress of the results bill from the DPME’s consultation with departments, because that would maybe assist the DPME going forward.

The DPME replied that it saw its core responsibility as driving the outcomes. Those outcomes were agreed to in 2009, and it had to give a progress report in 2014. It provided feedback and reported to Cabinet quarterly. The FSDM component and MPAT were underpinning programmes that were trying to sort out the basics. The DPME was not the only institution doing monitoring; it fully accepted that departments had to have their own M&E mechanisms and that was important. Even the DG was on record as saying that the DPME would be successful when there was no longer a need for it. It was simply the M&E arm of the Executive; it did not replace the oversight work of Parliament and other institutions.

There were still ongoing discussions on the results bill and there had not been much progress, but in the work that the DPME was doing it had not found any hindrance. Working with OoPs was extremely advantageous, and the DPME's experience was that OoPs had adopted the DPME's approaches and principles and were quite keen themselves to monitor frontline services, monitor institutional performance and act upon the findings.

The Chairperson remarked on the DPME's last comment about every department having capacity and budgets. It was something of an anomaly for those very same departments to say that they had no budget for M&E of their work. That was something on which the DPME needed to engage provinces, because there were clear mandates for all civil servants. The DPME seemed to be unaware of Parliament's role in monitoring. The Committee should have been receiving the reports more often and, as part of its oversight, it should be invited to go on scheduled monitoring visits.


Management Performance Assessment tool (MPAT): implementation progress
Mr Ismail Akhalwaya, DPME Acting Director-General: Public Sector Administration and Oversight, presented progress on the Management Performance Assessment Tool (MPAT) and on performance assessments review for Heads of Departments. (See documents)
 
The reasons behind the MPAT were that the outcomes were the core reasons for the DPME’s existence, but it and the state had identified in various reports that improved management practices were also key to service delivery. Many departments had complained that the Public Finance Management Act and the Public Service Act were constraints, but DPME had found examples of departments that were operating very effectively within those frameworks. It was also very important to link how an institution was performing to how the head of that institution was assessed, and DPME was working with the Department of Public Service and Administration (DPSA) to review the policy and ensure that it was put in place.

The DPME had adapted the tools seen in other countries and had contextualised them for South Africa. There was a high level of correlation between MPAT assessments and the Auditor-General of South Africa (AGSA)’s findings. The advantage of MPAT assessments was that they served as an early warning. The 2011/12 report was based only on self-assessment results. The methodology of MPAT was that it was based on self-assessment, but then went on to external peer moderation. Detailed peer moderation had been done on 156 self-assessments from all departments, but DPME was still consulting with departments and would be releasing those results by June after it had gone to Cabinet.

The DPME had identified 31 core management standards. The advice received when looking at international models of similar tools was to keep the tools small and simple.

As to methodology, the process started with senior managers sitting around a table and assessing themselves. The assessments were done annually.

As to moderation, DPME recruited senior national and provincial policy experts, who were civil servants that did that additional work. The DPME also identified good practice during moderation, since it was looking at those departments that could provide evidence that they were operating at a high level.

MPAT ratings
Level 4 was a DPME addition, introduced when it decided that it was not enough to just operate within the legal prescripts. The difference between level 3 and level 4 was that level 3 was compliance just because the Public Service Commission, DPSA or National Treasury required it, which would have no effect on changing how a department performed. The DPME expected all departments to be at level 4, since stopping at level 3 would mean they were just being compliant.

From the MPAT report 2011/12 results, it could be seen that 78% of departments were either at level 1 or level 2; that was very useful data which indicated that there was still a lot of work to be done in ensuring that departments moved to at least level 3, but preferably to level 4.

Improving Management performance
The main aim was to encourage departments to learn from each other without the need for extra budgets for expensive consultants. It was not good enough just to set the policy. The DPME needed to go out to help the departments comply.

Value additions of this process
One could fix a problem only when one owned the problem and identified the problem, so self-assessment was important in that the DPME sought to find management acknowledging its problems.

Limitation of this process
There was a risk that departments could be doing the wrong thing very efficiently.

Revisions to Head of Department (HoD) performance review system
The DPME was trying to change the status quo where the Executive Authorities (EAs) were the panel and it was difficult to get a meeting with all of them. The proposal was that one would have a technical HoD panel assessing peers, which was a model applied in other countries. It was quite a progressive step to put pressure on the Ministers to respond in time.

Discussion
Mr Gelderblom commented that there would come a time when departments could no longer say that there was not enough capacity. It was now for departmental leaders to take responsibility for bad planning and leadership by their managers. He asked the DPME to comment.

Ms Yengeni wanted a list of the national departments that had not submitted their annual reports.

The Chairperson asked again whether the new report would be very different. He asked whether the service delivery improvement mechanism meant that the DPME had visited departments before, or whether the DPME was still to visit for feedback. When it spoke about annual reporting where some departments did not have internal audit arrangements, was the DPME talking about national departments or was it including provincial departments?

The DPME replied that the 13% of departments without internal audit arrangements from the MPAT results report was a combination of national and provincial departments.

The Chairperson said that the Committee did not expect that from provincial or national departments, because it was a basic requirement for all departments to have internal audits. He repeated that earlier the Committee had asked what it was supposed to do with the report if it was already too late for improvements to be implemented. The Committee was looking forward to the new report, for 2012/13. When the DPME spoke about risk management arrangements, did it include issues of procurement?

The DPME replied that there was a separate standard on financial management, as given in chart 5 of the MPAT results report 2012/13.

The Chairperson asked if DPME could take the Committee through that chapter again.

The DPME replied that from chart 2 it could be seen that only 14% of departments were doing evaluations, and that was in 2011/12. In strategic planning and alignment, departments seemed to be doing fine, although there was concern that there were departments at level 1. From chart 3 it could be seen that close to 80% of departments did not have the minimum service delivery improvement plans in place. The functionality of management structures was also bad.

The Chairperson said that when the DPME talked about the functionality of management structures it should be self-explanatory.

The DPME replied that the standards were very clear about what a level 3 management structure was. The DPME would expect a department at level 3 to have terms of reference for its management committees, to have minutes, and to have action items on which to follow up. Each of the documented standards had a set requirement of what was needed at every level.

There was a standard that required financial disclosures and the DPME was monitoring whether departments submitted financial disclosures on time to the Public Service Commission (PSC). More than 50% of departments were non-compliant with fraud prevention.

Mr Akhalwaya read through the charts from the MPAT results report of 2011/12, but emphasised that those were self-assessment results and not moderated scores. There had been areas of improvement and of stagnation since the publication of those results.

Mr Swart said that the report did not clarify which departments were in trouble.

The Chairperson said that maybe that report was good for the beginning.

The DPME replied that the Cabinet had decided that DPME should release those results of the national departments to the public. The DPME had been called and had presented to certain portfolio committees on those results, but could make them available to the Committee.

The Chairperson had received a report on alignment from the PSC, and it was the Committee's work to worry about the lack of alignment between the DPME's annual performance plans and spending. Soon there would be strategic planning. He asked if it was possible that during the year there could be assessments of those annual performance plans vis-à-vis the expenditure, so that it did not take the whole year to find that spending had been directed elsewhere while the Annual Performance Plan (APP) pointed in a different direction.

Ms Yengeni commented that Members had a right to get reports on departments before they were sent to portfolio committees, without having to make a special effort to retrieve them.

Mr Swart added that the Committee’s work was to see whether the spending that the Committee appropriated was correctly allocated.

The Chairperson said that some of the reports on which it was meant to work took a very long time to arrive and by then problems would have arisen and not been resolved.

The DPME noted the Committee's points and comments, and said that from July it would be available to the Committee for engagement on its reports. It would also formally table the report cards of the last round, for 2011/12, to the Committee. It had discussed with the Portfolio Committee on Rural Development and Land Reform its Department's 2012/13 results.

Ms Yengeni was not sure whether it was proper for DPME reports to go to Cabinet before coming to the Appropriations Standing Committee. The DPME had to understand that the fact that Cabinet had seen the DPME's own assessment before the Committee did not make the assessment accurate, because the DPME would come back to the Committee and find that there were differences on certain issues. The DPME should not then expect the Committee to agree with its report findings even if Cabinet had approved them. It was important to have sufficient confidence in the Committee to agree to disagree on assessments, so that when the DPME went away it would have engaged with the Committee and been made aware of anything that it might have missed.

The Chairperson emphasised that the Committee wanted the reports whilst they were still being considered so that it could assist the DPME. There was a good reason for the establishment of DPME, but it was really necessary to find a way of working together, and that was why the Committee had to retrieve certain reports by its own efforts. Maybe DPME was not used to working with a parliamentary Portfolio Committee as it focused more on Cabinet reporting. DPME should internalise the fact that the Appropriations Standing Committee was its portfolio committee. He repeated that DPME’s reports should come to the Committee before they went to other committees, so that the Committee could understand the DPME’s challenges. Moreover, the Appropriations Standing Committee was working across departments.

The meeting was then adjourned.
 
