Frontline Monitoring and Citizen Based Monitoring Tools; Performance Agreements of Heads of Department

Public Service and Administration, Performance Monitoring and Evaluation

18 September 2019
Chairperson: Mr T James (ANC)

Meeting Summary

The Committee was briefed by the Department of Planning, Monitoring and Evaluation (DPME) on frontline monitoring and citizen-based monitoring tools, and received a status report on the filing of Director-General (DG) and Head of Department (HoD) performance agreements.

The DPME described the monitoring tools in nine bullet points, and said frontline tools such as the Presidential Hotline and frontline desks at government departments did not just monitor the quality of services at the institutional and facility level, but also provided citizens with an opportunity to complain about services rendered or to query issues. In addition, they allowed the government to monitor itself. However, there were some service-related issues that had proved to be complicated and challenging, thereby requiring a multi-faceted approach and the assistance of a number of other government departments.

The Committee asked what tools were used for monitoring in the rural areas, because hotlines involved cell phones, and this often posed challenges. They also wanted to know if there were ways of measuring customer satisfaction, and whether the Department sent people to the areas to ask communities about their satisfaction. Because information communication technology (ICT) was an integral part of the monitoring process, did they give consideration to the risk of interruptions leading to non-delivery of services? Did the Department have ways of avoiding a duplication of services between itself and the Human Rights Commission (HRC)? Did it have a programme, or someone responsible, for citizens’ education to ensure communities cooperated in looking after government infrastructure? Were there consequences for late submission, or non-compliance with performance management requirements?

The Department gave a report on the HoDs’ performance management and development (PMD) system prior to 2018, showing the key result areas as constituting 80%, and the core management competencies as 20%. One of the challenges was the concern expressed by Cabinet that DGs received bonuses while their departments had qualified audits. Another was the backlog created due to evaluation panels not sitting, and DGs not receiving pay progressions.

Meeting report

The Chairperson said the purpose of the meeting was to hear a briefing by the Department of Planning, Monitoring and Evaluation (DPME) on the frontline monitoring and citizen-based monitoring tools, after which the Committee would also receive a status report on the performance agreements of accounting officers. Frontline services were at the core of service delivery, and it was therefore crucial that the Department gave an account of its activities across the different provinces.

 DPME: Frontline Monitoring and Citizen-based Monitoring Tools

Dr Neeta Behari, Head: Frontline Service Delivery, DPME, said the Department was the interface where government met citizens, so it was necessary that the findings of her unit be interrogated. She explained the National Development Plan (NDP) Vision 2030 using a diagram, and described the DPME monitoring systems at the national, provincial, local and community spheres, showing the purposes of each sphere and the DPME systems used at each one.

She explained frontline monitoring and support (FM&S) in terms of its strategic objectives, its purposes and its functions. She described the frontline monitoring support systems as an integrated approach, using a diagram to show the DPME’s interconnectedness with the Executive monitoring support (Izimbizo and Siyahlola), the frontline monitoring and support, the Presidential Hotline and citizen-based monitoring (CBM) using civil society and citizen feedback.

Dr Behari said FM&S was an integrated approach that had emerged from a strategic reconfiguration process in the DPME in the 2018/19 financial year. The integration brought together the methodologies and tools of frontline service delivery monitoring (facility/sector-based), executive monitoring, citizen-based monitoring and the Presidential Hotline into a seamless monitoring system, with the intention of extending the coverage of frontline monitoring, improving resource efficiency, enhancing effectiveness, and better coordinating frontline monitoring activities. This monitoring was done jointly with the Offices of the Premiers (OTPs) across sectors and the three spheres of government -- national, provincial and local.

Different approaches were utilised in FM&S, such as an area-based geographical model of monitoring, which included sector monitoring of facilities and projects, with an emphasis on the coordination of government stakeholders across all spheres of government in the identification of challenges facing communities and in improving services. There was sector-based monitoring around national and presidential priorities, based on the executive monitoring approach and a rapid response approach. Typical examples were the inter-ministerial committees, comprehensive social protection and the North West Intervention 100.

The integrated approach raised some sensitive issues in frontline service delivery, such as the logical framework components cutting across all the FM&S tools. She listed the five steps that were used in all the FM&S tools for frontline service delivery:

1. Planning

  • Develop FM&S programme toolkit; programme review strategy; monitoring approach; targeting strategy; training toolkit; knowledge management strategy;
  • Programme review and reporting strategy;
  • Develop joint annual implementation plans;
  • Conduct pre-research to identify priority areas for monitoring;
  • Training.

2. Implementation

  • Conduct unannounced/announced on-site monitoring;
  • Site verification monitoring;
  • Community/citizen-based monitoring;

3. Communicating and committing on results

  • Hold feedback sessions to present findings; 
  • Conduct root-cause analysis and prioritise areas of improvement;
  • Finalise improvement plans;
  • Agree on reporting format and timeframe;
  • Communicate monitoring and feedback plans to broader community.

4. Improvement monitoring

  • Identify areas for facilitated monitoring by the DPME and OTPs; governance structures to monitor commitments;
  • Multi-stakeholder engagements;
  • Update improvement plan based on the stakeholder engagements;
  • Conduct improvement monitoring and site verification based on reported progress;
  • Update site/facility monitoring reports.

5. Programme reviews and reporting

  • Provincial Programme Reviews
  • Review Workshop
  • Integrated Reporting
  • Produce Knowledge Products

Citizen-based monitoring (CBM) aimed to strengthen government’s ability to involve communities and citizens in monitoring service delivery, and to ensure government was responsive to the experiences, expectations, perceptions and needs of communities. The DPME was not the implementer of citizen-based monitoring, but rather facilitated and built the capacity of government officials and departments, and served as an institutional repository for CBM methodology, good practices and approaches. It was a knowledge partner providing support to government institutions which undertook to implement citizen-based monitoring.

Dr Behari said FM&S was working towards bringing together the Departments of Cooperative Governance and Traditional Affairs (COGTA), Public Service and Administration (PSA) and the OTPs into an integrated frontline service delivery monitoring system. The Community Development Workers Programme (CDWP) was integral to this work, with the aim of activating ward level information as a critical component of an integrated service delivery reporting and monitoring system that links communities and the three spheres of government. The work was ongoing, with Gauteng testing a collaborative model encompassing all community workers from all sectors.

She gave examples of citizen-based monitoring in Phokwane in the Northern Cape in the form of ward-based community dialogues, which were spearheaded by the provincial OTP and the DPME. These dialogues were chaired by ward councillors for their respective wards, and the findings included poor roads and infrastructure, weak oversight and accountability mechanisms in the municipality, delayed formalisation of settlements, lack of implementation of the integrated development plan (IDP) and political-administrative tensions hampering service delivery. Consequently, multi-sectoral stakeholder engagements were held to improve the IDP planning processes with the provincial Cooperative Governance, Human Settlements and Traditional Affairs (COGHSTA) department, officials from the municipality and the municipal infrastructure support agent (MISA).

She explained how Presidential Hotline (PH) complaints were received and captured, and how the integration approach was used by the frontline monitoring team to directly intervene and facilitate complaint resolution in complex cases. This had yielded positive results.

She used diagrams and tables to illustrate situations involving the Presidential Hotline, the top ten service delivery complaints of national departments, and the national departments with high call volumes as at 31 March 2019.

The key challenges of the PH were that the toll-free facility was costly, complex cases were reported and there were unrealistic expectations from citizens. In the bid to overcome these challenges, the PH was being redesigned for two-way engagement through technological changes, in partnership with the Department of Science and Technology and the Centre for Science Innovation and Research.

2018/19 overall monitoring findings

Dr Behari used tables showing the 2018/19 monitoring coverage and executive support to provide a better understanding of the monitoring findings. She said the findings revealed that there was:

  • inadequate maintenance of buildings;
  • inappropriate infrastructure or accommodation for the rendering of services;
  • inadequate staff allocation or the long turnaround times for the filling of vacancies, particularly in health and education;
  • inadequate or malfunctioning equipment that hinders effective service delivery;
  • lack of bulk supplies (water, electricity, waste removal) at several service sites;
  • compromised water and sanitation service delivery;
  • unstable/unreliable network connectivity contributing to long downtime of ICT systems, interruptions and non-delivery of services;
  • poor contract management with service providers;
  • non-implementation of improvement recommendations due to poor intergovernmental relations, inadequate planning and budget allocation, and non-communication of findings to executive structures;
  • shortage of critical technical skills and capabilities in local municipalities to ensure effective contract management and delivery of bulk service supplies;
  • weak working relations between local municipalities and citizens;
  • shortage of ambulances and police vehicles; and
  • ineffective operational management, resulting in delays on the delivery of services.

There were also systemic challenges in service delivery. Conditions found through frontline monitoring on the ground were often not reflected in the IDPs or sector departments’ Annual Performance Plans (APPs), pointing to poor planning and poor alignment of plans across government. There was also a lack of clear understanding of the mandates, roles and responsibilities across the three spheres of government, and within sector departments, on how to deal with issues, resulting in delays and ineffective policy implementation. Other challenges included poor project management in infrastructure delivery; difficulty in planning and sequencing actions across sectors; a lack of consequence management, resulting in a lack of knowledge of systemic accountabilities; the development of programmes by national sector departments with minimal provincial involvement; the persistence of delivery blockages due to weak coordination across inter-governmental structures, both vertically and horizontally; and poor partnerships between departments to maximise resources in order to increase monitoring coverage and visibility.

Successes and way forward

However, there had been successes recorded in frontline service delivery, such as cases where the adoption of an area-based approach exposed the complexity involved in the implementation of service delivery and the need for a systems approach; the DPME being able to exercise positional authority in getting multi-sectoral and intergovernmental decision-makers into one room to make concrete commitments on improvements; and the Department having energetic, passionate and experienced monitoring and support officials driving delivery from both the DPME and the OTPs.

Other pockets of successes included the usefulness of the Presidential Hotline, the Accelerated Schools Infrastructure Delivery Initiative (ASIDI) verification monitoring, overall findings and analysis at various rural schools and clinics, and the monitoring of the SA Social Security Agency (SASSA)/SA Post Office (SAPO) grants transition. The monitoring of governance structures was bringing local government, provincial and national departments together, and a public liaison forum had the potential to strengthen networking across government and citizens. There had also been presentations on innovative models of practice and indigenous knowledge generation by DPME staff.

In regard to the way forward, Dr Behari indicated that FM&S would strengthen two-way government-citizen engagement, and participatory governance would be promoted through all the frontline monitoring tools. As frontline monitoring had been institutionalised across government through the integrated monitoring framework, it would ensure the complexities on the ground were considered in policy making, and bridge the gap in ‘top-down-bottom-up’ decision making.

Mr Thulani Masilela, Outcome Facilitator: DPME, described what the Department had done in the financial year, and gave case studies using the Mafikeng provincial hospital, and the Tlhabane and Tlhakgameng community health centres (CHCs). A newspaper article had exposed the poor state of the hospital, and the DPME had made a rapid response visit to the hospital in July 2017. He explained the reason for the visits to these places, listing the positive and negative findings, and the communication of the DPME’s findings. After this, he gave a summary of DPME’s observations, which included the success reports and main challenges. Finally, he gave the conclusions and the recommendations for the way forward, making frontline service delivery even more successful.


Ms C Motsepe (EFF) asked the Department to explain the tools they used in monitoring rural areas. This was because in the rural areas there were usually challenges that needed attention, and when one talked about hotlines, this involved cell phones, which posed challenges. It required that the rural users had to give feedback to the Department. She wanted to know if the Department had ways of measuring customer satisfaction and if they sent people to the areas to interrogate communities on their satisfaction. Did they have ways of ensuring that those officials did do what they had been sent to do? Her reason for wanting the above information was the example of the Mafikeng project, as the Department would be wrong to assume that the problems were over. They had to visit to ensure that what they did was working, and where there were issues, they should be rectified accordingly.

Dr Behari responded that the Department was getting better, as it was reaching the very hard to reach areas. The question on the tools used in rural areas was very important to her, because it extended to a lot of the other questions, as well as the issues of coverage and reach. She explained that when the DPME decided at the beginning of the year on where to go, they had a targeting strategy based on a formula, and basically decided to go to areas where there was need and where there were extreme cases of poverty. This was done in collaboration with their partners, as a process.

It was possible she had not articulated the Presidential Hotline very well, because it was part of the Izimbizo programme. Therefore, when they got to the rural areas, they put queries to the Hotline. Being a toll-free number, this extended the service. This was why they were trying to make more use of technology to get the hotline into areas which they could not reach before.

On whether they measured if the communities were satisfied with the Presidential Hotline or not, she explained that their hotline did have a citizen satisfaction survey attached to it, so they could reach out and measure customer satisfaction. The citizen satisfaction surveys that took place in the IDPs involved the community development workers (CDWs) who were part of the community. That was why they were monitoring the work of the ward committees, and one of the questions would be whether they involved the CDWs, and the relationships with them. Many provinces had their own ward committees, but in spite of that they were not using community feedback enough. She added that if a Councillor could not solve problems, the trust of the people in the government would be low.

She could not say for sure whether all CDWs were trained, but when working on frontline service delivery, training was provided to them. That had been happening in the Northern Cape, where they were used to monitor Cooperative Development Agency (CDA) programmes in the public service.

Ms M Kibi (ANC) referred to the issue of low monitoring coverage in North West province, and asked why this happened when it came to local government. Since information communication technology (ICT) was an integral part of the monitoring process, she wanted to know if the unit engaged the Department of Public Service and Administration (DPSA) on the ICT risks with regard to interruptions and non-delivery of services. Did the Department envisage that the government’s intervention framework was aimed at solving the issues raised? At what stage did the CDWs monitor the frontline services, and if they were involved in frontline service delivery, were they offered any training on how to use FM&S tools?

Dr Behari, responding on the ICT infrastructure, said they were dealing with that with the State Information Technology Agency (SITA) and the DPSA. Most of the problems were nationwide, so they had to deal with them aggressively by bringing the departments together to give accounts on them. Part of their approach to frontline service delivery had been unpacking the mandate of the Department. Even when it was a case-by-case problem, they called the departments together and everyone unpacked some of the difficulties. They then came up with multi-sectoral methods of operation. This was one of the areas where they had pockets of success.

Ms R Lesoma (ANC) referred to the Department’s relationship with Chapter 9 institutions, because they were funded by government and tended to deal in most cases with the same issues. Therefore, it would be good if the Department explained if there was any synergy in terms of the issues that they picked up, and if the DPME monitored such cases in other areas. On the Mafikeng project, she said that there should be a real report, and not work based on mere news carried by newspapers. There had to be a tool for measuring inter-governmental relationships that addressed the real issues, instead of depending on newspaper stories that came out during the elections. She suggested that the Department should have service delivery plans which did not necessarily involve more money, but which ensured that issues were dealt with positively.

Dr Behari said that one of the changes in methodology used in the Mafikeng project was the method of rapid response to identify areas where problems were emerging. This was why they had to work with CDWs, because they were the eyes and ears of the government. They were trying to get the monitoring into one integrated system. She thought the location of the CDWs was a problem, because they sat outside of COGTA and the DPSA. The Director of Community-Based Monitoring was presently coordinating the North West intervention, and assisting the administrator in the North West. However, they did not count it as part of their intervention in North West, because they traditionally measured interventions in terms of facilities, reported it as part of their annual performance plan, and also reported to the Committee. Every member of the Department had played an important part in the interventions in North West. This had become food for thought for them to look at how they measured their rate of response. North West had been targeted because the community development programme had become a rapid response issue that needed questions to be answered quickly.

The CDWs were a COGTA and DPSA programme, but what the DPME was trying to do was bring them together in one integrated system. They had been dealing with them, as could be seen in the case study of Gauteng and some of the other monitoring processes, and they could use that information more efficiently, as a lot of government resources were being spent on this. Therefore, they were also looking to find out how the frontline services could play a bigger role in making government facilities more efficient, so they were going to be very active in the public liaison forum of 2019.

Dr L Schreiber (DA) asked the Department if they had ways of avoiding duplication of services between the DPME and the Human Rights Commission (HRC), because they seemed to be operating on issues that were related. Could more information be provided on two-way communications, because the issue was that people had expectations? Did the Department have a way of communicating in real time, and in a way that would prevent the issue of unrealistic expectations from arising?

Dr Behari said the Department used human rights cases that came from the HRC in developing its targeting strategy. She gave instances of cases from their frontline work, when people reported problems to them. People did not just report on the Presidential Hotline, but also sought out her email address and emailed her, copying up to 200 people, which made her an individual hotline. However, there were ways they could deal with this without compromising their independence, because once the Department started intervening, the people got worked up. This meant that they had to be careful about how they used the Presidential Hotline, because it could be misconstrued that they were using the muscle of the Presidency.

They had previously worked together with the Public Service Commission (PSC), and those engagements were part of the forums this year. They were trying to streamline without compromising their independence, and they were also using the PSC in their targeting strategy to ensure that the DPME did not duplicate where they were going. The PSC’s mandate was likely to differ from that of the DPME and the HRC, but nevertheless the DPME could engage with it more than with the independent bodies. Many of their hotline complaints were reported there, but when there were issues like corruption, they sent them directly to the PSC, because they did not want to be a corruption hotline. This had also come out from earlier engagements with the communities.

Mr B Maneli (ANC), commenting on the involvement with local government, asked the Department if at any point it was discovered that the disjuncture while working in terms of IDPs could be a result of the delegates used by the departments, especially at the preliminary phases. He asked if there were cases where issues should be dealt with by senior personnel, or if it was just a case of sending anyone and expecting compliance with directives. Regarding the recommendations presented by the Department, he wanted to know if there was feedback on their findings to establish if the issues were being attended to. If the feedback was picked up during monitoring, the issues could be attended to before they hit the headlines. He suggested it would not be good to give grants to municipalities to improve their infrastructure without making provision for the maintenance of the services.

Dr Behari said the reason they had those pockets of success was because the DPME used its strongest muscle to bring the highest people to account when things went wrong. Where local governments were involved, they got the local officials in the municipality around the table, including the municipal manager, so that they could give an account. What the Department had learnt was that they had to use their muscle. Even when they called people around the table to unblock issues, they had not dealt with this only at the local government level, but had also called in the Department of Cooperative Governance, because it coordinated local government and was able to understand some of these issues. Involving COGTA at the highest level to account around one table was what their district model was all about.

Ms Lesoma agreed with Mr Maneli that the issue of maintenance was very critical, because after one had solved a problem, there should be some form of supervision. She therefore asked the Department if there was a programme, or someone responsible for citizen education, because from what they had been shown, there had been an instance where a toilet door that was initially put up, had been removed without anyone accounting for it, and that implied that the Department would have to return to put back the door. There must be ownership by both the service providers and the receivers of the service so that they would both be committed to taking care of the facilities.

Dr Behari responded on whether they included government structures, and said that when they went out to monitor, government structures were included as one of the ways of ensuring ownership. Regarding the door that was removed, she agreed that where such issues arose, they did have to get ownership from the community if they wanted that door to be looked after, and this was why they were including government structures. They were not a big unit, but they were trying to extend their reach by including government structures in their monitoring programmes as public representatives.

The Department would have to look at the protest on Ithukwane, going forward. As part of the 25-year review of the integrated government processes, it was the report from the DPME that dealt with the issues on the ground, including that of Ithukwane. They had to be a responsive government, and this was why the new district model had come out. The Presidency had taken the lead, with COGTA and the DPME, in the planning by bringing senior officials into the process, and this also answered the question of having senior officials around the table. She accepted that there had been a shortcoming in that they were not getting the right people into the IDP planning processes, and that was the reason for the new plan.

She said the DPME did not call the Directors General (DGs) every day but when there was a big issue, they were the officials who were held accountable, and the performance information/bonuses had been brought to the notice of the arm of the government responsible for this, which would be the chief operating officer (COO) of the department. The DPSA was aware of the major management issues they were having, and the whole performance management unit had to be looked at again. In her own unit, they had been forced to look at a lot of issues that did not go according to the regulations and in terms of guidelines. Therefore, Human Resources (HR) had been working very strictly on these guidelines in the last year.

Taking them back to the case-by-case issues she had mentioned about buildings earlier, she said that what they had found in many cases was that they were not sending the improvement plans to the right people. Service delivery was complex, and it was not just the Department that was responsible for the buildings -- there were issues of dealing with the wrong departments. They had to get the DPS and the local municipality around the table to make the issues the responsibility of various officials to resolve, leading to the pockets of success mentioned earlier.

Mr M Malatsi (DA) observed that the Presidential Hotline and the community-based service had proven to be useful tools in facilitating the monitoring of service delivery improvement. He asked the Department if there was any benchmark for measuring the number of calls in countries with populations similar to South Africa’s.

Dr Behari said that the hotline was a very useful tool, and they were through their pre-development phase and building up the capability by sending information through a range of social media. On the issue of comparability to other countries, she could not answer the question at the moment, but when they did their presentation at the Durban University of Technology (DUT), it had been discussed. The discussion had been in partnership with the SA Sociological Association, and the participants had been very surprised that the Department had a Presidential Hotline. She was not sure if they could get a comparison, because the hotline was not a complaint management system in itself, but they would look into it.

The cost breakdown was one of their key delivery factors, because technology was getting cheaper. A report was presently on her table to see where they could cut costs on the hotline. To achieve cost reductions, they would benchmark not just against other countries, but against themselves as well.

Ms M Clarke (DA) said she had done some research on the Presidential Hotline, and in the research the HoD had stated that they had an 80% success rate. She asked if the 80% success rate was just for frontline service delivery. Regarding bringing the highest people around the table, she was worried about the real management of officials operating within the human relations sectors. Her worry was because she had been a part of local government for 15 years, and the officials were never to be found in the departments. She therefore believed that there should be some sort of synergy, not just in terms of service delivery and how one got things done, but also in terms of staff morale and how they did their jobs, instead of getting salaries every month and not doing what they were paid to do. She added that she had worked at the council for many years, and not once did a community development worker come to the community. This was one of the things that needed to be monitored, because they were supposed to be assisting from a frontline service point of view so that things happened correctly. Even in her spirit of collegiality, she had tried to involve them, but not once did they attend meetings.

Dr Behari said the success rate of the Presidential Hotline had been presented to the Committee previously. Her definition of the success rate was that it was usually a call centre complaint that was reported to the Department, and the quality of the resolution was what they would be looking at henceforth. She explained that that was why they also attached satisfaction surveys to look at the correlation between the satisfaction and the resolution rate.

She thought the Department needed to get the morale of staff up, especially when they considered the Mafikeng project and how it had played out. However, she could not say which member of staff was working and which one was not working, but whenever such information came to them, they always reported them. Being a public service sector, they needed to appreciate their staff to ensure that they had genuine energy, passion and the experience that counted. She was glad to state that they did have pockets of good staff and they continued to build the positives. This could be achieved by engaging in initiatives that could provide personal gratification.

Mr Maneli asked if there were any reports about the maintenance of buildings, because if there were reports and issues were picked up earlier, serious problems could be avoided later.

Dr Behari explained that as part of the planning processes, in the new APP framework where the infrastructures were highlighted, the maintenance of the buildings was also highlighted as part of the planning for the infrastructure. However, though it was one of the issues that had been marked to be attended to in the APPs, she could not give an absolutely confident response on that because it was in another unit of the Department, and it was still at the beginning stage of the new administration.

Finally, she said that the legislation and policy were conflicting, and this was also where they were hoping that the integration and the integrated approach of the monitoring framework would be very useful. Their planning framework would assist them in bringing government together. The self-improvement part of the planning was also very important, because a lot of government resources were being spent on a lot of those processes.

DPME: Status report on filing of DG/HoD performance agreements

The Chief Director (CD): Public Service Monitoring, DPME, gave a report on the HoDs’ Performance Management and Development (PMD) system prior to 2018, showing the key result areas as constituting 80% and the core management competencies as 20%. He explained that contracting and assessment was between the executive authority (EA) and the DG. Evaluation panels consisting of EAs had been in place to finalise the evaluation of DGs, but the Ministry of Public Service and Administration (MPSA) had issued a directive in 2015 doing away with the panels, so that evaluations were completed between the EA and the DG and provided only for pay progression.

One of the challenges was the concern expressed by Cabinet that DGs received bonuses while their departments had qualified audits. Another was the backlog created due to panels not sitting and DGs not receiving pay progressions.

He described the new HoD PMDS, introduced by a directive issued by the MPSA in terms of Section 41(3) of the Public Service Act, 1994, on the Performance Management and Development System (PMDS) for Heads of Department (HoDs), which was approved in December 2017 and came into effect from 1 April 2018. He said the DPME was the secretariat supporting the Presidency for HoD evaluations, and was responsible for overseeing the implementation of the directive for DGs from national departments, national government components and the Offices of the Premier (OTPs) in each province, while the OTPs were responsible for the HoD PMDS in the provincial departments.

The CD outlined the stages and timelines of the performance cycle as at 31 March, the DPME/OTP role, the key dimensions of the PMDS for HoDs, the submission status of DGs/HoDs from the national and provincial departments and of CEOs from national government components, the status of performance agreements received for the 2018/19 performance cycle, the due dates for the 2019/20 performance agreements, the mid-term review, the status of submission of the 2018/19 mid-term reviews, the annual performance assessments, the important due dates for the annual evaluations of HoDs’ PMDs, and the outcomes of evaluation.


Ms Clarke observed from the status submission that they had not complied 100%, and asked if there were consequences for late submission or non-compliance, and whether the ministers and DGs should not be held accountable when this happened. In her opinion, if submissions were not made and DGs were not held accountable, they were also failing in their mandate.

Mr Malatsi referred to an opening remark, that in order to improve the Department needed to get support from the presidential and executive authorities. He therefore wanted to know what efforts had been made towards getting that support to get over the obstacles that seemed to be standing in the way. In his understanding, performance was not necessarily the sole reason for getting pay progression or incentives, but also whether the performance of the individual aligned with the overall mandate of the position. He therefore asked how they measured the effectiveness of the individual in those posts in instances where there had not been any submission of this performance, because there would be no means of measuring whether the individual was still suited or still performing according to the requirements of his/her contract.

The CD said that they would always appreciate support from the Committee to assess the DGs or HoDs, in getting compliance and getting the system to work properly. Through past experience, the departments that were not doing well would be identified, and then that would be dealt with.

Dr Schreiber said that on the one hand the CD had mentioned that they were to some extent satisfied with compliance when it came to setting up the new performance agreements and the new system, but on the other hand there was the shocking statistic of 33% compliance when it came to the mid-term review. He asked how to reconcile the discrepancy, as the people who did not comply at mid-term had seemingly complied at the beginning. He liked the fact that the Western Cape was at 100%, but the rest were below 50%, 40% and 35%. If this was how these provincial offices normally dealt with performance reviews, how were they expected to hold them accountable? He wanted to know how non-compliance issues got condoned, seeing that 77% of departments did not comply. Would they be held accountable or not -- was it a strict matter, or were they just left without any penalty for non-compliance? He was confused, given that they had heard from the Minister that he would do away with performance bonuses at the senior level, so how did this fit in with that plan, because at the moment it sounded like bonuses were the main incentive for people to comply. If there was a plan to do away with performance bonuses, would it affect the system, or was there somehow a separate performance system where there would still be an incentive? He said there should be some clarity on these issues. Regarding what the presenter had said about the system depending on performance agreements being concluded between the President and the executive authorities, was the country far along with this system? Would it help to get a sense of where they were in the process? Finally, he wanted to know if there was something that could be concluded soon so as not to hold up the workings of the Department.

Ms Kibi indicated that the CD had said that in 2015, the Minister of Public Service and Administration had issued a directive doing away with evaluation panels, with performance evaluations conducted between the DGs and the Executive instead, and she therefore wanted to know if there had been any complaints now that evaluations were no longer panel-driven, and whether there had been good compliance as a result. Asking for clarity, she said that prior to the issues being brought to the Committee, concerns had been expressed that DGs received bonuses while their departments had qualified audits.

Responding on the 2015 directive, the CD said that though the PSC had indicated grievances in relation to the performance management agreement, somewhere along the line the agreement had come through and assessments had been done. Consequently, the PSC would manage those grievances. He was not aware of any complaints brought by the DGs about evaluations based on the 2015 directive.

The Cabinet's concern with the system was that part of the 2015 directive had resulted in no bonuses being paid to any DGs since 2015. They needed to get to a point where good performance could be recognised, by giving people some acknowledgement that they were doing good jobs. It was equally important to recognise non-performance and deal with it.

The reason DGs and HoDs were excluded was that technically, if someone was still appointed at the DG level, that person was being remunerated at that level and would also be assessed using the 2015 directive. Unfortunately, the PMDs for Deputy Directors General (DDGs) looked the same as the ones for DGs, although there was a bit more weighting for DGs. The department’s performance was assessed and evaluated, but there was no instrument for doing that because they did that more directly, and as such the performance evaluation could also be done for the Department.

Mr Maneli reiterated that performance reviews were not just about incentives, and there should rather be a rationale as to why people who were active in their positions would be excluded from the system, because they had similar responsibilities towards the department. He was raising the issue because some of the reports they had received indicated that people in the departments were affected, so there was a need to understand the performance of the HoDs. Attention should also be given to those who had not submitted, especially those still in positions of authority, in order to know how the department had performed.

The CD agreed that the PMDS was not about bonuses. The new policy of the DPSA dealt with the performance management system, and assessed whether their methods were fully effective or not. Through the PMD system they identified performance levels, as the Department envisaged that performance dialogues could be held to provide opportunities to support DGs and HoDs.

He said that performance management should be a continuous process, a continuous dialogue between the supervisor and whoever they were supervising, for the system to work and to manage people’s performance. They were trying to instil this discipline in terms of performance. Evaluations would be done for all 150 DGs and HoDs, so whether they qualified for a bonus or not, the Department would still facilitate the evaluation. Even where there was no performance agreement in place, the panel would still perform an evaluation of the DGs, and from there the Department would grow the existing information. Consequently, the Department would rely a lot on annual reports and the AG’s account. However, there were other monitoring processes whose outcomes they were looking at, such as the Programme of Action (POA) systems, and they would try to evaluate the performance of DGs at the same pace. As a result, they would be evaluating everyone, irrespective of whether they qualified or not, in order to identify areas of performance and the support and assistance needed.

Mrs Motsepe expressed her concern at how a newly appointed DG could be expected to submit a report on PMDS within just three months. She asked if the new directive from the Minister of Public Service and Administration regarding panels had reduced the backlog in the evaluation process.

The CD responded that the three months was only for the DGs to submit to him, because it was felt that once one was appointed, three months was enough for one to understand one’s department. The three months was only for one to understand the agreement, and ideally in six months one would have a full assessment of performance.

Ms Lesoma said she would not accept that because there was a new administration, things should disappear in the Department. The government and the people of South Africa were always there and as such she would go ahead and give her points:

  • Ministers come and go, but Departments were always there;
  • She did not hear the roles of EAs and their responsibilities for the reports;
  • If the system was oiled and active, it would have nothing to do with the change in government in terms of the Department;
  • The Portfolio of Evidence seemed as if it did not assist the Department, because the system was not oiled and active;
  • The DPSA and DPME should work closely with each other, sharing notes administratively, because they supported each other in one way or another.

She added that the operations of the two systems should be complementary to each other, because it was the same Parliament, and even if the previous Committee did not take issues seriously, they would not do the same.

The CD responded that, as the Members had stated, the administration carried on in spite of changes in positions. The posts remained the same administratively, even if the Ministers changed. However, it was unfortunate that when it came to statements, reviews and performance management, the change in people affected the system, and it would be naïve to say that it did not. Whether the Minister was there or not, the administration had to carry on and the process had to continue.

Dr Schreiber referred to non-compliance involving DGs and HoDs, and asked if those people were still in their offices or whether they had been moved to new Departments. If the statistics were accurate, as he believed, and there was a separation between the political goings-on and the public service and administration, why should the 77% non-compliance not be investigated?

The CD said there had been no condonation of the ones that did not comply, and their evaluations would still be done irrespective of their underperformance or whether they qualified for any awards or not, because everybody had to be evaluated. In the past when someone did not qualify, they were just ignored, but things would not be done that way anymore.

Mr Maneli said it seemed to him that the Department had moved to the point politically where there was no understanding of the priorities, and it did not matter what happened. How did the Department assess if the system was not working?

Mr Malatsi said South Africa had a national government, so the interactions should be the same in respect of the priorities when it came to senior service personnel and their movements. He wanted clarity on what the CD had alluded to regarding the movement of people during the election period. What had been the contributing factors to the movement of DGs and HoDs? Each time there was an election and the government was reconfigured, the Minister moved and the DG followed immediately or shortly after, and this disrupted the flow of making a positive assessment of an individual.

The CD responded that Ministers were partners in the whole performance management system, and as officials, they did not have any hold over Ministers in respect of holding them accountable, but the reports were provided to the President and executive with the understanding that they would be held accountable by these higher bodies.

In his opinion, the difference between the performance agreements and the methods of reviews was the practice and culture where they had done performance management just to keep things going, and they were still managing the difference for corruption checks. Using himself as a case study, he said that while other people talked about surprises, there should not be a surprise in his annual assessment because he should be able to know what his performance was like. Taking them back to what happened in the change in government, he said that they had to decide as evaluation panels how to manage the processes in remunerations.

He said they were also reducing bonuses in terms of performance assessments, so bonuses were not going to be an issue any more. One thing that had happened with the penalised DGs was that when they were appointed in their tenures, no adjustments had been made to their cost of living, so in a real sense their salaries had reduced over time. The Department was thinking of reviewing this.

The agreement between the Ministers and the President was due before the end of October, and that would give the Department enough time to make a concrete decision on the DG’s files by the end of October/early November. The Department did not know what the agreement was going to contain, or what the priorities were, but the Department would find ways of building the DGs’ files around them.

Regarding the movement of DGs, the change to the new PMDS had been technically driven by the fact that they had a PSC and various reports on the investigations of the DGs. He revealed that at some stage the average period of service of a DG had been below three years, but that had been slowly coming up.

On the difference in the systems, he explained that the DPSA was responsible for defining the policy of the PMDS, while the DPME was responsible for the implementation of the policy. The DG was responsible for convening general meetings, and the Department was working very closely on these roles.

Committee reports and minutes

During consideration of the Committee report, minor amendments and typographical corrections were put forward for a few budget votes. Ms Lesoma moved the adoption of the votes to be presented to Parliament, and was seconded by Ms Kibi.

The minutes of the meeting of 4 September were moved for adoption by Ms Maluleke and seconded by Ms Clarke. A few corrections were made to the minutes of 7 September, after which Ms Kibi moved their adoption, seconded by Mr Malatsi.

The meeting was adjourned.

