The Department of Performance Monitoring and Evaluation (DPME) gave a short summary of its three programmes to the Committee. All monitoring findings were presented annually to the Cabinet and to the responsible national departments. A Citizen Based Monitoring (CBM) framework had been approved by the Cabinet in 2013. CBM supported government departments in strengthening the citizen's voice in monitoring service delivery, and a number of programmes had been started as pilots. A hotline programme was also set up to provide a grievance mechanism for the public; about 190 000 complaints and queries had been received up to April 2014, of which 95% were resolved. The Front Line Service Delivery Monitoring (FSDM) programme aimed to strengthen the monitoring and evaluation practices of field-level managers, and to demonstrate the value of on-site monitoring of selected types of facilities or sectors. It was implemented jointly with all nine Offices of the Premiers; findings were presented to top management, and progress on implementation of the findings reports was tracked thereafter. Aspects of service delivery monitored included queue management and waiting times, complaints management, cleanliness and comfort, dignified treatment, location and access, visibility and signage, safety, and opening and closing times. The DPME had monitored busy facilities visited continuously by the public, such as schools, courts, hospitals and police stations, as well as municipal driving licence testing centres and municipal customer care centres. It had monitored 650 facilities and would be dealing with 90 more.
Score cards and reports, setting out the state of performance, were drawn up after each visit; some also included photographs and improvement plans. Monitoring results for 2014/15 would be presented by June 2015. Generally, improvements had been seen across all facilities, and some departments were now able to do their own monitoring. Challenges included that, although commitments were made, they were hindered by weaknesses in planning: managers had to be clear on whether plans were realistic, whether regular data was collected and whether trend analyses were done. The DPME had established excellent relationships with senior management, although it had to focus on only a few departments at a time.
The second presentation looked at the Annual Development Indicators of the DPME. There had been a pilot project, which was to be expanded over the next twelve months. In August 2013 Cabinet approved the "Framework for Strengthening Citizen/Government Partnerships for Monitoring Frontline Service Delivery", and also approved that government departments now had to adjust their monitoring and evaluation frameworks to include a mechanism for incorporating the views and experiences of citizens. An annual progress report on the piloting of CBM was submitted to the Cabinet. Essentially, CBM involved focusing on the experiences of ordinary citizens, public accountability and service delivery. The work was spread across programmes looking into research and knowledge sharing, development of policy instruments, and turning data into change. The time lines and progress were described: 12 000 monitoring exercises had been carried out, with surveys conducted in the local languages. Where there were perceptions about a certain facility, the management of that facility had to respond to the perception in order to put people at ease. In the third (current) phase there would be automated report generation, exploration with the Department of Cooperative Governance and Traditional Affairs, evaluation of the pilots and formulation of the five-year strategy.
Members asked if the DPME had any feedback from the visit by the Minister of Public Service and Administration, asked if this body was facilitating reports, and what actions departments were taking. They asked for a list of institutions where monitoring had been done. They raised their concern regarding the criteria used in recruiting staff, and questioned the involvement of the Department with local government. They asked if the DPME was working with the Public Administration Leadership and Management Academy (PALAMA), how its findings were perceived, and what the time frames were. They asked how the DPME dealt with citizens who were fearful of laying complaints, what would be done if any problem escalated, and whether those trained received any commendation. They questioned whether there were different budgets for each department visited, how the selection worked, and the long time frames before results were presented to Parliament. They asked for lists of facilities visited, whether there was any involvement with the Thusong Centres, and whether social media had been used.
Members adopted a series of minutes dating back to July 2014, with no amendments.
Citizen-Based monitoring (CBM) on Front Line Service Delivery Monitoring: Implementation and reporting: Department of Performance Monitoring and Evaluation briefing
The meeting commenced, after a closed session, with a presentation by Ms Bernadette Leon, Head of Frontline Service Delivery Monitoring Programme, Department of Performance Monitoring and Evaluation, on the three programmes run by the Department (or DPME) as part of citizen-based monitoring (CBM). These were:
- Unannounced monitoring visits
- Citizen Based Monitoring
- Presidential monitoring (which would not be discussed in the current presentation)
Ms Leon explained that the unannounced monitoring visits were aimed at improving government departments' self-monitoring of the quality of front-line service delivery. All monitoring findings were presented annually to the Cabinet, as well as to the responsible national departments. She stated that from June 2011 to date, 650 facilities had been monitored, and 77 had also been re-monitored to track improvements.
The framework for Citizen Based Monitoring (CBM) had been approved by the Cabinet in 2013. CBM supported government departments in strengthening the citizen's voice in monitoring service delivery. The piloting of this programme started in 2013 with, among others, the South African Police Service, the Department of Social Development and the South African Social Security Agency. A Presidential Hotline was created to provide a grievance mechanism for the public. The hotline programme provided monitoring data on sector and geographical trends for the complaints received. About 190 000 complaints and queries were received up to April 2014, of which around 95% were resolved.
Ms Leon stated that the Front Line Service Delivery Monitoring (FSDM) programme aimed to strengthen the monitoring and evaluation practices of field-level managers. The intention of the FSDM was to demonstrate the value of on-site monitoring of selected types of facilities or sectors. The programme was implemented jointly with all nine Offices of the Premiers, and monitoring findings were presented to the top management of the facility; since FSDM anticipated that findings would not always be acted upon, it was necessary to revisit facilities to track progress. FSDM was also implemented to demonstrate the value of collecting monitoring information from different sources, such as staff and independent monitors, and to demonstrate how evidence collected at facility level could be used to catalyse improvements.
FSDM would design and maintain the monitoring tools, monitor protocols, analyse findings and report to the Cabinet and to national sector departments. FSDM also conducted the monitoring visits and tracked the improvements arising from them. The Offices of the Premiers conducted joint monitoring with the DPME, and the findings of these monitoring sessions were presented to the relevant Heads of Departments and Members of Executive Councils.
She explained that there were eight service delivery performance areas monitored by FSDM. These were: queue management and waiting times, complaints management, cleanliness and comfort, dignified treatment, location and access, visibility and signage, safety, and opening and closing times. Frontline service delivery sites that the DPME monitored included SASSA offices, schools, Department of Home Affairs offices, hospitals and clinics, courts, police stations, driving licence testing centres and municipal customer care centres.
There were currently 650 frontline facilities in the programme, and 90 new facilities would be monitored annually by FSDM in the future. For each facility a score card and report was produced and presented annually to the facility management, the responsible national department and the Cabinet. The score card was drawn up to enable the portfolio committees and national departments to understand the state of performance at a specific level, and it managed the risk that averaged-out performance could hide finer-level data. Improvement plans were also produced, in order to facilitate problem-solving, assign responsibilities and allow tracking of agreed improvements, and photographs taken at the facilities were included as evidence to support the findings. Ms Leon informed the Committee that the 2014/15 improvement monitoring results would be published by June 2015; these would cover 123 re-monitored facilities. In the DPME's four years of implementing the programme, FSDM had contributed to a stronger strategic focus on the performance of frontline facilities. Ms Leon stated that most of the departments being monitored were improving their own planning and monitoring of facilities.
Ms Leon informed the Committee with pride that there were improvements in the front-line services of the monitored facilities: the Department's average ratings across all eight types of facilities had improved. As a result of these improvements, some departments, such as the Department of Health (DoH) and the Department of Justice and Constitutional Development (DOJ&CD), were now doing their own monitoring.
In the 2009-2013 outcomes, commitments were made to improve conditions in the frontline facilities, but these commitments were hindered by weaknesses in the planning and monitoring of the programme. In strengthening facility-level planning and monitoring, service delivery managers had to ask themselves whether there were realistic facility-level standards for cleanliness or waiting times, and whether facilities were monitored against these standards. Managers also had to ask whether the regional office and head office collected regular data from facilities and did trend analysis, and whether that data informed the budget.
Ms Leon informed the Committee that since the inception of the programme, FSDM had been visible on the ground in five provinces within four months, and in all the provinces within 12 months. The Department was able to present its first findings report to the Cabinet by January 2012. It had also established strong working relationships with, and gained access to, the senior management of key national departments such as the South African Police Service (SAPS) and the DoH. Another achievement was the Department's contribution to building the profile, value and visibility of monitoring and evaluation and of the work of the Department.
Building on this success, Ms Leon stated that the Department had to keep its focus on a few facilities to demonstrate impact: the DPME faced the challenge of being pushed to do more monitoring while at the same time needing to show how the monitoring would lead to improvement, which required a more targeted focus on a few facilities. The Department had to look at how to make its data more useful to more people and how to achieve better results for increased impact on provincial and national stakeholders.
Annual Development Indicators briefing
Mr Jonathan Timm, Director of Citizen Based Monitoring (CBM), DPME, started his presentation by giving the Committee an overview of policy commitments to CBM, which included a review of the pilot project for CBM and expanding the pilot over the next 12 months. Mr Timm informed the Committee that in August 2013 Cabinet approved the framework for CBM: 'A Framework for Strengthening Citizen/Government Partnerships for Monitoring Frontline Service Delivery'. Cabinet also approved that government departments involved in service delivery to the public had to adjust their monitoring and evaluation frameworks to include a mechanism for incorporating the views and experiences of citizens. An annual progress report on the piloting of CBM was submitted to the Cabinet. CBM was defined as an approach to monitoring government performance that focused on the experience of ordinary citizens in order to strengthen public accountability and drive service delivery. CBM also required citizens to be active participants in shaping what was monitored, how the monitoring was done, and what interpretations and actions were derived from the data. All this meant that monitoring had to be used at the point where it was gathered.
Mr Timm stated that the DPME programme was arranged into three areas. The first was CBM approaches, which involved research and knowledge sharing on approaches and lessons for citizen-based monitoring; this area also supported the development of policy instruments for improved accountability. The second area was concerned with turning data into change, by understanding how to support specific local accountability mechanisms for improved service delivery. The last area was building the conversation, by creating platforms for dialogue between organised civil society and the state.
Mr Timm further informed the Committee that since October 2013 the timeline for the pilot had had four phases. The first phase was the start of piloting, which began at two sites, one of which was Phuthaditjhaba. The second phase was the initiation of the pilot in three provinces (Limpopo, North West and Gauteng). Four more sites in four provinces were initiated in the third phase, due to end in September 2015. The Department was now preparing for the fourth phase, which was to evaluate and scale up what worked. In phase 2, the CBM piloting had used a model based on a survey of 100 community members, and the results of the survey were shared with management, staff and community groups.
12 000 monitoring exercises were conducted in the first two phases at 18 facilities in five communities, and reports were produced for each facility. The surveys included questions such as what improvements people would like to see at the facility, and there was in-depth engagement with staff, management and community representatives. The surveys were conducted in the local languages. Where there were perceptions about a certain facility, the management of that facility had to respond to the perception in order to put people at ease.
Planned activities in phase 3 of the piloting included automated report generation, exploration of CBM with the Department of Cooperative Governance and Traditional Affairs (COGTA), evaluation of the pilot and formulation of a five-year strategy.
The Chairperson stated that the Minister of Public Service and Administration had visited frontline monitors during the public service month, but the Committee had not yet received the report on that visit, and she wanted to know if the Department could shed any light on the challenges that had been presented to the Minister.
Ms Leon informed the Committee that the Minister had chosen to visit the frontline alone, and the DPME also had not received the report.
Mr J McGluwa (DA) stated that there were so many tools to conduct monitoring that it seemed to him that this body was gathering intelligence, while it was surely supposed to facilitate reports. He wanted to know what actions the relevant department would take where there were negative results in the DPME findings.
He further stated that the surveys done by the FSDM were good, and he requested a list of the police stations where the surveys were conducted. He also requested that the Committee be given the list of the 650 facilities that were monitored, and the reports from the monitoring.
Ms Leon promised to give the Committee the lists requested. She also informed the Committee that DPME was unable to instruct the departments on how to deal with the findings, but focused on relationships.
Mr Timm informed the Committee that the DPME had received a low response rate, and the responses received were very negative.
Mr M Cardo (DA) wanted to know if the Department was working with the Public Administration Leadership and Management Academy (PALAMA) and, if so, what phase this had reached, as the Committee might be able to give recommendations. He wanted to know if some of the DPME's findings were seen as criticism. Mr Cardo also wanted to know the time frame of the pilot.
Ms Leon stated that the Department did not make the findings of CBM public; if it did that it would tend to impact on the results of the surveys.
Mr Timm stated that the time frame for phase 3 ran up to September 2015, as phases 1 and 2 were already completed. He informed the Committee that the Department had identified a need for better communication within government.
Mr S Motau (DA) wanted to know how the DPME would deal with citizens who were afraid to lay complaints against staff of the facilities that were monitored.
Ms M Mente-Nqweniso (EFF) wanted clarity as to what measures were taken if there was an escalation of complaints. She also wanted to know if the Department gave appreciation to staff who had worked hard.
Mr Timm stated that the Department gave the recruits certificates so that they could add the training to their résumés.
Mr A van der Westhuizen (DA) requested the average costs of each visit done by DPME. He wanted to know if school governing bodies were also part of "management" to whom the DPME gave finding reports. They had not been mentioned in this presentation. He queried also whether the Department differentiated the way it conducted its visits, according to facility categories.
Ms Leon informed the Committee that there was no budget specifically set aside for each visit. The Department had R10 million a year for the whole project, which included salary payments. She further informed the Committee that as a result of this limited budget the DPME would not be able to re-monitor some of the 650 facilities that it had monitored earlier. The Department had also used donor funding and had outsourced monitoring to Black Sash. She stated that the criteria used in selecting facilities to monitor included looking at facilities that citizens visited on virtually a daily basis.
Mr M Ntombela (ANC) raised his concern about the time frames, from the time the survey was first done until the time the report reached Parliament, which was quite a long period.
Ms R Lesoma (ANC) appreciated the pilot results. She pointed out that the Department must not only look at the cleanliness of the facilities but must also look at whether the facilities were user-friendly and accessible. She also stated that the DPME had to form active relationships with other government departments.
Mr S Mncwabe (NFP) also expressed appreciation for the report. He asked for a list of the facilities that were visited by the Department, as it would assist the Committee when it conducted its oversight, to see the improvements in the places that were monitored.
The Chairperson wanted to know what relationship the DPME had with the Department of Communications and whether there was duplication of duties, as it seemed as if both departments were doing the same work. She also wanted to know if the job that the DPME was doing was sustainable. She asked what criteria the Department was using in recruiting graduate employees. The Chairperson also wanted to know more about the involvement of the DPME with the Thusong Centres.
Ms Leon stated that if the facility the DPME was visiting was in a Thusong Centre, it would take the time also to visit the Centre itself. Its experience had been that there was often a conflict, and hence problems, in the relationship between the owner and the facility managers.
The Chairperson informed the Department that the Thusong Centres the Committee had visited were not up to standard.
Mr Timm reported that in the first phase, the criteria the DPME used for recruiting were to take matriculated recruits, through community work programmes. In the second phase it worked with unemployed recruits, with whom the DPME was put into contact through different stakeholders. All the teams it worked with were local to each site. The DPME made it clear to the recruits that they should not necessarily expect to be employed for the whole survey.
The Chairperson raised her concern that she had the feeling that the DPME did not want to get involved in job creation, as the Department was outsourcing the monitoring programme to Black Sash. She wanted to know if the DPME had considered using social media when conducting its surveys. She also asked if the Department would consider going to places where there were service delivery protests.
Mr Timm informed the Committee that the DPME could not facilitate the conversation in those service delivery protests, as the DPME's programme was not a quick fix but was about building relationships between the facilities and the people involved. In relation to social media, Mr Timm explained that it created a potential trap as there were still teething problems, and the DPME had to be very careful, when using any technology, that it was entirely secure and there was no likelihood of devices being stolen.
Mr M Dirks (ANC) wanted to know the Department's vision regarding local government, and if the Department had ever consulted the South African Local Government Association (SALGA).
Ms Leon stated that for two years the Department had been working with local government and had visited the driving licence centres, as these centres were part of the municipalities.
Mr Stanley Ntakumba, Deputy Director General, DPME, informed the Committee that the Department had close engagement with local government. He added that although only brief answers had been given now, the DPME would be looking into the points raised in more detail and would answer in full.
The Chairperson stated that what the DPME was doing was work in progress. She said that the Committee had to call the National Treasury in to explain the budget cuts. She thanked the delegates for their presentations.
Adoption of Outstanding Minutes
Minutes of the following meetings were adopted, without any corrections:
25 June, 02 July, 19 July, 20 August, 10 September, 17 September, 21 October (subject to a request by a Member to check his presence), and 5 and 11 and 19 November 2014, and 18 February 2015
In respect of the minutes of 22 October 2014, Ms Mente-Nqweniso wanted clarity as the delegates had not responded to the issues that were raised by the Committee.
The Chairperson stated that all outstanding issues had been taken to the sub-committee, so that the sub-committee could process them. Attendance at that meeting had been quite good.
The Chairperson said that in future, minutes would need to be adopted more regularly. She thought the Committee had done well in the past term in following its strategic plan and conducting an oversight visit. She informed the Members that the report on the oversight visit would be tabled at the next Portfolio Committee meeting, and that the draft would be sent prior to the meeting.
Mr McGluwa suggested that in future the Committee should conduct some unannounced oversight visits to see what was really happening, rather than getting sanitised reports prepared in advance.
The Chairperson agreed and adjourned the meeting.