Health facilities inspection findings: Office of Health Standards Compliance 2016/17 report

Health

05 June 2018
Chairperson: Ms M Dunjwa (ANC)

Meeting Summary

The Portfolio Committee met to discuss the annual inspection report of the Office of Health Standards Compliance (OHSC), which described some of the challenges faced by health establishments (HEs) in the provinces, municipalities and districts across the country.

The presentation focused on the aims of the annual inspection report, the background, and the methodology used to collect and analyse data, leading to the findings and additional inspections. Comparisons were drawn between the findings of 2016/17 and those of 2014/15 and 2015/16 in order to determine if there was an emerging trend. The report was based on opportunistic inspections of health facilities in all the provinces, so the results and findings could not be extrapolated to the entirety of the HEs in a particular province or district. Cluster samplings were conducted at HEs which were considered high-risk and poor-performing. The extent of sampling was constrained by budget availability, the distances between facilities, and staff availability at the time of inspections.

During 2016/17, 538 clinics out of 3 167, 56 community health centres (CHCs) out of 324, and 55 hospitals out of 325 had been targeted. This amounted to 649 targets out of 3 816 HEs, or 17% of the total. The number of inspections had varied in the different provinces, depending on the number of HEs, with the Eastern Cape having the most inspections (142) and the Northern Cape the least (29).

The OHSC provided extensive details of the criteria against which the facilities were rated, and ranked the performances achieved in the various provinces.

A total of 201 HEs – 12 hospitals, seven CHCs and 182 clinics – had been re-inspected in 2016/17, and differences in their performances had been recorded. Common reasons for non-compliance or lack of improvement included records reflecting the action taken on incidents of staff abuse on a patient not being available, forms used for informed consent not being completed correctly, no security personnel at the gate/entrance, a patient referral policy not being available, clinical audits of priority programmes not being conducted, and quality improvement plans on health initiatives not being implemented, among others.

The OHSC put forward recommendations that could assist in improving performances of HEs across the country.

Committee Members were concerned about the state of the health sector in the country. They asked questions about the disposition of the OHSC towards the integration of information from various media outlets; the values and attitudes of patients and healthcare providers; the protection of patients’ dignity and respect; remedial actions taken by the OHSC to solve identified challenges; cooperation between the OHSC, communities and all levels of health establishments in the country; specific names of inspected facilities; action plans of the OHSC to improve performance of primary healthcare facilities; the difference between ideal and compliant clinics; the differences in performance of facilities managed by provinces and municipalities; as well as inconsistencies in the number of facilities inspected in different provinces and districts.

Meeting report

Chairperson’s introduction


The Chairperson spoke about the importance of effective healthcare delivery to the people. While Committee Members performed legislative duties, officials of the Office of Health Standards Compliance (OHSC) were the foot soldiers who performed the actual work in the field. She urged the OHSC to avoid a generalised presentation: it should be specific, targeting individual provinces, municipalities and districts in terms of the different health establishments (HEs). She had worked in the health sector and had considerable exposure to it, and she urged the OHSC to respond to questions posed by Committee Members in a clear manner. Committee Members were a group of health activists who had been exposed to the health sector, either by training or employment. The meeting would take a comprehensive and critical look at the report presented. The Committee aimed to get the active involvement of every stakeholder in the healthcare system, and the community should be empowered with adequate knowledge of their benefits in the healthcare system. She expected the OHSC to empower the Committee in preparation for the meeting to be held the next day, June 6.

Office of Health Standards Compliance: Annual Inspection Report

Dr Siphiwe Mndaweni, Chief Executive Officer (CEO): OHSC, said she would draw comparisons between the findings of the 2016/17 financial year with those of 2014/15 and 2015/16 in order to determine if there was an emerging trend.

Looking at the mandate of the OHSC as contained in Section 78 of the National Health Act, the objects of the Office were to promote and protect the health and safety of users of health services by monitoring and enforcing compliance by HEs with prescribed norms and standards, and ensuring the consideration, investigation and disposal of complaints relating to breaches of norms and standards. As of February 2018, the Minister had promulgated a series of norms and standards for HEs in the country that would come into effect in February 2019. The OHSC would verify the applicability of the norms and standards and advise the Minister on their implementation. It would also rate HEs as compliant or non-compliant, based on specified norms.

The OHSC was linked to National Health Insurance (NHI) to ensure universal health coverage. The OHSC, through the certification process, monitored the risk in HEs. From 2019, the OHSC would be able to certify HEs as compliant. This would enable more HEs to access the NHI fund, which would empower them to handle contracts dealing with various services in both the public and private sectors.

The CEO took the Committee through the methodology the OHSC used to collect and analyse data. It had eight teams that had conducted inspections in 2016/17. Each team had a team leader who co-ordinated the inspections, which were conducted in line with the National Core Standards (NCS). The NCS encompassed seven domains: patient rights; patient safety, clinical governance and clinical care; clinical support services; public health; leadership and corporate governance; operational management; and facilities and infrastructure. The first three (patient rights; patient safety, clinical governance and clinical care; and clinical support services) formed the core, while the last four (public health, leadership and corporate governance, operational management, and facilities and infrastructure) were supporting systems. These were required for any healthcare system to function efficiently and appropriately. This was in line with the dimensions of healthcare quality embodied in the NCS according to World Health Organisation (WHO) standards, which required healthcare systems to be efficient, effective, accessible, safe and patient-centred.

Sampling of HEs

Dr Mndaweni took the Committee through the processes involved in the selection of HEs for inspection. The report was based on opportunistic inspections of health facilities in a given province, so the results and findings were only indicative of what happened in a given province or district. Cluster samplings were conducted at HEs that were considered high-risk and poor-performing. Sampling was constrained by budget availability, the distance from one facility to the other, as well as the staff strength of the office at the time of inspections.

For 2016/17, 538 clinics out of 3 167, 56 Community Health Centres (CHCs) out of 324, and 55 hospitals out of 325 had been targeted. This gave a total of 649 targets out of 3 816 HEs, constituting 17% of the whole. The number of inspections had varied in the different provinces, depending on the number of HEs. There had been 142 in the Eastern Cape (EC), 42 in the Free State (FS), 56 in Gauteng (GP), 109 in KwaZulu-Natal (KZN), 98 in Limpopo (LP), 54 in Mpumalanga (MP), 29 in the Northern Cape (NC), 56 in North West (NW) and 53 in the Western Cape (WC).

Types of inspections

A total of 696 routine inspections had been conducted, surpassing the target of 649 for 2016/17. In addition, 204 public HEs that had scored below 50% were re-inspected as part of a follow-up process to monitor their compliance in terms of the Compliance Judgment Framework (CJF). Inspections were unannounced, and facility managers were informed only when inspectors were on site. The inspection process followed a logical Plan-Do-Check-Act (PDCA) cycle. Inspection data were validated according to the data quality control steps outlined in the Standard Operating Procedure (SOP) to ensure the validity of the inspection process.

Data Analysis

Data was captured using the District Health Information System (DHIS). The database was structured to allow analysis of domains, sub-domains, standards, criteria, measures and values, as well as aggregation of the values by province, district, sub-district, facility name and facility type. Values of the measures were structured as zero (0) for non-compliant measures, and one (1) for compliant measures.

Pre-determined weights were attached to the value of the measures. Weights were determined based on the risk level of the measures and structured as follows:

  • Extreme = 40%. This represented non-negotiable measures that guaranteed the safety of staff and patients to prevent death and maltreatment at HEs;
  • Vital = 30%. This represented measures that ensured the safety of patients and staff. This helped to reduce harm to the minimum;
  • Essential = 20%. These were measures considered to be fundamental to the provision of safe, decent and quality healthcare within the confines of the available resources;
  • Developmental = 10%. The measures were considered elemental to the quality of healthcare that helped to obtain optimal health.

Overall scores were determined by using the sum of the weighted compliant measures as the numerator, and sum of all weighted compliant and non-compliant measures as the denominator.
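
As an illustration of this weighting arithmetic, the short Python sketch below computes an overall score for a hypothetical set of measures. The measure examples and their values are assumptions for illustration only; the weights follow the risk levels described above, and the 0/1 values mirror how compliance was captured in the database.

    # Illustrative sketch of the OHSC weighted scoring described above.
    # Measures and their values are hypothetical; the weights follow the
    # risk levels in the report (extreme 40%, vital 30%, essential 20%,
    # developmental 10%).
    WEIGHTS = {"extreme": 0.40, "vital": 0.30, "essential": 0.20, "developmental": 0.10}

    # Each measure is (risk level, value): 1 = compliant, 0 = non-compliant.
    measures = [
        ("extreme", 1),        # e.g. emergency resuscitation policy in place
        ("extreme", 0),        # e.g. no security personnel at the entrance
        ("vital", 1),
        ("essential", 0),
        ("developmental", 1),
    ]

    def overall_score(measures):
        # Numerator: sum of weighted compliant measures.
        # Denominator: sum of all weighted measures, compliant and non-compliant.
        numerator = sum(WEIGHTS[level] * value for level, value in measures)
        denominator = sum(WEIGHTS[level] for level, _ in measures)
        return numerator / denominator

    print(f"Overall score: {overall_score(measures):.0%}")
    # With these hypothetical values: (0.4 + 0.3 + 0.1) / 1.4 = 0.57, i.e. 57%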

Performance scores by province

Dr Mndaweni took the Committee through the overall average scores of HEs inspected in all nine provinces in 2014/15, 2015/16 and 2016/17. Though performances had generally been below 60% in 2016/17, most provinces had shown improvements compared to 2015/16. The provinces with the lowest average scores were the Eastern Cape (43%), Limpopo (43%) and Mpumalanga (48%). Gauteng had scored 73% in 2014/15, 60% in 2015/16 and 61% in 2016/17. The Western Cape had scored 56% and KwaZulu-Natal 57%, which were close to the baseline (BL) score of 60%. The Free State and Northern Cape had declined in 2016/17 compared to 2014/15, but had shown improvements compared to 2015/16.

The OHSC believed that patients’ experience would change and the quality of healthcare would improve if all HEs took cognizance of the six ministerial priority areas. The six priority areas were assessed on the basis of the relevant sub-domains within the overall domains:

Availability of medicines and supplies

There had been significant improvements across all nine provinces in the past three years. FS, NC, WC, KZN and NW had improvements in 2016/17, but FS had fallen below the initial baseline of 58%. GP had remained almost static compared to 2015/16. NC appeared to have the most significant improvement in terms of availability of medicines and supplies, rising from 48% in 2014/15 to 57% in 2016/17. EC and LP had the lowest scores, hovering around 40%.

Cleanliness

Most provinces showed slight improvements in 2016/17. EC, FS, KZN, LP, NC and WC improved in 2016/17 compared to 2015/16, and EC, KZN, LP, NC and WC showed improvements with respect to the BL. It was worrisome that MP had declined in 2016/17 compared to 2015/16.

Patient Safety

This was a critical measure, as it concerned clinical management and performance, so performance here was particularly important. The BL had been set at 60%, and only GP had exceeded it. No significant improvement was recorded in any of the nine provinces. Compared to 2014/15, WC had remained static at 55%, KZN had declined from 57% to 55%, and MP had improved from 43% to 49%. It was noteworthy that MP had been static at 49% for the past two years. Patient safety was a priority area that needed improvement.

Infection prevention and control (IPC)

The importance of IPC could not be over-emphasised. None of the nine provinces had been performing well in this area. Only NW showed a slight improvement in 2016/17 compared to 2015/16, while the other provinces experienced a decline. IPC was a priority area of concern.

Values and attitudes

All provinces improved in 2016/17 compared to 2015/16. Though the sampling was not representative, it was encouraging that inspected HEs recorded improvements. Between 2014/15 and 2016/17, EC had moved from 50% to 56%; FS had gone from 62% to 55%; GP had slipped from 72% to 68%; KZN had improved from 58% to 67%; LP had risen from 42% to 51%; MP had gone from 49% to 58%; NC had jumped from 47% to 65%; NW had improved from 50% to 60%; and the WC had moved from 55% to 66%.

Waiting Times

Eight of the nine provinces had improved in this area. Improvement was encouraged in all HEs, since the sampling was opportunistic. Only MP had declined in this priority area.

Performance scores per domain for all levels of care

Dr Mndaweni discussed the sub-domains under each domain to give clarity on the specifics of what happened in each domain.

Domain 1: Patients’ Rights

Provinces recorded improvements, except for FS and GP, which declined when compared to their performance in 2014/15. MP had declined when compared to its performance in 2015/16.

Sub-domains

Access to a package of care (APC)

Based on provincial data, most HEs performed well. The APC was in accordance with national guidelines and licensing certification. The scores ranged from 98% (MP) to 91% (FS).

Access to information (AI)

The average sub-domain score was 62%. Provinces performed moderately well in terms of AI. The top three performing provinces were GP (73%), KZN (72%) and WC (71%). MP had the lowest average score of 59%.

Some of the identified deficiencies included incorrectly completed consent forms; the non-availability of policies and guidelines on informed consent; health professionals and providers observed not wearing name tags; patient rights posters not being displayed; no signage board at the entrance of some health establishments to indicate the times when various services were offered; and the non-availability of a help desk and signage to direct patients and visitors to key areas.

Emergency care

The average score across the provinces was 42%. The top three performing Provinces were GP (75%), WC (72%) and KZN (70%). LP had the lowest average score of 41%.

The identified deficiencies were the non-availability of procedures emphasising the speedy handover of patients to reduce handover time from Emergency Medical Services (EMS), and the non-availability of a policy on health establishment closures and the diversion of ambulances in the event of the closure of an HE. In certain cases, there were no appropriate security arrangements to receive patients, and the entrances were sometimes untidy.

Complaints management

The average score in this domain was 46%. Generally, provinces did not perform well in terms of complaints management.

The policy for complaints management was not available, the complaints procedure was not displayed, and complaints were not logged in the register. Also, the committee for reviewing complaints did not have terms of reference (TOR).

Continuity of care

The average score across nine provinces (33%) was very low. Continuity of care dealt with patients' referral, booking, the arrangement of transport, patients’ destination, and the map of catchment areas.

A list of service providers and their contact details was not available; policies, procedures and protocols on patient referrals and bookings were not available; most maps of the catchment areas were without the contact details of service providers in the referral chain; and the audited files of patients transferred into and out of the health establishment did not contain copies of referral letters.

Reducing delays in care

The average score across nine provinces (60%) was moderate. Only MP (56%) and NC (52%) fell below 60%.

Designated health professionals for triaging patients were not allocated; the OHSC could not identify special queues designated for specific groups of patients, such as the elderly and the vulnerable; systems for reducing delays in care were not in place; documents indicating the requirements for effective service delivery, including human resources and equipment, were not available; agreed-upon local targets or benchmarks for waiting times were not available; and patients were not informed of how long they would wait.

Respect and dignity

The average score across nine provinces (45%) was low.

Records describing actions taken in the event of an incident of staff abuse of patients, actual or alleged, were not available; consultation and counselling of patients did not take place in an appropriate area that ensured privacy and confidentiality; patient satisfaction survey reports were not available; and clean drinking water and disposable cups for patients in waiting areas were not available.

The OHSC had observed that 23 of the 25 inspected hospitals did not monitor waiting times for elective procedures. Eight hospitals did not have documentation regarding the handover of emergency patients from EMS.

Pictorial evidence showed that consultation and counselling were conducted in a manner which did not ensure the privacy and confidentiality of patients. In one instance, a procedure had been performed on a pregnant woman without the use of screens. There was no separation between paediatric wards and toilets (no privacy), monitoring of visitors was done in the waiting area, and a bathroom was used for storage. Toilets were sometimes not used for their intended purpose, and boxes could be found on the floor and on top of the patients’ toilet.

Domain 2: Patient safety, Clinical Governance and Clinical Care

This domain had special importance, as it pertained to areas with a high risk of litigation in terms of management. In 2016/17, GP had scored 64%, KZN 58% and WC 58%. While these scores were low, they were higher than those of the other provinces. GP had scored 78% in 2014/15 and 64% in 2016/17, and WC 62% in 2014/15 and 58% in 2016/17. Compared to their scores in 2014/15, FS had declined by 16% and GP by 14%. EC and LP constituted a concern because their scores were very low.

Sub-domains

Adverse events

Scores were generally low, except for GP with 53%. The overall average across all nine provinces was 30%.

Policies and procedures on the management of adverse events, clinical risks, reporting, and support for staff affected by adverse events were not available; the forum for reviewing clinical risks was not in place; and an annual in-service training plan that included training on how to carry out safety checks and prevent accidents in the environment was also not available.

Clinical leadership

The overall average across all nine Provinces was 59%. The average score per province was as follows: WC (74%), KZN (69%), GP (69%), MP (69%), LP (61%), EC (59%), NC (56%), NW (56%) and FS (55%).

There were no job descriptions of healthcare providers designated as operational managers or sectional heads, nor did health professionals initiate quality improvement and patient-centred quality care.

Clinical risk

The overall average across all nine provinces was 44%.

The policy for emergency resuscitation procedures and a forum for the review of resuscitations, including terms of reference, were not available; the procedure for patients with special needs and protocols for the safe administration of medication to patients were not available; clinical risk assessments were not done; and clinical audits of priority programmes/health initiatives were not conducted.

Infection prevention and control

There was a general improvement in 2016/17, but there were areas identified for improvement. The overall average across all nine provinces was 49%.

In hospitals, there were no isolation facilities for infectious and communicable diseases; policies, procedures and isolation facilities for patients with infectious and communicable diseases, including standard precautions, prevention and control, were not available; hand washing campaigns and audits were not conducted; TORs for the forum reviewing infection prevention and control were not in place; evidence of the monitoring of common healthcare-associated infections was not available; and educational material for patients and staff on respirator use and universal precautions for specific healthcare-associated infections, such as swine flu, cholera and methicillin-resistant Staphylococcus aureus (MRSA), was not available.

Patient care

The overall average across all nine provinces was 76%.

There was no evidence that morbidity and mortality were monitored. Though certain HEs complained about the inadequacy of their equipment, OHSC emphasised compliance with standards to ensure quality healthcare to the public.

Clinical management for improved health outcomes

The overall average score, 26%, was very low. Priority programmes or health initiatives were not monitored against the relevant targets.

Inspection reports showed that 11 health establishments had lacked procedures and precautionary measures required for vulnerable patients with special needs. Eight health establishments did not have systems in place to promptly identify adverse events, whereas 19 health establishments did not analyse the adverse event reports in order to manage the gaps identified. Lack of implementation of IPC programmes was noted in nine health establishments.

Pictorial evidence showed improper record keeping, incomplete signatures and improper handling of a suction bottle. There were situations where times were not recorded against entries, the qualifications of staff were not recorded, and the legal requirements of record keeping were not met. Emergency trolleys were also not equipped with all the required emergency equipment, so patient safety was not assured. Two benches had been pushed together to serve as a table for a clinic manager.

Domain 3: Clinical Support Services

The score for EC had been low for the past three years. FS, GP, KZN, NC and WC recorded slight improvements in 2016/17 compared to 2015/16. LP constituted a concern because the score was very low and there was no consistency in the trend.

Sub-domains

Clinical efficiency management

The overall average score in this sub-domain was 16%, which was very low.

Case management systems in HEs were inefficient, as audits were not conducted to ensure accurate billing, there was no evidence that managers coded according to Prescribed Minimum Benefits or that quality improvement plans were in place to address shortcomings in coding, and procedures to prevent patients’ medical aid funds from being exhausted, with costs incorrectly passed on to patients, were not in place.

Diagnostic services

The overall average score in this sub-domain was 78%, which demonstrated significant improvement in performance.

Radiology results requested were not available in patients’ files, radiation workers did not wear registered dosimeters, and X-Ray machines were not provided with a log book indicating quality control information on the device.

Pharmaceutical services

The overall average score in this sub-domain was 48%.

Standard operating procedures (SOPs) on the management of Schedule 5 and 6 medical supplies, the dispensing of medicines according to the Pharmacy Act, including after-hours access to medication, and the monitoring of adverse drug reactions were not available, and registers for Schedule 5 and 6 medicines were incorrect or incomplete. Medicines and medical supplies were not procured nor managed in compliance with relevant legislation and supply chain management (SCM) processes, and stock control systems, including stock-take reports for medicines and medical supplies, were not in place. The pharmacy and therapeutics committees in HEs were non-functional, and some committees operated without TORs. A duty roster indicating the availability of an appropriate healthcare provider (pharmacist/assistant/professional nurse) for dispensing medication according to the SOP during operating hours was not available.

Sterilisation services

The overall average score in this sub-domain was 31%.

The policy on sterilisation and decontamination was not available, nor had it been approved and reviewed by a relevant authority as required, and managers of sterilisation services were not appropriately qualified, experienced or competent for safe service delivery.

Health technology

The overall average score in this sub-domain was 28%, which was very low.

Medical devices were not maintained to ensure safety and availability; records for the maintenance of critical equipment and systems to monitor items for replacement/orders were not available; reports on adverse events involving medical equipment, as well as actions to prevent recurrence, were not available; nor was time allocated for orientation, staff development and in-service training programmes, including assessment and updating on the correct use of equipment.

Mortuary services

The overall average score in this sub-domain was 47%. GP scored 83%, KZN 80% and NW 58%, while FS had the lowest score.

Hospital mortuaries were not compliant with policy and legal requirements, as equipment was not regularly serviced or was not in working order. Mortuary staff did not put on protective clothing and masks. The appropriate temperature was not checked and maintained twice a day, as required. Policies on the transportation of corpses were not available in some of the HEs.

Therapeutic support services

The overall average score in this sub-domain was 32%.

Blood and blood products were available to support the level of care required, but there was no evidence that blood reactions were reported monthly to the Adverse Events Committee. Regular multi-disciplinary meetings were not held, attended and recorded by the full range of clinical support staff; there was no updated list of non-governmental organisations (NGOs) and disabled people’s organisations (DPOs); nor were there records of access to a social worker at HEs to ensure that patients requiring social support were assessed, treated and referred according to local clinical protocols.

Findings showed that 11 health establishments did not have a system for reporting and monitoring of adverse drug reactions. Also, 12 hospitals lacked access to effective blood and blood products.

Pictorial evidence showed that stock was not rotated as per policy to ensure that expired drugs were disposed of properly. Two autoclaves were not in working order, the ceiling of a radiology room had patches of water leakage, a medical supplies storage room was infested with cockroaches, some radiology machines were not in working order and were awaiting repair, and some had been decommissioned because they had stopped functioning in 2013. In one case, a basin of soapy water had been found together with medicines in the same medicine trolley. Schedule 5 medicines were not managed and controlled properly, and the issuing of medication was not witnessed.

Domain 4: Public Health

It was noticeable that most provinces had not performed well in this domain. GP had led other provinces in previous years (BL=68%), but was currently almost at the same level (48%) with other provinces.

Sub-domains

Health emergencies and disaster preparedness

The performances were low in all provinces, GP being the highest with 37% and EC the lowest with 14%.

Inter-sectoral plans for the management of potential health emergencies and disease outbreaks were neither available nor updated, annual disaster management plans were not available, not updated and not displayed. Staff were not knowledgeable on the disaster management plan, including health emergencies and their relevant roles in the plan.

Environmental controls

The average score across all nine provinces was 57%, and the performances of the provinces were commendable in this sub-domain. The scores ranged from 87% for WC to 44% for KZN.

A service level agreement (SLA) for the safe disposal of toxic chemicals, radioactive waste and expired medicines was not available. Where agreements were available, they were not monitored or reviewed as planned, and they did not include safe disposal of radioactive waste. Implementation of environmental controls limiting environmental damage and public health risk management was not available.

Health promotion and disease prevention

The average score across all nine provinces was 44%. The scores ranged from 69% for GP to 44% for EC.

Evidence of participation in health promotion activities was not available and health calendars for health promotion campaigns were not available.

Population-based planning

The average score across all nine provinces was 32%. The scores ranged from 45% for GP to 26% for FS.

A health establishment was not sign-posted on the access road, and minutes or correspondence to indicate a remedy to improve signage and road access was not available. Management had no understanding of the disease burden in the catchment population, and the health service plan for health outcomes and needs of the community was not available.

Domain 5: Leadership and Corporate Governance

OHSC measured the existence and functionality of governance structures, such as the hospital boards and clinic committees, and oversight on operational matters like service level agreements and so on. Most provinces had dropped from the BL, but EC and NW had improved, although off a low base.

Sub-domains

Communication and public relations

Most HEs did not have effective communication strategies. The overall average across all nine Provinces was 21%.

The policy for obtaining patients’ consent on the disclosure of identifiable information to third parties was not available. There was no communication strategy, neither was there evidence of communication channels or staff satisfaction surveys. Public relations were not well managed to provide accurate and appropriate information on the service rendered, or exceptions.

Effective leadership

The performance in effective leadership was generally low across all nine provinces, with the overall average being 34%. This was a concern, as leadership was an important factor driving quality in a healthcare system.

Exit interviews and action plans to address concerns raised by managers were not conducted and managers were not held accountable for implementing service delivery objectives, compliance requirements and performance reviews, as there were no performance management agreements in place. Senior managers did not have evidence of leadership and performance management assessments to support all levels of leadership development.

Quality improvement

The overall average score was 50% across all nine provinces.

Terms of reference for the forum reviewing quality, minutes indicating that quality aspects were regularly discussed and analysed, and evidence to show that actions had been taken to improve quality were not available.

Risk management

The overall average score was 20% across all nine provinces. MP had performed very well, with an average score of 80%, whereas NC showed no sign of performance (0%). The risk management strategy was not available.

Oversight and accountability

The overall average score was 44% across all nine provinces.

There was no evidence that the governance structure provided appropriate oversight to ensure quality, accountability and good management.

Strategic management

The scores across all nine provinces were generally low, the overall average being 20%.

The organograms were either not dated, outdated or not signed. The minutes of management meetings did not show that internal audit reports had been prepared or that they reflected what actually happened within the facilities. There was no evidence that operational plans were monitored on a quarterly basis against set targets, and no evidence of the requirements for finance and human resource indicators.

Human resource allocation did not ensure sufficient staff in terms of appropriate qualification, scope of practice and disciplines required for service delivery. Strategic and operational plans with clear objectives to support the delivery of services were not available and there was no evidence that findings of internal and external audits were considered. HEs did not have a risk management strategy to ensure risks were actively monitored, recorded and managed.

Nine hospitals were found to have inadequate leadership. Pictorial evidence showed that minutes of forums did not contain action taken to address identified challenges, patient satisfaction survey reports were not available and only raw data was provided, and a sub-district operational plan was out-dated (2012/13). There was no complaint management policy guideline, as the document was still in draft.

Domain 6: Operational management

This domain was closely linked with Domain 5. NC, NW and WC showed improvements compared to their performance in the 2015/16 financial year, but their average scores still fell below 50%. There was no consistency of trend in the performance of EC and LP over the past three years.

The sub-domains of operational management were information management, medical records, supply chain and asset, transport and fleet management, financial management, human resource management, as well as staff welfare and employee wellness.

Staff wellness was not actively promoted in 20 health establishments, and there was no efficient management of stock in three community health centres (CHCs).

Pictorial evidence showed inadequate filing systems and space for all records, leaking roofs in stores that posed a risk to the quality of items, and redundant furniture that was not properly managed.

Domain 7: Facilities and infrastructure

This domain covered buildings and grounds, food services, hygiene and cleanliness, linen and laundry, machinery and utilities, safety and security and waste management.

EC and MP Provinces showed year-to-year improvements, which was commendable. NC improved from 43% in 2014/15 to 50% in 2016/17 and WC improved from 57% in 2014/15 to 61% in 2016/17. LP remained static at 46% for the past two years.

Sub-domains

Buildings and grounds

Most of the buildings were old and infrastructure was not adequate. The overall average score across all nine provinces was 52%.

The available infrastructure was inadequate and not used as intended according to the original building plans; the layout of HEs did not facilitate a logical flow of patients and services; and the waiting areas provided inadequate shelter. The seating and space for patients were not adequate, and there was poor ventilation and lighting. Grounds were not maintained, and inspections to ensure adequate lighting for the safety and protection of the environment for staff, visitors and vehicles were not regularly conducted.

Food services

The overall average score across all nine provinces was 52%. GP had the highest score of 73%, while LP had the lowest score of 45%.

The service did not meet the required hygiene and environmental standards, as meals were not delivered to wards on appropriate trolleys, nor was there evidence of patients’ satisfaction with the presentation and quality of the food. Some kitchen equipment was not in working order. Processes for procurement, storage and preparation of food were inadequate.

Hygiene and cleanliness

The overall average score across all nine provinces was 39%. WC had the highest score of 66%, while LP had the lowest score of 36%.

Not all areas were kept clean, including critical public and patient care areas, nor were records of daily inspections of cleanliness and monthly pest control available. Cleaning machines were not regularly serviced. Notices prohibiting smoking were not displayed.

Linen and laundry

The overall average score across all nine provinces was 65%. NW had the highest score of 73%, while NC had the lowest score of 54%.

Linen rooms were observed to be used as storage cupboards, and some laundry machines were not in working order. Policies and procedures for handling linen were not available, nor were records of maintenance and servicing of laundry equipment. Stock-taking was not done and linen rooms were not locked.

Machinery and utilities

The overall average score across all nine provinces was 44%. WC had the highest score of 69%, while LP had the lowest score of 42%.

HEs had no documented evidence that critical clinical areas were supplied with emergency power without delay in the event of a disruption, including an electrical power logbook and inspection sheets, nor were there records of regular, functional piped medical gas and vacuum systems. HEs did not have a functional public communications system to ensure communication in the event of an emergency, including evacuation. There was no evidence that systems and installations were maintained, tested and inspected according to the regulations, nor were there policies and procedures for the maintenance and management of equipment and installations, or site and floor plans depicting the location and layout of the main utility services.

Safety and security

The overall average score across all nine provinces was 40%. GP had the highest score of 56%, while FS had the lowest score of 18%.

The policy on the security system for safeguarding buildings, patients, staff and visitors was neither in place nor up to date. Fire certificates for HEs’ compliance with regulations were not available, nor were safety and security notices displayed and promoted. Quarterly emergency drills were not conducted.

Waste management

The overall average score across all nine Provinces was 42%. KZN had the highest score of 71%, while MP had the lowest score of 38%.

Waste management policies and procedures were not in place. Up-to-date waste management plans and reports for Health Care Risk Waste (HCRW) were not available. General waste was not appropriately removed or stored, and was not transported timeously.

Maintenance of buildings was lacking in 16 health establishments. Nine health establishments did not actively protect people and property to minimise safety and security risks.

Pictorial evidence revealed damaged and collapsed ceilings, broken toilet seats and unappealing toilet conditions, a bath tub in a doctors’ room in an inadequate condition, cleaners working without protective clothing, improper disposal of waste, grounds that were not well maintained, waste being burned within the HE, posing a health risk, and a lack of consideration for access by wheelchair patients.

Additional inspections

According to the procedural regulations pertaining to the functioning of the OHSC, an inspector may at any time conduct an additional inspection, provided there were reasonable grounds to believe such an inspection was needed to establish whether non-compliance had been remedied. OHSC planned to re-inspect 35% of health establishments (HEs) that had scored 50% and below within a period of six months for both compliant and non-compliant measures.

A total of 201 health establishments (12 hospitals, seven CHCs and 182 clinics) were re-inspected in the 2016/17 financial year. There were disparities in the performance of HEs in relation to the time at which they were re-inspected. Some HEs were re-inspected within a period of six months and had either improved, declined or shown no change in their overall performance. Other HEs were re-inspected after twelve months, two years or even four years, and had either improved, declined or shown no change in their scores after the re-inspection.

An analysis of hospitals and CHCs’ dashboards gave an indication of what had contributed to the improvement, decline or no change in scores of HEs. It had also revealed extreme measures that needed to be addressed immediately as well as developmental measures such as waiting areas, staff or documents/policies and quality improvement plans that needed to be developed to address the gaps.

Overall summary of findings

Dr Mndaweni spoke about the number of public HEs that had been re-inspected in the 2016/17 FY. There had been 44 re-inspected in the EC, 35 in FS, 39 in GP, ten in KZN, 50 in LP, six in MP, one in NW, 15 in NC and ten in WC, totalling 201. In all, 182 clinics, seven CHCs, six district hospitals, three regional hospitals, two provincial tertiary hospitals and one central hospital had been re-inspected across all nine provinces.

Findings: Hospitals

She gave a comprehensive breakdown of the performances of individual hospitals that were re-inspected, using bar chart representation. The pictorial representation could be found in the annual inspection report of the OHSC, on page 65 (see attached document).

Hospitals that had declined following a re-inspection were not compliant with the following extreme measures:

  • Formal policy for handling emergency resuscitations was not available;
  • Records describing action taken in the event of an incident of staff abuse (actual or alleged) on a patient were not available;
  • Measures to prevent incidents of harm to staff were not in place;
  • There was no documented evidence of power supply available in critical clinical areas such as an intensive care unit (ICU), theatre, or accident and emergency facilities during a power disruption;
  • There were no reports documenting remedial actions taken in the event of an incident of harm to staff members;

Findings: CHCs

Dr Mndaweni also gave a comprehensive breakdown of the performances of individual CHCs that had been re-inspected using bar chart representation. The pictorial representation could be found in the annual inspection report of the OHSC, on page 67 (see attached document).

The OHSC would conduct more re-inspections to ensure CHCs complied with prescribed norms and standards. It was commendable that there were more CHCs with improvements than those which had declined or remained static.

CHCs that had declined following a re-inspection were not compliant with the following extreme measures:

  • A formal policy for handling emergency resuscitations was not available;
  • Measures were not in place to prevent any incident of harm to staff;
  • Reports on remedial actions taken in the event of an incident of harm to staff were not available;
  • There was no documented evidence of emergency power supply available in critical clinical areas, such as ICU, theatre, or accident and emergency facilities, in the event of a power disruption.

Findings: Clinics

There were improvements and declines in scores amongst the 182 re-inspected clinics, in relation to the time elapsed between the first and subsequent inspections.

The results of re-inspected clinics by province and district could be found in the annual report of the OHSC from page 70 to 105. Clinics had been re-inspected across all nine Provinces.

EC Province

In the EC, most of the clinics did not show improvement. There was a general decline, except at Paballong clinic, in Alfred Nzo District Municipality, which remained static at 28%. Bhisho Gateway clinic in Buffalo City Metropolitan Municipality demonstrated a commendable improvement of 11% between the first and second inspections that took place after three years. All but Elliot clinic, in Chris Hani District Municipality, showed a decline between the first and second inspection, conducted with an interval of four months.

There was no specific trend in the performance of clinics in Joe Gqabi District Municipality. Some clinics displayed improvement, whereas others declined following re-inspection.

In Oliver Tambo District Municipality, Buchele, Caguba, Ludalasi and Nkanunu clinics showed a decline in performance. On the other hand, Lujizweni, Majola, Mgwenyan and Phahlakazi clinics had improved. Re-inspections were conducted after three months.

FS Province

In Fezile Dabi District Municipality, only Deneysville and Qalabotjha clinics had declined. Other clinics demonstrated improved performances following re-inspection after three months.

In Lejweleputswa District Municipality, only K-Maile clinic had declined. Bophelong (Welkom) clinic remained static, while Bophelong (Odendaalsrus), Kgotsong, Phomolong (Hennenman), Rheederspark and Riebeeckstad clinics showed improved performances. Re-inspections had been conducted after four months in the above-mentioned facilities. However, re-inspections were conducted three times for Welkom clinic. There were improvements for the first two re-inspections, but the clinic had demonstrated a decline of 3% between the third and fourth inspections.

In Mangaung District Municipality, Harry Gwala (Botshabelo) and Winnie Mandela (Botshabelo) clinics had declined performances following the second inspection after 23 months. Bophelong (Botshabelo), Dr Pedro Memorial and Jazzman Mokhothu clinics had improved performances following the second inspection after 23 months. There was no specific trend in the performance of Thusong clinic following the first and second re-inspections.

In Thabo Mofutsanyane District Municipality, Eva Mota and Makhalane clinics had declined performances following the second inspection after four months. Marakong, Matsieng, Monontsha, Nthabiseng, Tebang, Thabang and Tshirela clinics had improved performances following the second inspection after four months.

Gauteng Province

In the City of Johannesburg Metropolitan Municipality, 80 Albert Street, Bezvalley, Jeppe Street, Malvern, Mayfair and Tshepisong clinics had declined performances following the second inspection after four months. However, Leondale, Princess and Weltevreden clinics had improved performances following the second inspection after four months.

In the City of Tshwane Metropolitan Municipality, only Soshanguve Block TT clinic had a declined performance following the second inspection after 38 months. However, Danville, Phahameng, Phedisong 1, Phedisong 6, Soshanguve 2, Soshanguve Block X and Ubuntu clinics had improved performances following the second inspection after 38 months.

In Sedibeng District Municipality, both Heideberg and Rensburg clinics had improvements, from 47% to 55%, and 41% to 49%, respectively, following re-inspection after five months.

In West Rand District Municipality, only Badirile clinic had declined performance following the second inspection after five months. All other clinics had improved performances following the second inspection after five months.

KZN Province

In iLembe District Municipality, only Maphumulo clinic had declined performance following the second inspection after four months. All other clinics had improved performances following the second inspection after four months.

In uMgungundlovu District Municipality, only Scottsville clinic was inspected. It had improvements from 45% to 59% following re-inspection after twelve months.

Limpopo Province

In Capricorn District Municipality, Makanye clinic had declined performance following the second inspection after three months. All other clinics had improved performances following the second inspection.

In Greater Sekhukhune District Municipality, Hlogotiou, Kwarielaagte, Mamone, Moutse East and Schoonoord clinics had declined performance following the second inspection. All other clinics, except Dichoeung clinic, had improved performances following the second inspection. The performance of Dichoeung clinic remained static after the second inspection.

In Mopani District Municipality, only Muhlaba clinic had a slightly declined performance following the second inspection. All other clinics, except Dan Village clinic, had improved performances following the second inspection. The performance of Dan Village clinic remained static after the second inspection.

All clinics re-inspected in Vhembe District Municipality had improved performances following the second inspection, except Mashau clinic, which remained static at 43%.

In Waterberg District Municipality, Bavaria, George Masebe Gateway, lp Sekgakgapeng and Rebone clinics had declined performance following the second inspection. All other clinics, except lp Segole clinic, had improved performances following the second inspection. The performance of lp Segole clinic had remained static after the second inspection.

MP Province

All clinics re-inspected in Ehlanzeni District Municipality had improved performances following the second inspection.

NW Province

Wolmaransstad clinic in Dr Kenneth Kaunda District Municipality had a declined performance of 6% following the second inspection.

NC Province

In John Taolo Gaetsewe District Municipality, only Churchill clinic had a declined performance following the second inspection after four months. Manyeding, Kuruman, Kathu and Bothetheletsa clinics had improved performances following the second inspection after four months. Maruping clinic had also demonstrated an improved performance after two re-inspections.

All clinics re-inspected in Namakwa District Municipality, except Churchill clinic, had improved performances following the second inspection.

In Pixley ka Seme District Municipality, the De Aar Town clinic was static in its performance between the first and second re-inspection, Hopetown clinic had declined in performance, while there was no specific trend in the performance of Montana clinic.

WC Province

In the City of Cape Town Metropolitan Municipality, Phumlani clinic had declined in performance, whereas Rocklands clinic had improved following re-inspection after four months.

In Overberg District Municipality, both Hawston and Stanford clinics had improved performances following re-inspection after twelve months.

In West Coast District Municipality, Diazville, Moorreesburg and Saldanha clinics had improved performances following re-inspection after three months, while Langebaan and Piketberg clinics had declined following re-inspection after three months.

Common reasons for non-compliance/lack of improvement

Dr Mndaweni provided the Committee with the main reasons why health facilities were not complying or not improving their performance. These included: records reflecting action taken on incidents of staff abuse of a patient were not available; forms used for informed consent were not completed correctly; there were no security personnel at the gate/entrance; the patient referral policy was not available; clinical audits of priority programmes were not conducted; and quality improvement plans on health initiatives were not implemented. Also, healthcare professionals interviewed had indicated that they did not have access to adequate supervision.

The policy for handling emergency resuscitation was not available. The protocol regarding safe administration of medicine for patients and children was not available. The emergency trolley was not properly stocked. Adverse event reports and the procedure to support staff affected by adverse events were not available. Reporting systems for needle stick injuries or other failures of standard precautions were not in place. The hand washing campaign was not implemented.

The SOP detailing how Schedule 5 and 6 medicines were stored, controlled and distributed was not available. Radiology results requested were not available in patients’ files. A system to monitor items requiring replacement or ordering was not in place. Maintenance records of medical equipment were not available. Reports showing adverse events involving medical equipment were not produced. Operational plans were not available. Records showing that staff with needle stick injuries had received post-exposure prophylaxis were not available.

Evidence showing that medical examinations had been performed on staff exposed to occupational hazards was not available. Records of needle stick injuries were not available. The standard operating procedures produced for the request, retrieval and filing of patients’ files were not signed. Safety hazards, such as collapsed ceilings, were observed in waiting areas. There was no security on duty on the day of the inspection. Daily inspections of cleanliness were not done. Records of pest control were not available. Linen was not proportionate to the requirements of the health establishment.

Summary Findings

Dr Mndaweni took the Committee through the average scores per province. The average scores were 61% for GP, 57% for KZN, 56% for WC, 50% for NC and NW, 48% for MP, 45% for FS, and 43% for EC and LP. Both EC and LP had generally had low scores throughout. Despite the fact that GP recorded an average score of 61% in the 2016/17 FY, the score was 14% lower than that of 2014/15 (75%). The National BL was set at 52%. Once the overall performance of an HE was determined, the OHSC used the Compliance Judgment Framework (CJF) to determine the compliance of the HE with prescribed norms and standards. The CJF considered the following:

  • Aggregate scoring awarded after each inspection
  • Clinical outcome of HEs
  • Consequences that may arise from non-compliance
  • Improved capacity of HEs

Dr Mndaweni explained the categorisation of scores based on the Framework, and the implications for the health facilities rated within the various categories, which ranged from compliance to non-compliance. The OHSC intended to implement CJF in conjunction with the promulgation that would take effect in February 2019 in terms of facility grading. This would help to certify a facility as compliant or non-compliant.

Most of the clinics and CHCs in EC were non-compliant and critically non-compliant.

Clinics in FS were mainly non-compliant and critically non-compliant.

In GP, three HEs had compliant status, 16 had compliant with requirement status, 30 had conditionally compliant status, 26 had conditionally compliant with serious concern status, 24 had non-compliant status, and three had critically non-compliant status.

In KZN no HEs had compliant status, eight had compliant with requirement status, 18 had conditionally compliant status, 42 had conditionally compliant with serious concern status, 31 had non-compliant status and four had critically non-compliant status.

In LP no HEs had compliant status, one had compliant with requirement status, two had conditionally compliant status, 23 had conditionally compliant with serious concern status, 65 had non-compliant status and 62 had critically non-compliant status.

In MP no HEs had compliant status, none had compliant with requirement status, four had conditionally compliant status, 16 had conditionally compliant with serious concern status, 13 had non-compliant status and 17 had critically non-compliant status.

In NC no HEs had compliant status, two had compliant with requirement status, five had conditionally compliant status, seven had conditionally compliant with serious concern status, 24 had non-compliant status and four had critically non-compliant status.

In NW, one HE had compliant status, one had compliant with requirement status, seven had conditionally compliant status, 20 had conditionally compliant with serious concern status, 17 had non-compliant status and 19 had critically non-compliant status.

In the WC, one HE had compliant status, four had compliant with requirement status, nine had conditionally compliant status, 20 had conditionally compliant with serious concern status, 22 had non-compliant status and three had critically non-compliant status.

Dr Mndaweni said the OHSC was making the following recommendations:

  • The National Department of Health to develop a National Quality Improvement framework and guidelines
  • The National Department of Health to develop quality indicators to monitor health establishments
  • Based on the OHSC inspection findings, Provinces to develop a provincial health service-based quality improvement programme and monitoring mechanisms
  • Continuous education and capacity building of managers with regards to internal quality improvement programmes in all health services.

Discussion

Dr S Thembekwayo (EFF) commended the presentation, but remarked that the health sector was in a shambles. She expressed concern about the additional re-inspection of 201 HEs. How did the OHSC integrate information from different media outlets into its plans? Did the OHSC have any plans for deviations? It should be flexible with its plans, especially with HEs that required urgent attention.

She also spoke about values and attitudes. Whose values and attitudes was the OHSC talking about? Was it referring to the attitudes of patients, or of the staff at HEs? She requested the OHSC to furnish the Committee with a questionnaire detailing the questions that had been put to the respondents.

She sought clarity on how patients’ dignity and respect were protected. She also asked about the OHSC’s monitoring and remedial action taken to address identified challenges. What did it want HEs to do with its findings? Did the OHSC have meetings with various stakeholders in HEs to discuss the findings and recommendations in order to ensure sustained improvements? It should ensure active participation at all levels to ensure that improvements were sustainable.

Dr P Maesela (ANC) complained about the inability of the OHSC to provide Committee Members with the specific names of the inspected HEs. He asked if the OHSC would make the results of inspections available to HEs.

Ms S Kopane (DA) commended the OHSC for its presentation. The presentation painted a picture of what happened in the health sector. The situation was worrisome, but she expressed hope that action could be taken to rectify the identified challenges. She sought clarity on the attitudes of HEs towards the OHSC officials during inspections. Had HEs cooperated with it during inspections? What happened to HEs that did not cooperate? She requested the OHSC to give a detailed description of each sub-domain. This would help Committee Members during assessments. What were the plans of the OHSC to improve HEs that performed poorly during inspection? Did the OHSC arrange meetings with HEs to discuss its recommendations? What were the action plans, costing and timeframe associated with the recommendations? Recommendations were not useful until they were implemented. Action plans gave hope that improvements could be expected. Most hospitals had performed better than clinics. What was the OHSC doing to assist HEs, especially clinics (primary health centres) in preparation for National Health Insurance (NHI)?

Mr A Mahlalela (ANC) commended the presentation. He sought clarity on the relationship between compliant and ideal clinics. He could not establish the correlation between the two, because of the disparity in the numbers. The number of ideal clinics was higher than the number of clinics that were considered compliant. Clarity on this matter would ensure that the Department and the OHSC were on the same page in terms of their reports.

To whom were the recommendations addressed? It appeared to him that they were addressed to the National Department of Health, whereas the challenges occurred at the provincial level in respect of human resources, financial management, supply chain management, procurement and so on. These were functions performed by provinces. Had the OHSC engaged with provincial authorities to find out if they had measures to address the identified challenges? Had it provided solutions to provincial authorities to address the identified problems? One of the roles of the OHSC was the provision of remedial actions. Remedial actions had the potential to improve compliance and reduce cases of poor performance. The OHSC would also be better placed to enforce compliance during subsequent inspections if it provided solutions to the problems it identified. He advised the OHSC to consider offering viable solutions, as some HEs had complained at a recent annual performance report meeting that the OHSC did not offer solutions to the challenges it identified.

He sought clarity on the summary of the findings in terms of the average scores. It was important to ensure that the scores were scientifically correct and reflected the actual situation in individual provinces.

What were the differences in clinics managed by provinces, and those managed by municipalities?

The Chairperson drew attention to the OHSC's inconsistency in respect of the number of HEs inspected in each province. It had inspected many HEs in certain provinces, while largely neglecting HEs in others. LP and MP were examples of provinces where only a handful of HEs had been inspected. She complained that no HEs had been inspected in the far east and west of the EC. Such gaps had to be addressed, especially as the OHSC prepared for the 2018/19 inspections and report. There had to be a well-defined pattern for inspections. She also sought clarity on the distinction between ideal and compliant clinics.

OHSC’s response

Mr Bafana Msibi, Executive Director: OHSC, started by responding to the questions asked by Dr Thembekwayo. On deviations, the OHSC was flexible in its operations. He said media reports, random sampling and clinical outcome data were the three criteria that guided inspections. Detailed information on this was contained on page 158 of the final annual performance report.

The OHSC engaged in discussions with HEs in provinces and districts. Mr Msibi addressed this together with the question on the attitude of HEs towards OHSC officials during inspections. So far, all inspected HEs had cooperated with the OHSC, except for one in KZN, and that inspection had continued unhindered after interventions by senior managers from the OHSC and the HE. Inspections were conducted unannounced, and facility managers were informed only once inspectors were on site. Inspection of district HEs usually took a day, while inspections of HEs across a province took a week. Inspections were conducted by a team of six inspectors headed by a Deputy Director, together with an administrative assistant who helped to gather the evidence obtained during inspections.

The CEOs of the health facilities usually instructed the operational managers and other relevant officers to participate in the inspection. Relevant officers were requested to sign, and the data collected was captured on the OHSC database. Preliminary results were usually available within 20 days, and were made available to managers of HEs in provinces and districts. The OHSC then advised managers of HEs to arrange a venue to discuss the findings. This event usually took a whole day, and helped HEs to learn from one another.

In the FS, the heads of departments (HODs) formed part of these meetings, and in KZN the HODs also participated. Some HEs also came with their facility managers. The OHSC showed them the evidence and took them through the findings, and district and provincial managers were also involved in certain cases. The OHSC tried to get the relevant people to sign off after the presentation of the results, and welcomed feedback from HEs within ten days. So far there had been no disagreements, except at a single facility, and the OHSC responded promptly to the matters raised.

In response to Mr Mahlalela's question on follow-up, Mr Msibi said that the OHSC served a compliance notice on HEs that had challenges. The compliance notice included the identified challenges and recommendations, and the HEs were expected to implement the recommendations within a stipulated timeframe. If they failed to do so, the OHSC was compelled to take enforcement action. No enforcement action had yet been taken, however, because the enforcement policy was still in the developmental phase. It would come into effect during subsequent inspections.

In response to Dr Maesela's question about reports to HEs, Mr Msibi said that HEs were given tools that helped them to conduct self-assessments in order to monitor their progress.

In response to Ms Kopane's question on the disparity in the performances of hospitals and clinics, Mr Msibi said that hospitals performed better than clinics because hospitals had dedicated staff who managed quality issues, whereas clinic staff usually had to perform multiple roles.

Mr Msibi gave a breakdown of the number of facilities inspected in relation to the actual denominators in each province and district, as set out on page 10 of the annual inspection report.

In response to the Chairperson's question on the disparity in the number of HEs inspected across the nine provinces, Mr Msibi said that not all districts or provinces had been inspected in 2016/17. HEs were usually given six to eight months to improve their performance before the next inspection, and were re-inspected immediately only if their status was critical. The facilities missing from the 2016/17 report might have been inspected in previous years.

In response to Mr Mahlalela's question on the number of HEs inspected in LP and MP, Mr Msibi said that most HEs in the two provinces had scored below 50%, which had resulted in more re-inspections.

Ms L Libese, Executive Director: Department of Health, responded to the questions on the correlation between ideal and compliant clinics, and on the questionnaires. She said that questionnaires were given to users during inspections, enabling them to indicate their satisfaction and challenges.

Mr Msibi promised to give a copy of the questionnaire to the Committee.

In response to the question on what the OHSC did to help HEs in preparation for NHI, Ms Libese said that the OHSC used the CJF. A copy of quality improvement plans was given to HEs ahead of OHSC inspections.

Regarding continuity, the OHSC met with the heads and staff of provincial departments to discuss recommendations and offer remedial actions. This helped to improve performance and the quality of health care provided to users. Measures were in place to provide remedial actions for the problems identified. Going forward, the OHSC would not only deal with reports, but would also consider observations of what went on in other provinces. This allowed for benchmarking across the country.

In response to the question on the protection of patients' dignity and respect, Ms Libese said that users were interviewed during inspections in order to gauge their satisfaction and challenges. The values and attitudes of users had to be balanced with those of the staff of HEs. Patients should be educated so that they knew which services they were entitled to at HEs.

Mr Msibi referred to the comparative performances of HEs managed by provinces and municipalities, and said that hospitals in districts and municipalities had improved their performance from the national viewpoint. However, municipal hospitals posed considerable challenges from the provincial viewpoint. Clinics in municipalities needed more attention than those managed by provinces.

In response to Mr Mahlalela’s question on ideal and compliant clinics, an official of the OHSC said that the office would examine the sustainability of ideal clinics. This would facilitate consistent improvements in HEs. The OHSC would work with the NDOH to establish the challenges facing ideal clinics. This would help to sustain the status of ideal clinics, which was the target on a national scale. She said there was a need to make a distinction between ideal and non-ideal clinics. The OHSC would coordinate with the NDOH and provincial departments to address identified challenges with a view to improving the performance of provincial HEs.

Dr Mndaweni responded to the questions on the patients' dignity and respect sub-domain, in terms of the disparity between the data and the pictorial evidence. While HEs had shown improvement in this sub-domain, pictorial evidence had been provided to highlight critical issues observed during inspections.

Further discussion

The Chairperson commented that all questions had been answered.

Mr Mahlalela sought clarity on specific factors responsible for deficiencies in each sub-domain.

Ms Kopane sought clarity on the roles of provinces and municipalities in the management of clinics. Constitutionally, provinces were supposed to manage clinics in municipalities, while municipalities had limited responsibilities. Deficiencies might arise if municipalities attempted to perform the roles of provinces.

Dr Mndaweni said that the OHSC encountered challenges, and it was putting measures in place to address the challenges.

The Chairperson spoke about the need to balance patients' attitudes towards staff with staff attitudes towards patients. She cited examples of situations where staff of HEs had been maltreated. The OHSC should also give questionnaires to staff of HEs to get a reflection of patients' attitudes towards them. She urged communities to take proper care of the HEs in their areas, as the HEs were the property of the communities. One of the ways communities could take care of HEs was to treat their staff with dignity.

She said that the OHSC had not given any examples of HEs that had been closed down due to non-compliance. The Committee would have closed down certain facilities for non-compliance, but it did not have the power to do so. She advised the OHSC to inspect and report on big facilities like Groote Schuur and Livingstone hospitals. The OHSC could then use the lessons learnt to empower HEs facing challenges during public meetings. Public HEs had to be managed properly to correct the impression that they were “places to die”.

The health sector was delicate, and people must be empowered to get appropriate healthcare. People in positions of leadership must own up when they make mistakes and steps must be taken to correct such mistakes. Parliament was set to coordinate with the NDOH to provide adequate leadership in the health sector.

The meeting was adjourned.
