The Department of Basic Education (DBE) briefed the Committee on the 2013 national senior certificate (NSC) results, and Umalusi reported on its quality assurance of the NSC.
DBE said that the majority of school subjects had already been moderated and were compliant with the 18-month cycle. It had constituted two separate panels of examiners for paper 1 and paper 2 subject exams. It had started preparing exemplar tasks for School Based Assessments (SBA) and Life Orientation (LO) that educators could use when assessing and presenting at school level. To that effect, it was planning to train subject advisers in a one-day intensive workshop so that they could pass on the skills and knowledge they would have gained to the large number of teachers they dealt with.
The Committee asked what standards would be applied in the accreditation of markers to ensure quality and competence. Could DBE clarify the issue of the 30% pass rate, specifically with regard to progression towards tertiary institutions? What were the actual problematic elements for markers when using rubrics, and what were DBE's interventions to address those elements?
DBE replied that there were criteria, outlined in policy for marker appointments, that considered qualification as well as experience. Per qualification, the marker needed to have done the subject at a second-year level of diploma or degree studies. Per experience, two years had to be grade 12 teaching experience, with an additional three years that could be from anywhere in the schooling system, totalling five years of teaching overall. DBE made sure that every marker application was validated and verified by the school principal and a district official.
The DBE said that the requirements for passing Matric had not declined over the years -- if anything, it had become more challenging to obtain the NSC than before. People also tended to confuse the pass requirement for a subject with the pass requirement for the entire NSC. Education commentators were misleading the nation in the 30% pass mark debate, and the state would possibly have to communicate more effectively about where the 30% came from. That percentage was over 150 years old. Having compared the country’s NSC with equivalent qualifications in other countries, Umalusi had found it to be broadly similar. In the old Senior Certificate, the pass mark per subject had been 33 1/3% at standard grade (SG) and 40% at higher grade (HG). The aggregate required for a pass was 720 points, which translated into three subjects passed at 30% and three subjects at 40%, with a provision that one could fail one subject and still progress to the next level. The 33 1/3% could also be converted to lower grade, to effect a pass at 25%. The educational commentators were not telling the public all of that. With the NSC there were no SGs and HGs, and the question papers were set in such a way that learners with those two older competencies could be accommodated in the new system. Previously, one could in theory pass a subject using that lower grade conversion provision. It therefore had never been, and currently was not, the case that the pass requirement for the NSC was 30%. Many commentators, because they had gone to university, confused varsity admission requirements with the pass requirements for the NSC, and these were not interchangeable.
Markers did have a common memorandum and were required to use that for marking, but what Umalusi had said was that when marking grade 12 scripts, it was never enough to simply apply the memorandum.
The Chairperson greeted and welcomed everyone. She said that because the Committee had another engagement that morning somewhere else, she was going to allow the meeting to start so that the Committee could get through both presentations by the time the meeting had to be adjourned. She asked that Members introduce themselves to the presenters and for the presenters to do likewise. After that she requested the presenters to take the Committee through the presentations.
Report on the Quality Assurance of the National Senior Certificate (NSC), Umalusi
Dr Mafu Rakometsi, Chief Executive Officer (CEO) of Umalusi, apologised for the absence of the Chairperson of the Umalusi Council, and said that it was the entity’s first interaction with the Committee since 2009. It was grateful for the invitation to present so that it could get inputs from public representatives. He then read through the introduction with the Committee.
Quality Assurance of the DBE 2013 National Senior Certificate Examination
Mr Emmanuel Sibanda, Acting Senior Manager, Quality Assurance of Assessments, Umalusi, took the Committee through the technical report. (See document)
There were specific concerns regarding the quality assurance of the NSC examination and assessment:
●There had been some improvement with regard to adherence to timeframes. However, there were subjects where improvements had to be made: 25 November and 25 March question papers were submitted only in May/June 2013 for first moderation. For the credibility of the NSC examination, it was vitally important that every effort was made to adhere to the agreed deadlines for setting and moderation of question papers, since any delays affected Umalusi in its quality assurance exercise.
●There had also been some improvements pertaining to the administration of SBA and the presentation of learner evidence of performance. Having said this, the following issues were found to be problematic:
- Internal moderation reports were generally not available.
- There was a lack of constructive feedback to learners after moderation.
- Teachers were still challenged regarding the development of tasks pitched at appropriate cognitive levels: the focus was more on the lower cognitive levels.
- Assessment of practical investigations, research projects, assignments and simulations still remained a major problem.
- The use and development of rubrics was problematic: descriptors were unrealisable and vague.
- Assessment of the Physical Education Task (PET) in Life Orientation continued to be a problem, with inflation of marks.
In general, Umalusi was pleased with the manner in which the 2013 NSC examination had been administered. It acknowledged that a number of technical irregularities had been reported, but these had been addressed in a fitting manner. Umalusi took the opportunity to express its appreciation to the national and provincial departments of education for their concerted effort in ensuring a credible examination, and to all the relevant stakeholders for the necessary support given in line with Umalusi's quality assurance initiatives.
Report on the National Senior Certificate – 2013, Department of Basic Education
Dr Bobby Soobrayan, Director General (DG), DBE, said that the Department would be presenting as in-depth an analysis as it could of the NSC results for 2013. Importantly, though, DBE wanted to present its interpretation of what the results meant in terms of all its interventions going forward. Additionally, the Department's presentation contained responses to some of the issues that had been raised by Umalusi.
Professor Rufus Poliah, Chief Director, National Assessments and Public Exams, DBE said that he would skim through the presentation as the Committee had a detailed technical report with the breakdown of the NSC results. He then read only from the beginning of the NSC section in the report.
Professor Poliah said that DBE looked at a detailed analysis of each question per subject when dealing with areas of weakness, and published a report on that, together with remedial action. That report would then be distributed to every school in the country so that the problems identified therein could be dealt with. DBE took Umalusi’s concerns very seriously. Bilateral meetings with Umalusi dealt with all the operational and strategic issues it had identified within DBE. Even though some of the issues seemed to be taking long to resolve, that was the nature of those issues, as it was not possible to attend to them in the space of one year. Concerning timeframes, the majority of DBE school subjects had already been moderated and were compliant with the 18-month cycle, with a few outstanding subjects that would be dealt with in the following cycle.
DBE had constituted two separate panels of examiners for paper 1 and paper 2 subject exams. Comments from schools and the public were incorporated into the setting of exams for the following year. It had undertaken an extensive and rigorous recruitment and training exercise for all its examiners, focusing on areas of taxonomy and preparing examiners for the Curriculum and Assessment Policy Statement (CAPS) that would be implemented in 2014. DBE had started preparing exemplar tasks for School Based Assessments (SBA) and Life Orientation (LO) that educators could use when assessing and presenting at school level. To that effect, it was planning to train subject advisors in a one-day intensive workshop so that they could pass on the skills and knowledge they would have gained to the large number of teachers they dealt with.
DBE had introduced a common assessment task for LO that had been running for the past two years. It looked at re-mark outcomes very closely, as those were clear indications of subjects where there may have been variances in terms of marking.
Mr Matanzima Mweli, Acting Deputy Director General, Curriculum, DBE, said that the intention was to bring the lower grades to the same standard that was obtainable at the Further Education and Training (FET) band. He then took the Committee through his section of the presentation.
The Chairperson welcomed the late Members, and some members of the public, to the meeting.
Mr W Faber (Northern Cape, DA) asked if it would be possible that some of his questions could get written responses, as he would have to leave before he could be replied to.
The Chairperson replied that it was procedural for responses to be written, if the Member asking would not be present when responses were being given.
Mr Faber commented that he had observed some improvement, as reported by DBE, but there still remained a lot more work, especially in the Northern Cape. He did not believe that markers were tested rigorously on their competency levels across the board. What standards would be applied in the accreditation of markers to ensure quality and competence? Were adjustments based on percentages? If so, that meant people could be promoted on the basis of those adjustments without necessarily having obtained the cognitive ability. Could DBE clarify that issue? The Northern Cape Education Member of the Executive Council (MEC) had said that their results were not that bad, but what DBE was presenting was a slightly different picture -- could DBE give clarity on that as well?
Ms B Mncube (Gauteng, ANC) said that when she had retired from the education sector, there had been talk of learners having to progress according to age cohorts, and of learners not being allowed to repeat twice within the same phase. Was that indeed the case? In her constituency she knew of a 17-year-old learner who, after repeating grade 10 twice, had been told to go and continue studies at an FET college. Some other pupils from Ennerdale Technical High School, who had not progressed from grade 10, had had their report cards withheld by the school, and had also been kicked out of school. When those pupils went to the colleges, they found that they could not register, as the criterion was that they had to enrol for grade 11. Additionally, there just was no space for more learners at those colleges. How did that impact the issue of retention, which DBE had been talking about? Could DBE clarify the issue of the 30% pass rate, specifically with regard to progression towards tertiary institutions? Was it still a requirement that one needed mathematics “proper” for admission into degree studies, or was it possible with a distinction in mathematical literacy? If so, did DBE have a mechanism to advise and guide both teachers and learners on the adverse effects that channelling learners towards mathematical literacy had on their future prospects? On which of the seven subjects that had been mentioned and identified was DBE training examiners? What was DBE’s intervention regarding the inflation of marks and their authenticity in LO and Physical Education? Did DBE have and apply standardised criteria in the appointment of exam markers?
Ms R Rasmeni (North West, ANC) asked whether markers were using their own common knowledge or memoranda when marking. What were the actual problematic elements for markers when using rubrics, and what were DBE's interventions to address those elements? What were DBE's interventions to attract more learners to study and practise mathematics? Were the matriculants that had been used in the study of the value of a Matric certificate working, or being trained? What did the success in the labour market that DBE was talking about translate to?
Ms D Rantho (Eastern Cape, ANC) reiterated a similar concern about the value of mathematical literacy and its usefulness in the future of a learner. The education MEC and the Provincial Executive Council (PEC) from her province had called the DBE to come and account for why the province was performing worse than any other province in NSC achievement. Her MEC had said that he did not understand why the province was at that level, because in comparison with other provinces, the Eastern Cape should have been number five, as its Bachelor degree passes had increased beyond most provinces. DBE provincial officials were still those officials from the former Transkei and Ciskei homelands, and that was worrying. Did DBE also consider its provincial backlogs when tallying the NSC results? Could DBE give clarity to some of her concerns, as she had not been satisfied by the MEC’s responses?
The Chairperson noted that Umalusi’s report had raised issues over the competence and relevance of markers, and DBE had been quiet on some of those issues. What had it done in response? In 2012, DBE had reported seven schools that had achieved zero passes in the NSC results. Had those schools managed to improve, or were they part of the nine that were in the underperforming group in the 2013 breakdown of schools’ performance by percentage category? She then added her voice to the mathematics and mathematical literacy debate. The latter had increasing enrolments, while the former was losing numbers -- could DBE comment on that?
Dr Rakometsi said that Umalusi had distributed a booklet that detailed the difference between a certificate and a statement of results, and which also featured the security features of both documents. The concerns that it had raised did not mean that the basic education system had failed entirely, but that the system needed to be strengthened to a certain level. The level of control was still high at the centres, to the extent that Umalusi had been able to vouch for the authenticity of the 2013 results. Umalusi would be deploying more monitors to the marker appointment centres, as well as more people to visit the memoranda discussions, and there would be permanent Umalusi monitors at the marking centres as well.
Education commentators were misleading the nation in the 30% pass mark debate, and the state would possibly have to communicate more effectively about where that 30% came from. That percentage was over 150 years old. Having compared the country’s NSC with equivalent qualifications in other countries, Umalusi had found it to be broadly similar. In the old Senior Certificate, the pass mark per subject had been 33 1/3% at standard grade (SG) and 40% at higher grade (HG). The aggregate required for a pass was 720 points, which translated into three subjects passed at 30% and three subjects at 40%, with a provision that one could fail one subject and still progress to the next level. That 33 1/3% could be converted to a lower grade, to effect a pass at 25%. The educational commentators were not telling the public all of that. With the NSC there were no SGs and HGs, and the question papers were set in such a way that learners with those two older competencies could be accommodated in the new system. The NSC was not a requirement only for tertiary studies: some learners simply studied towards it to become good citizens, whereas other people wanted to work immediately after matriculation. The commentators also did not say that the pass mark could not simply be increased at any time, because if it were raised to 50%, for example, the country would have to define what that mark would mean in terms of the cognitive capacity needed to attempt those questions. It could very well be that those questions would be low order or easy questions, where 50% would be attainable. Possibly that debate had worn itself out, because the Minister of Basic Education had set up a task team to deal with the issue, and its findings would be available soon. The proportion of learners who actually passed at exactly the minimum levels Dr Rakometsi had specified was very small, which called into question the significance of the debate.
When Umalusi standardised results, it accepted raw marks for LO. That was for historical reasons, because the subject had no exam, so Umalusi used the SBA marks. It had, of course, informed DBE that if the problem it had raised over the years was not attended to, it would be compelled to standardise even LO, and to look at other strategies to strengthen the subject so that it gained respect as it appeared on the NSC.
Ms Rantho commented that the template NSC in the booklet Umalusi had given to the Committee looked too much like the real thing, and that posed a potential risk for NSC fraud.
Dr Soobrayan said that DBE had been confronted with exactly what Ms Rantho was talking about a few years previously, but when DBE had checked the certificate, it had found it to be a seriously bad photocopy done on someone’s bad printer. The real certificate was quite different from the template certificate. There was a lot DBE could do, but employers also had to understand that one could not accept a piece of paper that clearly looked as if it had been photocopied ten times. There was also still a lot of education that DBE had to do on how to verify the authenticity of the NSC.
DBE said that the requirements for passing Matric had not declined over the years. If anything, it had become more challenging to obtain the NSC than before. People also tended to confuse the pass requirement for a subject with the pass requirement for the entire NSC. Previously, one could theoretically pass a subject using the lower grade conversion provision. Therefore it had never been, and currently was not, the case that the pass requirement for the NSC was 30%. Many commentators, because they had gone to university, confused varsity admission requirements with the pass requirements for the NSC, and these were not interchangeable. There was also a difference between minimum entry requirements and what universities actually expected in a particular year, because so many learners were getting seven distinctions at the time. Someone with the minimum entry requirements simply had the right to access a university, but not necessarily to study medicine. With specialist fields of study like engineering, more demands were placed on the type of NSC marks one came with from high school. All of that had become more onerous with the current NSC.
There was no such thing as percentage adjustments -- there were just upward or downward adjustments, and those were not standardised across the board. When adjustments were done, they did not change an individual’s chance of passing, even if that individual had failed, because they were based on different scales. A minus indicated a downward adjustment and a plus an upward one.
The Northern Cape had at one point been the best performing province in the NSC. Possibly that province needed to take stock of why that had changed, considering the socio-economic challenges that had spilled over into the education sector in the very recent past.
It was current policy that a learner was not allowed to repeat more than one year in a single phase, but that was only one section of the policy. The other section said that if a learner did not make it in a grade, the school had to have curriculum interventions to work with that learner, so that within that phase the learner could achieve the outcomes. That was what Outcomes Based Education (OBE) meant. The weakness then was not the policy, but its implementation at schools, because if a learner had to repeat a grade twice, surely there was something wrong either in the school or with the learner. DBE’s argument was that the learner could not be punished; rather, the school or the district had to focus on that learner to find out why the learner had had difficulties in the same area for two years successively. That was the responsibility of the education system, which was what Dr Mweli had been alluding to when speaking about the district meetings.
The mathematical literacy policy was an excellent one; its implementation, however, was problematic. The subject had been introduced to cater for individuals who in the past would never have been exposed to mathematics at all. The intention had never been to replace mathematics with mathematical literacy, and mathematical literacy was never intended to be standard grade mathematics. A DBE study had found that mathematics-competent learners were sometimes doing mathematical literacy, with the opposite happening with mathematical literacy-competent learners. If the state decided to scrap mathematical literacy, it would be depriving itself of an important policy initiative, as everyone needed to be mathematically literate. That issue was being addressed in the district meetings between the Minister and the district heads.
Markers did have a common memorandum, and were required to use that for marking, but what Umalusi probably had been saying was that when marking grade 12 scripts, it was never enough to simply apply the memorandum.
When referring to NSC passes according to types of qualification, the Committee could see that the Eastern Cape had 19%, the lowest in the country in comparison to the other provinces. Although provinces were not ranked according to Bachelor passes, that could have contributed to why that province was where it was; the rankings were actually determined according to NSC passes. Indeed, poverty played a role there, and DBE always dealt with that when announcing the results. However, Limpopo, KwaZulu-Natal and North West were also very poor. Poverty therefore could not be the only explanatory factor for why the Eastern Cape was in the position it was in.
DBE was really strengthening its monitoring, especially in curriculum. The subject choice problem unfortunately affected poor schools more, because some schools asked learners what subjects they wanted to do without considering the teaching staff available. There would be three learners wanting to do economics and two others wanting something else, and the school would cater for that. The moment a school did that, it added more learners to the English-language class. Teachers were allocated according to learner numbers, therefore schools needed to find the most efficient combination of subjects to offer. DBE had done a study where it had found that high performing schools did not offer low enrolment subjects.
Professor Poliah said that there were clear criteria, outlined in the policy for marker appointments, that considered qualification as well as experience. Per qualification, the criteria were clear that the marker needed to have done the subject at a second-year level of diploma or degree study. Per experience, two years had to be grade 12 teaching experience, with an additional three years that could be from anywhere in the schooling system, totalling five years of teaching overall. DBE made sure that every marker application was validated and verified by the school principal and a district official. In the absence of a competency test, DBE felt that this did not imply that anyone could mark, or that its marking standards were being compromised. No marker marked an entire question paper. A marker worked with one or two questions, which meant that the knowledge required was limited to one content area, and training was always given before the marking began. For every five markers there was one senior marker, and for every five to seven senior markers there was a deputy chief marker, so every script went through that pyramidal structure of moderation, which ensured that quality standards were maintained.
As he had mentioned before, LO had a common task that had been running for the past two years, and constituted about 20% of the final mark. That task was set by a panel at DBE and was centrally marked there, as well as at provincial level, to make sure that the standard in that aspect was controlled. Additionally Umalusi did its normal statistical analysis before it determined whether it would accept those LO marks as raw.
The training of subject advisors that DBE had alluded to was to be specific to SBA, because DBE had found that teachers were unable to design an assessment task that was of the quality and standard for SBA. The best option then would be to expose subject advisors to that skill, where the subjects of interest would be mathematics, physical science, history, geography, economics, life sciences and business studies.
None of the schools that had achieved zero percent in 2012 appeared on the list again in 2013. DBE always found that such schools were new, had small numbers, were independent, or were sometimes adult centres, but the detail could be provided if it was requested.
The diagnostic report was a detailed document that looked at the key subjects, with a breakdown in terms of problem areas, per content area and per question, together with remedial measures. Dr Mweli currently was going to provinces with curriculum implementers and advisors to mediate the contents of that document, so that they could support the implementation of that document at all levels, in terms of feedback to learners.
Dr Mweli said that mathematical literacy was for social use, to the extent that the Umalusi Chairperson had proposed that DBE should consider offering it to all learners, even those taking mathematics if they so wished. For the international benchmarking that had been done on question papers with the four assessment bodies, mathematical literacy had received the most favourable and positive report. DBE had been directed to formulate three forms of mathematics in total, for example adding technical mathematics so that learners could have more choice.
Of the 50 000 increase in learners doing numeracy, 17 000 went into mathematics. There had been an increase in the learner numbers between 2012 and 2013 in enrolment for mathematics. That was because two years ago, the DG had sent a circular to schools that said a learner could not do physical science with mathematical literacy -- it had to be with mathematics proper. Provinces had requested recently that a similar approach be applied to accounting.
Regarding the value of the NSC, one had only to read the recent South African Broadcasting Corporation (SABC) report to get a sense of its value. There were individuals who had gone to extraordinary lengths to get illegitimate NSC documents, just to keep their jobs.
The Chairperson said that it was a challenge for all state officials to communicate what DBE had reported on, specifically the mathematical literacy and 30% debates.
Mr M de Villiers (Western Cape, DA) asked how staff vacancy rates at schools affected the curriculum offered by schools, and what implications that had for the NSC results. Similarly, with Computer Applications Technology (CAT), schools in certain provinces had not been offering that subject. What was the impact of that on the results from those schools? Were schools compelled to offer it, and if so, what were DBE’s interventions to ensure that schools came on board? He was vehemently opposed to labelling any irregularities as insignificant, as Umalusi had done, because even a single irregularity was significant in an examination situation. He therefore wanted further detail on the exact irregularities. There was deep concern over the decrease in mathematics passes in the various provinces and districts, as outlined in DBE’s presentation. Was DBE aware of why that was happening? How was the percentage pass rate affected by the number of actual enrolments for the NSC exams versus the number of learners who had actually written them? Which numbers were used there?
Ms Mncube said it was important for DBE to advise schools that failed learners and then referred them to FET colleges that that tendency would entrench a negative stigma of poor cognitive capacity against those learners, resulting in their falling through the cracks and never finishing school. Did DBE have a mechanism to ascertain that the teachers setting NSC question papers had the capacity to set quality, cognitively challenging questions? Was moderation through sampling of certain questions the reason DBE had not picked up the issue with the dramatic arts question?
Dr Rakometsi said that the issue of dramatic arts was not linked in any way to sampling. The examiner appointed by DBE would set a paper, and an internal moderator also appointed by DBE would moderate the entire paper. Then the external moderator appointed by Umalusi had done the same, but at the time Umalusi’s moderator had looked at the paper he had seen nothing wrong with it. It was only after the public outcry that he had come back to Umalusi to say that even his children had challenged him over the oversight, and had admitted that there might have been something wrong. That was why Umalusi had publicly apologized for the inclusion of that question in the paper.
Sampling in moderation happened at the marking centres, where Umalusi took a sample of scripts and remarked them to check whether the marking was at the right level, but not in terms of the approval of question papers. There were people appointed to look at all questions set for the exams. What had happened there was simply an oversight, based on the judgment of three layers of the structure – namely the examiners, internal and external moderators.
Dr Soobrayan said that DBE did two things in terms of curriculum coverage, where schools had vacancies. Mr Mweli and his team went from province to province, checking curriculum coverage. That was mainly because DBE felt that if it did not monitor curriculum coverage, it would not get results at the end. The issue extended beyond grade 12, in that DBE worked with districts across the board to ensure that they did their own monitoring. The responsibility for making up for vacancies lay squarely with the school, the circuit and the district, even though DBE monitored that as well. However, there were instances when that issue fell through the cracks, but DBE was working with provinces to address this.
It was indeed correct that there was no such thing as an insignificant irregularity, but the DBE’s use of the term was colloquial in exam terms; it certainly did not imply unimportant. The test was to ask whether, when all the irregularities were taken together, they impacted on the credibility of the exams. When they were described as insignificant, that meant that cumulatively those irregularities did not undermine the credibility of the exams as a whole.
DBE calculated the percentage by dividing the number of learners who had passed by the number of learners who had written. The reason was that those who did not write were generally those who were ill, and as such they would get an opportunity in the supplementary exam.
Key assessments like the Annual National Assessment (ANA) were to tell educators that those were the assessment standards, how they were supposed to be done, and how they could be improved.
Mr Mweli replied that CAT and Information Technology (IT) subject offerings were usually managed at provincial level. The regime was quite strict, at both national and provincial DBE level: schools offering those subjects had to have the necessary infrastructure -- from computer labs to the actual educators -- to teach those offerings.
The bulk of schools experiencing downward trends, or that obtained zero percent NSC achievement, were generally first-time NSC presenters or independent schools, and DBE’s business was not to care for independent schools, except to monitor them. Those schools consciously committed to certain standards when registering as institutions of learning, and to that effect, provinces were supposed to deregister them if they underperformed consistently. If public schools underperformed consistently for two years, principals would be notified and given one year, with some additional resources, to improve the standard. Failure to do so resulted in the removal of the principal from the school, with disciplinary action following.
The Chairperson thanked everyone present and noted that the ANA initiative was the best way forward at that moment, as focusing on NSC results only without dealing with the grades leading up to the NSC, would be illogical.
The meeting was adjourned.