National Senior Certificate Examinations 2010: briefing by Department and Umalusi; Election of Chairperson

Basic Education

18 January 2011
Chairperson: Ms M Malgas (ANC)

Meeting Summary

The Department of Basic Education (DBE) briefed the Committee on the outcomes of the 2010 National Senior Certificate (NSC) Examinations. The national pass rate had increased by 7.3 percentage points, from 60.6% in 2009 to 67.9%. While DBE was happy with the progress, it recognized that with a 32.1% failure rate there was no reason to rejoice. Of the 58 subjects presented to Umalusi for standardization of results, the raw marks of 39 subjects were retained; in 9 subjects the marks were adjusted upwards and in 10 subjects they were adjusted downwards. Mathematics and Physical Science passes were still below target. The report presented was technical in nature; after receiving feedback from each school and district in all the provinces, DBE would invite the Committee to engage on a comprehensive analysis of the results.

Members asked what measures were in place to improve under-performance; why more than 21 000 students who enrolled did not write the exams and how the drop-outs were accounted for; and whether DBE’s focus on improving pass rates, as well as schools protecting their reputations, was perhaps encouraging students to drop subjects or not write examinations. Members asked whether the performance figures for mathematics differentiated between maths literacy and pure maths and whether a third maths paper was written; how many students from the Class of 2010 had to date not received their results and in which provinces they were; and why there had been a significant increase in part-time student figures. Members further asked what the process was for standardizing results by Umalusi and DBE; the criteria for appointment of markers and how DBE ensured that all markers had the same basic understanding of the procedure; what intervention measures were in place for Special Schools to improve their performance; whether the Dinaledi Schools Project had been effective in assisting students with their school results; and what was being done about teacher salaries to prevent strike action.

Umalusi presented on the Quality Assurance of the NSC and areas of concern for 2010. Standardization was applied to the NSC results to ensure that learners were not advantaged or disadvantaged by factors other than knowledge of the subject and aptitude. The process was observed by international bodies, as well as Higher Education and the SA Qualifications Authority, and conducted in a transparent manner. The process was confidential but not secret. Since the release of the 2010 matric results, there had been lively debate on why Umalusi was not willing to disclose the decisions made with respect to each subject. However, the process was highly technical and open debate would lead to the risk of erroneous interpretations. The public concern had demonstrated that all relevant bodies needed to engage in a public education exercise. Umalusi planned to educate the public on its Quality Assurance processes in 2011 to reduce confusion around standardization.

Areas of concern at the marking centres were: delays in commencement and in the capturing of marks; concern with the appointment of markers; the large number of marking centres in some provinces; and the security of papers. Concerns over non-attendance of relevant provincial delegates at pre-marking discussions had been addressed with DBE. The majority of irregularities were technical in nature and no major irregularities were reported. Umalusi was pleased with the administration of the 2010 NSC examination.

Members asked for information on Umalusi’s standardization of Independent Examinations Board (IEB) and Eksamenraad vir Christelike Onderwys results; whether the Ogive curve continued to be used as a measure for standardization; for clarification on the 10% latitude for adjustment of marks; whether bridging courses and other measures at universities undermined the credibility of the NSC; how DBE had responded to the new School Based Assessments model; and how Umalusi ensured security of information.

Meeting report

Ms N Gina (ANC) moved for the nomination of Ms M Malgas (ANC) as Chairperson of the Portfolio Committee on Basic Education and was seconded by Mr J Skosana (ANC). Ms Malgas accepted the nomination and congratulated the Department of Basic Education (DBE) and the Class of 2010 on the National Senior Certificate pass rate of 67.9%.

National Senior Certificate (NSC) 2010 Examination outcomes: briefing by Department of Basic Education
Dr Nkosinathi Sishi, Chief Director: Educational Measurement, Assessment and Public Examination, Department of Basic Education, indicated that the 2010 NSC success would not have been possible without the support of the Committee, particularly in Mpumalanga Province, where monitoring intervention had led to marked improvements in the administration of the 2010 examinations compared to previous years.

In 2009 the national pass rate was 60.6% and in 2008 it was 62.6%. This decline had been cause for concern for DBE and a decisive response was implemented in January 2010. In 2010, 67.9% (364 513 students) passed the NSC examinations. While DBE was happy with progress within the system, it recognized that with 32.1% of young people failing there was no reason to rejoice. DBE was proud that the public was at liberty to access the DBE policy documents, the Curriculum Framework and question papers on the DBE website. The transparency that characterized the NSC examinations was in line with the DBE constitutional mandate.

Markers appointed were mostly teachers with years of experience, and 18 experts drawn from the nine provinces interrogated marking memoranda before marking commenced. Internal and external moderators ensured quality assurance measures and the marking itself was controlled by national norms and standards. Each year, DBE had an obligation to comprehensively report any irregularities to Umalusi, and on 6 January 2011, after Umalusi had engaged with the report and confirmed that the integrity of the process was in line with regulations, the results were released.

Standardization was an important international practice to mitigate the effect of factors other than the learner’s knowledge and aptitude. The principles that underpinned the process had been in place for many years.

In 2010, the outcome of the standardization process was that out of a total of 58 subjects presented to Umalusi for standardization, the raw marks in 39 subjects were retained. In 9 subjects the marks were adjusted upwards and in 10 subjects they were adjusted downwards. The maximum mark adjustment of 10% was not applied in any subject.
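
To make the 10% latitude concrete, a minimal illustrative sketch in Python of how a capped adjustment of this kind could work follows; the function, the linear-shift model and the sample values are assumptions, since neither DBE nor Umalusi presented its actual algorithm.

```python
# Illustrative sketch only: a linear shift clamped to the 10% latitude
# described in the briefing. The shift model and names are assumptions.

def adjust_mark(raw_mark: float, proposed_shift: float, cap: float = 10.0) -> float:
    """Apply a standardization shift to a raw mark, clamped to +/- cap
    percentage points, and keep the result in the valid 0-100 range."""
    shift = max(-cap, min(cap, proposed_shift))
    return max(0.0, min(100.0, raw_mark + shift))

# A proposed upward shift of 12 points is capped at 10:
print(adjust_mark(55.0, 12.0))   # 65.0
# A downward shift within the cap passes through unchanged:
print(adjust_mark(55.0, -4.0))   # 51.0
```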

There were 6 540 examination centres in the nine provinces – 6 042 for Public, 463 for Independent and 35 for Special Schools – and a total of 5 915 schools. Analysis of quintile rankings (Q1-Q5) across the provinces, where Q1 was the highest poverty ranking, showed that 127 of the 504 schools which achieved 100% passes were from Q1-Q3. This finding was important for reflection and for future strategic measures. More detailed statistics on the performance within each quintile would be made available.

Of the 642 001 candidates in the class of 2010, the number of part-time students had increased compared to previous years. In 2009, there were 580 937 full-time and 39 255 part-time students; in 2010, there were 559 166 full-time and 82 835 part-time students.

Out of a total of 127 marking centres, there were only two in the Western Cape, where there were 53 276 students, and 33 in KwaZulu-Natal, where there were 150 928 students; this was one of the issues which would be addressed by DBE. There were 35 000 markers and a national norm of one senior marker for every five markers.

Passes at 30% and above in Mathematics and Physical Science had increased compared to the previous year but were still below target. Significant challenges had been identified and would be addressed in the Action Plan to 2014. In line with the Schooling 2025 vision, the focus was on three important principles - performance management, accountability and remediation - which included short- and long-term strategic interventions for effective delivery of the education mandate.

Discussion
The Chairperson commended DBE on the comprehensiveness of the report and asked DBE to include information on education of persons with disabilities in future presentations.

Mr Skosana asked what administrative measures, in terms of resources, were in place to achieve a 100% pass rate and how DBE would address schools which had under-performed.

Ms F Mushwana (ANC) asked how DBE planned to ensure that learners who enrolled for the NSC did in fact write the exams and how DBE would resolve the problem of the distances students had to travel to access examination centres.

Mr Z Makhubele (ANC) asked for more information on how standardization of results ensured that a cohort of learners was not unduly advantaged or disadvantaged by fluctuations in results. He also asked how the internal moderators ensured that all markers had the same basic understanding of the procedure and of what was required in terms of their task.

Mr Makhubele asked why one slide describing quintile rankings of Public Schools showed 5 915 examination centres and another showed a total of 6 042 examination centres for Public Schools. He also asked how the number of students who failed the year would be included in the statistical figures.

Mr Makhubele asked for clarification on how the gap between male and female pass rates was closing - unless DBE was looking at performance alone - as enrolment was a variable. He also suggested that management of the examination centres be tightened to the same standard as there were notable disparities between them.

Mr Makhubele asked DBE to share with the Committee what was being done about teacher salaries to minimize strike action and whether the situation was monitored, as strikes were an expensive disruption to the education of learners. He also said that, from his observation of the figures, it was not acceptable that there was a 0% pass rate or even a 50% pass rate at any school, irrespective of how few students wrote the exam at that school.

Mr K Dikobo (Alt) (AZAPO) said that for the past two years some students had not received results and ‘lost a year’ while DBE was investigating irregularities. He asked how many students from the Class of 2010 had to date not received their results and in which provinces they were.

Mr Dikobo also asked what the process was for both Umalusi and DBE to standardize results and why more than 21 000 students who enrolled did not write the exams. He suggested that small farm schools had low pass rates because these schools were originally primary schools and had grown without increasing teacher capacity. He suggested that DBE implement intervention measures at such schools and that DBE should also engage on addressing inequalities between schools.

Ms S Molau (COPE) asked how the drop outs were accounted for alongside the improvement in the overall pass rate.

Mr W James (DA) said that the fact that high schools were not contributing to the Human Resources Development Strategy in terms of scarce skills - engineering, accounting, health sciences, medicine - was a major challenge for DBE. The 2010 mathematics results showed that at the 30% mark, 47% of students had passed and 53% had failed; at the 40% mark, 30% of students had passed and 70% had failed. He asked for a breakdown of the pass rates at cut-offs of 50%, 60%, 70%, 80% and 90%. Modelling suggested that at the 50% cut-off, 76% of the nation’s students had failed high school mathematics. Together with the hard science subjects, this was a problem for universities. He suggested an accelerated programme for teacher development - both pre-service and in-service - as well as smaller class sizes for hard science teaching.
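
As context for these figures, the pass rate at a given cut-off is simply a point on the cumulative distribution of marks. A minimal sketch, using invented sample marks rather than actual 2010 data:

```python
# Pass rate at a cut-off: the share of candidates at or above that mark.
# The sample marks are invented for illustration only.

def pass_rate(marks: list[float], cutoff: float) -> float:
    return 100.0 * sum(1 for m in marks if m >= cutoff) / len(marks)

sample_marks = [22, 28, 31, 35, 42, 47, 55, 63, 71, 88]
for cutoff in (30, 40, 50, 60, 70, 80, 90):
    print(f"cut-off {cutoff}%: {pass_rate(sample_marks, cutoff):.0f}% pass")
```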

Mr James asked for an explanation as to why there had been growth in part-time student figures from 1 116 in 2008 to 82 835 in 2010, as this significant figure would have an impact on the pass rate. He also said it was of concern that Zulu-speaking (first language) markers were marking history papers which were in Afrikaans. His view was that Zulu-speaking students should be able to learn history and write the history exam in Zulu. He asked what plans were in place to improve the quality of marking.

Ms A Mashishi (ANC) said that infrastructure was key for improvement of matriculation results.

Mr D Smiles (DA) asked if the integrity of Umalusi as an independent body was compromised by having a representative of Umalusi in the National Examinations Irregularities Committee – a Ministerial Committee.

Mr Smiles said that it was difficult to accept that the irregularities at marking centres in question were classified as technical in nature and that there were no non-technical irregularities. He had heard of irregularities at two marking centres in the Eastern Cape and asked for a response regarding those reports.

Mr Smiles asked how DBE’s underlying capabilities had led to improvement in results when there was evidence from the HSRC’s report that teachers were striking and were not in classrooms and that DBE’s capability was lacking.

Ms M Dunjwa (ANC) (Science and Technology Portfolio Committee) asked if the Dinaledi Schools Project had been effective in assisting students with their school results. She suggested that DBE and the Department of Science and Technology should focus on Foundation Level to ensure more positive examination results at high school.

Ms Gina asked for clarity on whether the performance figures for mathematics differentiated between maths literacy and pure maths and whether a third maths paper was written. She also asked what intervention measures were in place for Special Schools to improve their performance.

Mr N Kganyago (UDM) asked how markers were appointed. He was concerned that markers who did not teach the subject were marking the subject.

Ms C Dudley (Alt) (ACDP) asked if DBE’s focus on improving pass rates, as well as schools protecting their reputations, was perhaps encouraging students to drop subjects - particularly from maths to maths literacy - or not to write their exams, to prevent possible failure.

Ms A Mda (COPE) cautioned that one should not celebrate until all students had received their results. In 2008 and 2009, the figures changed once all the results were released.

The Chairperson said that due to time constraints, DBE was required to answer the questions briefly and furnish detailed answers in writing to the Committee.

Mr SG (Paddy) Padayachee, DBE Acting DDG: Planning, Quality Assessment and Monitoring and Evaluation, said that DBE would respond to all questions in writing. DBE was aware of the challenges and the 2011 examinations would put DBE to the test. In due course the Minister would announce intervention plans for 2011, with the three priorities being delivery of the curriculum, teacher development and infrastructure.

Dr Sishi said that, given the time constraint, some answers to questions could be found in the Annual Report. He assured Members that DBE was competent in dealing with the issues in a comprehensive way. The report was technical and not an analysis of the results. At a further meeting, after receiving feedback from each school and district in all the provinces, the Committee would be invited to engage with DBE on a comprehensive analysis of the results.

DBE was addressing the issue of some examination centres being difficult to access and also some registered marking centres not complying with DBE regulations. The issue of standardization of results would be dealt with in detail by Umalusi. The standardization process involved internationally recognized measures to ensure integrity for learners.

DBE’s main focus was teacher development. This would address the challenge of quality of markers appointed by DBE and involved many stakeholders. Targets for teacher development were set out in the Schooling 2025 Plan and specific measures would be announced in due course.

Umalusi had a clear oversight role in the National Examinations Irregularities Committee (NEIC) and an Umalusi representative was not there on behalf of DBE but represented Umalusi. The executive of the Ministerial NEIC included Umalusi, Higher Education (on behalf of the universities of South Africa), as well as the South African Qualifications Authority (SAQA).

The intent of dealing with irregularities was to ensure that the structures and stakeholders within the system were in line with the law. Irregularities were clearly defined by law. DBE presented its report and Umalusi declared whether or not there were problems. DBE would welcome policy amendments which would enhance the integrity of the system.

Question papers were adapted to accommodate learners with special needs at Special Schools. A parent or child could apply for recognition of a disability and the legal process was public knowledge. DBE would continue to give this area of education attention.

The Chairperson said that the Committee looked forward to engaging with DBE on analysis of the results. She asked for a time line on the turn-around strategy and teacher development projects; for a discussion document on the drop-out rate of learners and to address the distribution of Special Schools in the country.

Umalusi on outcomes of the National Senior Certificate (NSC) Examinations for 2010
Professor Sizwe Mabizela, Chairperson: Umalusi, said that the NSC, which was in its third year of existence, was maturing and stabilizing. Umalusi had the responsibility of ensuring that standards were consistent and of the highest level.

Standardization, also referred to as statistical moderation of raw marks, was an evidence-based process applied to the NSC results to ensure that learners were not advantaged or disadvantaged by factors other than knowledge of the subject and aptitude. Any assessment system had sources of variability which were unintended, unplanned and often undesirable.

Though raw marks were often assumed to be a true reflection of a candidate’s capability, it was only after analysis of the performance of current candidates and of previous cohorts, and paired analysis (of a candidate’s performance across subjects), that problems which might have crept in could be identified and examination papers be deemed too easy or too difficult.
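
As a rough illustration of the paired analysis described above, the sketch below compares candidates’ marks in one subject with their average across their other subjects; a large systematic gap could suggest that the paper, rather than the cohort, was unusual. The data and the flagging threshold are invented assumptions, not Umalusi’s actual method.

```python
# Illustrative paired analysis: does a subject's paper look unusually
# hard or easy relative to how the same candidates fared elsewhere?
# The data and the -5 point threshold are invented assumptions.

candidates = {
    "A": {"Mathematics": 48, "English": 62, "History": 59},
    "B": {"Mathematics": 40, "English": 55, "History": 50},
    "C": {"Mathematics": 52, "English": 60, "History": 64},
}

def paired_gap(subject: str) -> float:
    """Mean of (subject mark minus the candidate's average in other subjects)."""
    gaps = []
    for marks in candidates.values():
        others = [m for s, m in marks.items() if s != subject]
        gaps.append(marks[subject] - sum(others) / len(others))
    return sum(gaps) / len(gaps)

gap = paired_gap("Mathematics")
print(f"Mathematics paired gap: {gap:+.1f} points")
if gap < -5:  # arbitrary illustrative threshold
    print("The paper may have been harder than the cohort's other papers.")
```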

Independent researchers of impeccable integrity and with years of experience were appointed to report on their analysis of the paper and the quality of the paper compared to previous years. Umalusi then decided on the appropriate course of action.

In 2010, out of a total of 58 subjects, the raw marks in 39 subjects were retained. In 9 subjects the marks were adjusted upwards and in 10 subjects they were adjusted downwards. (See handout for more information on the limitations for adjustment.)

The standardization process was observed by international bodies, as well as Higher Education and SAQA and conducted in a transparent manner. The process was confidential but not secret. Since the release of the matric results, there had been lively debate on why Umalusi was not willing to disclose the decisions made with respect to each subject. The process was highly technical and debate would lead to the risk of erroneous interpretations. The variations and fluctuations which enter the examination process were not of the learner’s making and the learner would not understand the process leading to adjustment of their raw mark.

Mr Vijayen Naidoo, Umalusi Senior Manager: Quality Assurance of Assessment, said that in 2010, of the 130 question papers, 47 were approved at first moderation, 71 at second moderation and 12 at third moderation. (See handout for more detail on question papers which were approved and those which had to be re-set before approval.)

Umalusi had implemented a new model in 2010 for moderation of School Based Assessments (SBA). The model focused on tasks rather than educator portfolios; it was ongoing across the provinces; and it allowed immediate feedback to educators. Umalusi believed that SBA moderation could be strengthened and that feedback should be directed at educators as well as learners. In 2010, Umalusi monitored the state of readiness of learners together with DBE, to avoid duplication and to establish the veracity of the DBE monitoring processes. Umalusi also monitored the writing of the NSC examination at the examination centres (162 of 6 540 centres) and the marking of the NSC examinations (45 of 127 centres). Areas of concern at the marking centres were: delays in commencement and in the capturing of marks; concern with the appointment of markers; the large number of marking centres in some provinces; and the security of papers. Concerns over non-attendance of relevant provincial delegates at pre-marking discussions had been addressed with DBE.

Mr Mafu Rakometsi, CEO: Umalusi Council, highlighted the areas of concern: filtering of candidates in Grade 12 as a means of achieving a higher pass rate; late submission of question papers for moderation; moderation of SBA; monitoring of exams, with improper registration and certain invigilators not suitable; the appointment of markers and consultation with stakeholders when changes were made in relation to the appointment of markers; non-attendance of relevant provincial delegates at marking/memoranda meetings; potential administrative and management problems in Limpopo, KwaZulu-Natal and Free State; the number of marking centres; and irregularities as a result of registration-related problems. The majority of irregularities were technical in nature and no major irregularities were reported. In conclusion, Umalusi was pleased with the administration of the 2010 NSC examination.

Discussion
Mr Dikobo asked for clarification on why a Member of the Umalusi Council had publicly criticized Umalusi’s decision not to disclose standardization decisions. He asked for information on the role of Umalusi in the standardization of the Independent Examinations Board (IEB) and Eksamenraad vir Christelike Onderwys (ERCO) results.

Mr Rakometsi said that the person was not a Member of the Umalusi Council and had not discussed the issue with Umalusi. The IEB adjustments by Umalusi were: of 53 subjects, 11 were adjusted upwards, 4 were adjusted downwards and raw marks were used in 38. For ERCO: 22 subjects were presented, 7 were adjusted upwards and none were adjusted downwards.

Mr Smiles asked for clarity on the 10% adjustment given its historical use; whether the Ogive curve continued to be used as a measure for standardization; and whether the 10% latitude for adjustment of marks could be too high. He also asked how DBE had responded to the new SBA model.

Professor Mabizela replied that the Ogive was still used, particularly as a visual tool, but was complemented by qualitative and historical information from experienced independent sources. Umalusi used the broad band of 10% as a guideline; adjustments averaged 3% to 5%.
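
For readers unfamiliar with the term, an Ogive is a cumulative frequency curve. A minimal sketch of how such a curve might be tabulated and set against a historical norm, using invented data:

```python
# Ogive: cumulative percentage of candidates at or below each mark.
# Comparing the current cohort's curve with a historical one can suggest
# whether a paper was unusually easy or hard. All data here is invented.

def ogive(marks, points=range(0, 101, 20)):
    n = len(marks)
    return [(p, 100.0 * sum(1 for m in marks if m <= p) / n) for p in points]

current = [28, 35, 41, 47, 52, 58, 64, 70, 77, 85]
historical = [25, 31, 38, 44, 49, 55, 61, 68, 74, 82]
for (p, cur), (_, hist) in zip(ogive(current), ogive(historical)):
    print(f"mark <= {p:3d}: current {cur:5.1f}% vs historical {hist:5.1f}%")
```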

Mr James suggested that since the public was concerned about irregularities, Umalusi should take the extra step and reassure the primary stakeholders. That standardization was ‘standard practice’ was not an adequate response.

Ms Dudley agreed and added that in a democratic and transparent system, it was understandable that the public would demand a rational explanation for decisions - in writing. She believed that DBE had to find a way to reasonably explain its decisions.

Mr Rakometsi said that the public concern had demonstrated that all relevant bodies needed to engage in a public education exercise. It was important that in a democracy, people asked questions to understand why things happened. Umalusi planned to educate the public on Umalusi Quality Assurance processes in 2011 to reduce confusion around standardization.

Ms Mda asked to what extent Umalusi took responsibility for preventing leaks and the resulting public panic.

Mr Rakometsi answered that all people attending Umalusi meetings on the standardization processes had the proper credentials to attend, and that the books used in the standardization process were not allowed to leave the meeting room. He highlighted the fact that Umalusi remained an independent organization despite Umalusi observers attending NEIC meetings.

Mr Makhubele said that bridging courses and other measures offered by public higher education institutions for entry into those institutions undermined the quality assurance of the NSC by Umalusi. He asked how this problem could be resolved.

Mr Rakometsi said that South African universities did indeed respect the standard of the NSC and that research had shown that the NSC was the best predictor of performance at universities.

The Chairperson thanked Umalusi for their presentation and the meeting was adjourned.
