Opinionista

Matric 2016: The numbers can be deceiving

Gavin Davis is CEO of Resolve Communications and a former Member of Parliament. He writes in his personal capacity.

Now that the dust is settling on the 2016 matric results, it is worth reflecting on the numbers to see what they tell us about the state of basic education in our country.

Our national obsession with the pass rate – fuelled by the Department of Basic Education (DBE) – tells us that we have improved from a national pass rate of 70.7% in 2015 to one of 72.5% in 2016.

It is unfortunate that this nominal improvement is taken as a sign that the “system is on the rise” – to use the somewhat clumsy parlance of the DBE.

For one thing, the pass rate is but one indicator of whether the system is moving in the right direction or not. How are we doing in maths and science, the two subjects critical for our country’s development? How many learners attained access to tertiary education? What was the gap between the best performing schools and the worst? Is it closing or widening? And so on.

The pass rate can also mask what is really happening behind the numbers. It is expressed as a percentage of the learners who wrote the exam, and takes no account of learners who dropped out of the system. This means that a school, district or province can attain a higher pass rate simply by ensuring that fewer weak learners write the exam.

Take the Free State, for example. The Grade 12 (matric) class of 2016 was the Grade 10 class of 2014. In 2014, there were 55,293 learners enrolled in Grade 10 in the Free State. But, in 2016, only 26,786 of those learners actually wrote matric. If we take the number of learners in the Free State who obtained a matric pass (23,629) and divide it by the number of learners who enrolled in Grade 10 in 2014, we can calculate a “real pass rate” of 43%.

Now let us compare this with the Western Cape. In 2014, there were 75,791 learners enrolled in Grade 10. In 2016, 50,869 of those learners wrote matric and 43,716 passed. Using the same method as for the Free State above, we can calculate a “real pass rate” for the Western Cape of 58%.

In other words, the Free State’s claim to be the best performing province (with a pass rate of 88.2% compared with the Western Cape’s 86.0%) is misleading. Any assessment of performance must take into account the number of learners retained in the system. It is clear that, in the Free State, relatively fewer learners make it to matric, which is why the pass rate is high.

The national picture is illuminating in this regard. In 2014, there were 1,100,877 learners enrolled for Grade 10, but only 442,672 went on to pass matric in 2016. This gives us a national “real pass rate” of 40%. We need to start asking what happened to the hundreds of thousands of learners who fell away. Did they exit the system? If so, why? Or are they stuck in Grade 10 or 11, unable to progress?
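
For readers who want to check the arithmetic, here is a minimal sketch in Python that reproduces the “real pass rates” above, using only the enrolment and results figures cited in this piece:

```python
# "Real pass rate": matric passes in 2016 divided by Grade 10 enrolment
# in 2014 - the same cohort tracked over three years.

cohorts = {
    # name: (Grade 10 enrolment in 2014, matric passes in 2016)
    "Free State":   (55_293, 23_629),
    "Western Cape": (75_791, 43_716),
    "National":     (1_100_877, 442_672),
}

for name, (enrolled_2014, passed_2016) in cohorts.items():
    print(f"{name}: {passed_2016 / enrolled_2014:.0%}")

# Free State: 43%, Western Cape: 58%, National: 40%
```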

Another aspect of the system that we need to question is the manner in which the matric papers are standardised by Umalusi, the quality council for education.

We know that 32 of the 58 matric subjects had their marks adjusted this year during the standardisation process. Of the 32 adjusted subjects, 28 had their marks adjusted upwards and four downwards.

Some of the subjects saw a dramatic upwards adjustment. Mathematical Literacy, for example, was adjusted from a mean raw score of 30.06% to 37.22% – an upwards adjustment of 7.16 percentage points. According to Umalusi, it was justified in raising the raw marks to bring them in line with the historical mean (from 2011 to 2016) which, in the case of Mathematical Literacy, was 37.20%.
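
As a quick sanity check on Umalusi’s published figures, this sketch confirms the size of the adjustment and how closely the adjusted mean tracks the historical mean:

```python
# Mathematical Literacy standardisation figures, per Umalusi.
raw_mean = 30.06         # 2016 mean raw score (%)
adjusted_mean = 37.22    # mean after the upward adjustment (%)
historical_mean = 37.20  # historical mean, 2011-2016 (%)

print(f"Upward adjustment: {adjusted_mean - raw_mean:.2f} percentage points")       # 7.16
print(f"Adjusted mean vs historical mean: {adjusted_mean - historical_mean:+.2f}")  # +0.02
```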

Now, it is true that an upward (or downward) adjustment may be warranted if it can be shown that the examinations were demonstrably harder (or easier) than in previous years. I have asked for evidence that the papers were more cognitively demanding, but Umalusi has been strangely reluctant to provide it. This is despite Umalusi’s stated commitment to “making its processes transparent to all who have an interest in the examinations”.

Umalusi’s reluctance to provide evidence fuels the suspicion that the upward adjustments in many subjects were not due to the papers being more difficult, but for other reasons. And to understand this, we need to look at the impact of including “progressed” learners in the standardisation process.

A progressed learner is one who was pushed through to matric despite not meeting the pass requirement for Grade 11, in line with the department’s progression policy. This is the second year that the progression policy has been in force. According to Umalusi, 109,400 progressed learners (13.4% of the total enrolment) wrote the National Senior Certificate (NSC) examination in 2016, up from 66,088 in 2015.

In other words, there was a significant increase in the number of weaker students (i.e. progressed learners) who wrote the NSC this year compared to previous years. This raises the question of whether the inclusion of progressed learners in the standardisation process leads to certain anomalies.

Let us go back to the example of Mathematical Literacy to illustrate. This year, according to Umalusi, 389,015 learners wrote Maths Literacy. We do not know precisely how many of these were progressed learners, but it is likely that most progressed learners would have opted for Maths Literacy instead of the more cognitively demanding Mathematics (it is compulsory to do one or the other). Given that 109,400 progressed learners wrote the NSC, as many as one in four of the learners who wrote Maths Literacy could have been progressed learners.
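
Umalusi does not publish the split, so the share can only be bounded. A rough sketch, in which the 85% uptake figure is purely an illustrative assumption:

```python
progressed_nsc = 109_400     # progressed learners who wrote the 2016 NSC (Umalusi)
maths_lit_writers = 389_015  # learners who wrote Mathematical Literacy in 2016

# Upper bound: every progressed learner opted for Maths Literacy.
print(f"Upper bound: {progressed_nsc / maths_lit_writers:.0%}")           # 28%

# If, say, 85% of them did (an assumption for illustration, not a known figure):
print(f"At 85% uptake: {0.85 * progressed_nsc / maths_lit_writers:.0%}")  # 24%
```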

Given this injection of weaker learners into the cohort who wrote Maths Literacy, it follows that the drop in the subject’s raw mark (30.06%) from the historical mean (37.20%) may not have been due to the increased cognitive demand of the examinations, but because of the increased number of weaker, progressed learners who wrote them.

Under these circumstances, adjusting the mean raw score up by 7.16 percentage points to 37.22% would not be justified. It would mean artificially inflating the marks of progressed learners simply because they were weaker, not because the papers were more difficult. And, like a rising tide that lifts all boats, the marks of non-progressed learners would be adjusted upwards as well.
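
To see how this mechanism could play out, consider a toy simulation. The sub-cohort sizes and means below are invented for illustration, and the flat upward shift is not Umalusi’s actual moderation procedure; the point is only that a weaker sub-cohort drags the raw mean down, and a blanket adjustment back to the historical mean then lifts everyone’s marks:

```python
import random

random.seed(0)

# Hypothetical cohort of 10,000 learners: non-progressed learners centred on
# a mean of 37%, progressed learners (about 13.5% of the cohort) on 22%.
# Both figures are assumptions for illustration only.
non_progressed = [random.gauss(37, 12) for _ in range(8_650)]
progressed = [random.gauss(22, 10) for _ in range(1_350)]

cohort = non_progressed + progressed
raw_mean = sum(cohort) / len(cohort)     # falls below the historical mean

historical_mean = 37.20
adjustment = historical_mean - raw_mean  # blanket upward shift for everyone

# "A rising tide lifts all boats": non-progressed marks rise too,
# even though the paper was no harder for them.
non_prog_after = sum(non_progressed) / len(non_progressed) + adjustment

print(f"Raw cohort mean: {raw_mean:.2f}%")
print(f"Upward adjustment: {adjustment:.2f} points")
print(f"Non-progressed mean after adjustment: {non_prog_after:.2f}%")
```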

Could it be that the inclusion of progressed learners in the standardisation process creates additional impetus to adjust the marks upwards, for reasons unrelated to the cognitive demand of the papers? And, if so, would this not mean that the marks end up higher than in previous years, when there were no progressed learners?

As with any statistical analysis, the numbers can be deceiving – especially if they are massaged in a certain direction or reported on selectively. Our job – as parliamentarians, members of civil society and journalists – is to dig behind the numbers to get a clearer picture of reality. DM

