Our kids’ poor exam marks reflect an inability to move beyond rote learning to any true knowledge. Without a culture of questioning and a culture of error, it is going to be difficult for our education system to improve the quality of the matric passes. By MARYKE BAILEY.
Many South African classrooms lack basic resources: infrastructure, books, teachers. But many of the schools that are resourced in a bureaucratic sense still lack two fundamental learning tools: a culture of questioning and a culture of error. A culture of questioning is a classroom practice where both teachers and learners ask many questions across a range of cognitive levels. A culture of error is a classroom practice where teachers and learners recognise the value of mistakes in the learning process, and constantly reflect on correct and incorrect answers. Both practices require knowledgeable, secure and confident teachers.
Many South African teachers struggle to ask good questions, with most questions apparently limited to straightforward knowledge or recall. Sometimes the children chorus the answers back, or the same three children answer all the questions, or a teacher picks on petrified learners and humiliates them for not knowing the answer. Sometimes there are no questions at all, only instructions.
The poor quality of our teachers’ questioning techniques is only matched by the dearth of questions from the learners during class. Many student teachers have detailed how their own teachers shouted at them for asking questions, shutting down any further curiosity. I’ve seen whole lessons go by without learners asking any meaningful questions. Some of these classes only had a blackboard while others had smart boards and tablets. The latest technology does not guarantee an engaged lesson.
Children with access to a quality education have teachers who take time to ensure that their learners understand basic concepts and, importantly, ensure that they can apply the concepts in different circumstances. They engage with higher-order thinking and problem solving on a daily basis. Their ability to analyse new situations and to respond appropriately in unfamiliar contexts provides them with a clear and sustained advantage in tertiary education and in the labour market. With more cognitive exercise, privileged children have more real knowledge and understanding of the matric subjects than most disadvantaged children. This is driving our educational inequality gap.
Unfortunately, it’s completely possible to pass matric without demonstrating any real understanding of a subject’s content. If a learner scores 40% for a test, we often seem to take this to mean that she knows (only) 40% of the work. In the context of the state matric exams, it’s more accurate to say that the candidate could answer enough questions to garner 40% of the marks, but that she only really knows, or understands, between 10% and 20% of the work.
We need to read our matric results in conjunction with the prescribed weighting of the cognitive levels for each subject. A cognitive level refers to a question’s complexity. For most key subjects, Level 1 questions relate to basic knowledge or recall, sometimes with low-level comprehension. Level 1 questions tend to make up between 30% and 40% of the total exam marks. Level 2 questions contribute between 40% and 50% of the exam marks. They assess understanding, application and low-level analysis. Level 3 questions include some sort of evaluation. Some subjects will split the demands of a cognitive level so that there are four categories.
Linked here are two examples from the different Curriculum and Assessment Policy Statement (CAPS) documents for Agricultural Sciences and Life Sciences. You can view more subject tables here. The tables show the definition and prescribed weighting of the cognitive levels for all formal assessments in these subjects.
Agricultural Sciences and Life Sciences are two of the ten “key” or “gateway” subjects analysed in Part 1 of the 2017 National Senior Certificate Diagnostic Report. The other eight key subjects are Accounting, Business Studies, Economics, Geography, History, Mathematical Literacy, Mathematics and Physical Science. These are the ten most popular non-language subjects and their performance distributions, read in conjunction with the cognitive level weightings, can tell us a lot about the skills crisis we face in our education system.
Generally the averages for the languages are over 50%, and their distribution curves look like pyramids. Together with Life Orientation they can improve a candidate’s overall average. But the performance distribution curves for the key subjects are very skewed. Below is an example from Geography taken from the 2017 NSC Diagnostic Report (pg 82). Most of the other subjects (except History and Maths) follow a similar curve. To view and compare all the distribution curves for the gateway subjects and the languages, click here.
I collated some of the data for the gateway subjects. On average, for the 10 gateway subjects:
Nearly three-quarters of the candidates achieved less than 50%. We can account for low averages in different ways. Learners who scored a 40% average might have correctly answered about 40% of all the questions, regardless of their complexity. Perhaps they were just not able to provide full answers due to poor language skills or exam techniques. Maybe the candidates did well in some sections, and very badly in others. In both these scenarios it would probably be accurate to say that a candidate displays the ability to answer cognitively demanding questions, but only knows 40% of the work. If so, our solutions should focus on improving language proficiency, access to information, study skills and exam techniques. Language skills aside, providing various extra lessons and doing numerous past papers should be able to rectify these issues on some level.
But I think that in most cases a 40% average indicates that a candidate can answer most of the Level 1 questions, but they can only answer a small fraction of the questions that meaningfully assess true understanding and application. In this case, extra lessons and past papers won’t really help to improve the national average. The solution requires a much longer-term intervention since you can’t teach someone higher-order thinking skills in a few extra lessons. Thinking skills need to be modelled regularly.
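To make the arithmetic concrete, here is a minimal sketch of how such a mark could decompose. The weightings fall within the CAPS bands described earlier, but the specific figures and success rates are my own illustrative assumptions, not data from the Diagnostic Report:

```python
# Hypothetical mark weightings per cognitive level (within the CAPS bands
# described above; the exact split is an illustrative assumption).
weights = {"Level 1 (recall)": 40, "Level 2 (application)": 45, "Level 3 (evaluation)": 15}

# Hypothetical candidate: strong on recall, very weak on higher-order questions.
success_rate = {"Level 1 (recall)": 0.90, "Level 2 (application)": 0.08, "Level 3 (evaluation)": 0.0}

# Final mark = sum of (marks available at each level x fraction answered correctly).
total = sum(marks * success_rate[level] for level, marks in weights.items())
print(f"Final mark: {total:.1f}%")  # prints "Final mark: 39.6%"
```

A candidate like this walks away with roughly a 40% average while having answered almost nothing that tests real understanding, which is exactly why extra lessons and past papers cannot close the gap on their own.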
There are different ways of asking Level 1 questions, and the state matric exams seem to do so on the easiest level. The History papers virtually give the marks away. Easy questions are not limited to History. Below is a screenshot of Paper 1, Question 3.2 from the 2016 Geography exam (the 2017 exams are not available online as yet). Question 3 is out of 75 marks, so Question 3.2 accounts for nearly 10% of the section’s marks. I would assume that these are all Level 1 questions. How many of these questions can you answer?
What a Level 1 weighting of 30% or 40% implies in the state matric exams is that candidates do not have to display any meaningful understanding of the subject to get enough marks to pass.
The average mark for Question 3.2 was 77%. The very next question (Question 3.3) asked candidates to apply their knowledge to a diagram, and the sub-questions included a range of cognitive levels. The average mark for Question 3.3 was much lower at 39% (2016 NSC Diagnostic Report). This is only one example, but I think we can use it and reasonably extrapolate that our kids’ poor marks reflect an inability to move beyond rote learning to any true knowledge. Without a culture of questioning and a culture of error, it is going to be difficult for our education system to improve the quality of the matric passes.
This was further illustrated by the different subject reports in the 2017 NSC Diagnostic Report. The same issues kept rearing their head across different disciplines. Students could often provide textbook definitions for basic concepts, but demonstrated very little understanding. A lack of attention to accuracy, and weak English language skills, also hindered candidates. Many struggled with basic subject terminology, while also showing a weak understanding of what the questions required from them. Poor language skills also filtered into the way questions were answered. The History report repeatedly mentions learners’ inability to write logical or coherent paragraphs, and other subjects often noted that candidates provided one-word answers for questions that required at least a sentence.
Undoubtedly language plays a role in many matrics’ inability to understand and answer higher-order questions. Most learners are not writing the exam in their home language. This gives a decided advantage to those who do, or who attend a school where English or Afrikaans proficiency is the norm. Solving the language issue is central to improving our education. Yet, I would argue that even if learners wrote the exams in their home languages, the achievement rates would not necessarily be significantly different.
The 2012 National Education Evaluation and Development Unit (NEEDU) report showed that learners in the foundation phases were adept at answering lower-level questions in reading comprehension exercises, but struggled to answer higher-order questions. The 2016 PIRLS results also showed us that most South African Grade 4s struggle to read for meaning in their home language. The NEEDU report suggested two possible reasons why most young children could not answer higher-order questions. Either their teachers lacked subject-specific higher-order thinking skills, or the teachers were unable to teach the subject’s higher-order thinking skills. I’ve argued elsewhere for the former point. These reports focused on the younger grades, but our matric results reflect the consequences of the challenges picked up in our primary schools.
If we continue to produce new teachers who don’t have good subject knowledge and are unable to model complex thinking, we will continue to have a workforce that is not skilled enough to meet the demands of a modern, industrialised economy. Our businesses and industries will not grow at the rate they should, or could, thus affecting our economic development.
Furthermore, we can kiss goodbye to the fourth industrial revolution. The majority of matriculants entering the labour market might have a certificate that proves that they can answer routine questions, but it does not prove their ability to communicate adequately or use information flexibly.
If we continue to protect under-performing teachers and provide them with no incentive to improve their knowledge and skills, our inequality gap will merely continue to grow. A minority of matriculants will be highly employable, while the vast majority will struggle to find a space for themselves in a labour market that demands true understanding and complex problem solving. DM
Photo: Charmaine Phoofolo revises ahead of the start of the 2017 matric examinations. Photo: Willem van der Berg