
2. CHAPTER TWO

2.2 MONITORING MATHEMATICS EDUCATION

2.2.4 National Systemic Assessments

National assessments must address three policy issues:

(1) Issues relating to quality: these refer to quality teaching and learning aligned to the implementation of the curriculum, and to the mathematical knowledge learners exhibit as a result of engaging with the national assessment. (2) Issues relating to equity: the national assessment can help to determine how the education system is responding to gender, socioeconomic diversity, ethnic groups and school governance (public or private). (3) Issues relating to provision: a national assessment can provide evidence on the provision of education, such as the challenges of curriculum reform, the learner retention rate and its effect on teaching and learning, the restructuring of the education system, and factors associated with achievement (DFID, [S.a.]).

Reflecting on these policy issues that a systemic assessment must address, the studies reviewed point to equity and provision as the only issues addressed by TIMSS and SACMEQ, in which South Africa participates. Issues relating to quality are partially addressed, as in the TIMSS video studies; however, many challenges are encountered here, which raises questions about the validity of such results (Leung, 2014). Ferrini-Mundy and Schmidt (2005) report that the video studies were conducted in only three of the forty-seven countries that participated in the 2003 TIMSS, an indication that the video study did not adequately reflect the participating countries. The issue of quality thus remains unaddressed by the TIMSS and, by extension, the SACMEQ studies. The study by Koretz (2009) highlights the fact that systemic assessment of student achievement is important, yet in the context of the United States it continues to fail to provide clarity on performance. I observe that the same challenge the United States is experiencing is also evident in South Africa, which justifies the need for an alternative way of reporting and interpreting the results of national systemic assessments.

Now I focus on the South African Annual National Assessments (ANA). The principal aim of ANA is to monitor learner attainment at regular intervals, using nationally or provincially defined measuring instruments (DoET, 2002a). According to DoET (2002a), this form of evaluation compares and aggregates information about learner achievements so that it can be used to assist in curriculum development and the evaluation of teaching and learning. As mentioned in the current study, national systemic assessments must address three issues, namely quality, provision and equity (DFID, [S.a.]). As such, what ANA addresses, as described in DoET (2002a), does not cover these three issues in sufficient depth.

Reports such as DBE (2012a, 2013b & 2014) presented the following results for ANA testing between 2012 and 2014: (1) the national average in Grade 9 mathematics was 13% in 2012, 14% in 2013 and 11% in 2014; (2) the Western Cape was the highest-achieving province in Grade 9 mathematics, with 16.7% in 2012, 17% in 2013 and 13.9% in 2014; (3) Gauteng was second, with 14.7% in 2012, 15.9% in 2013 and 12% in 2014; (4) Limpopo was the worst-performing province, positioned last, with 8.5% in 2012, 9% in 2013 and 4% in 2014; (5) female learners achieved slightly higher results than males, nationally and provincially, in Grade 9 mathematics in 2012, 2013 and 2014; (6) analysis of schools in terms of the poverty index (quintiles) indicated that poor schools performed worse than affluent schools (DBE, 2012a, 2013b & 2014). Most of these findings are consistent with those reported from South Africa's participation in all the TIMSS and SACMEQ studies.

The results reported above reveal that there is already enough information on issues of provision and equity at the disposal of the DBE to redress them. However, these results report only aggregated scores across provinces, gender, and poor and affluent schools (DBE, 2014a). They have been reported in consecutive years and remain consistent. The need for another lens is pressing: the DBE needs to shift its focus to the other obstacles that national systemic testing must address in order to respond to the context and needs of South Africa. In short, it is evident that rich schools perform well and poor schools perform badly; rich parents can provide better education for their children (they send them to rich schools) whereas poor parents cannot (they send them to poor schools); and rich provinces are ahead in addressing issues related to provision and equity (Dunne et al., 2002; Koretz, 2009). The remaining challenge is the issue of quality, which has not been adequately addressed by systemic assessments at either international or national level.

A study by Graven and Venkat (2014) involved 54 teachers in 21 township and suburban primary schools in Johannesburg and Grahamstown, focusing on the teachers' experiences of ANA. The findings revealed the following: (1) learners in Grade 3 had serious problems reading the test questions, and the teachers' interpretation of the tests for them compromised their validity; (2) learners were tested on ANA content that they had not been taught; (3) teachers taught towards ANA at the expense of quality mathematics learning; and (4) the ANA marking guidelines did not allow multiple solution strategies, which was seen to disadvantage learners. Monitoring by district and provincial education authorities has revealed a need to empower teachers and subject advisors with the knowledge and skills needed to develop quality learning and assessment materials such as tests, assignments and projects (DBE, 2014a). An observation here is that teachers, as well as subject advisors, lack skills that are critical to addressing the challenges facing achievement in ANA testing.

A qualitative textual analysis carried out by AMESA (2012) on the 2012 ANA Grade 9 mathematics test focused on content coverage and cognitive level requirements. The findings of the analysis were: (1) only the word problems, a small portion of the test in question 3, challenged second-language speakers, which revealed that the test did not, in the main, pose serious language problems; (2) in some questions, learners struggled to answer because the formulae were not given; (3) some content, such as transformational geometry, data handling (statistics) and probability, amounting to 15.7% of the test, is taught only after September but was nonetheless tested, which could have disadvantaged some learners; and (4) the stakes of the ANA tests were low, which made learners not take them seriously. The analysis suggested that the obstacles that could have resulted in low achievement were psycho-genetic, didactical and epistemological obstacles. These results gave direction to the current study, hence its epistemological perspective.

The report on the 2012 ANA testing (DBE, 2012b) identified language and mathematics knowledge and skills as key challenges for learners who participated in the 2012 ANA. Learners' scripts were randomly collected and re-marked, and it was found that a majority of learners lacked the skills and knowledge of the grade in which they were placed. This indicated that, as learners progressed, there was a lack of systematic progression in the mathematics knowledge and skills learned in consecutive grades. The main challenge was to locate the origins of the problem.

To locate the source of these challenges, the following conceptual questions are raised: Is the problem in the teaching and learning, or is the problem in the ANA tests themselves? These questions are not addressed in the report and need to be researched. It may be important to revisit the aims of national systemic assessment testing and find alternatives that are relevant for the discursive educational context of South Africa.

National systemic assessments evaluate an education system, schools, students and sometimes teachers in a quest to provide evidence on learners' achievement at a particular stage of education in identified curriculum discourses (DFID, [S.a.]). In achieving these aims of national systemic assessment, Kellaghan et al. (2009) argue that the testing takes two forms: (1) census-based, in which all schools and learners in the targeted population participate, and (2) sample-based, which uses only sampled schools. DFID ([S.a.]) explains that a sample-based assessment has three advantages: the cost is lower, the turn-around time is faster, and the quality of the data is higher because closer supervision is possible. In the South African context, census-based assessment provides information about all schools, all districts, all provinces, and the education system in general (DBE, 2012a); however, the cost is higher, the quality of the data is lower owing to weaker supervision, and the turn-around time is longer (DFID, [S.a.]).

If poorly performing schools are sanctioned and results are published, the assessment becomes 'high stakes', which may have the following negative effects: neglect of main curriculum areas in favour of the national systemic assessment; teaching characterised by rote memorisation and drill for the national systemic assessment at the expense of higher-order reasoning, rich mathematics and problem-solving skills; and teachers focusing on low-performing learners to make the school results look good (DFID, [S.a.]; Kellaghan et al., 2009). Such challenges are evident in ANA testing in South Africa. For example, aggregated scores reported in provinces and districts for Grade 9 mathematics show a downward slide, and this could be a reflection of the practices of the DBE in the ANA (DBE, 2014a). Such challenges may be related to the quality of the data, the turn-around time or the quality of supervision, which the current study must address.