
2. CHAPTER TWO

2.2 MONITORING MATHEMATICS EDUCATION

2.2.2 International Systemic Assessments

The use of international systemic testing has been justified, and it is widely agreed that it provides useful information on various educational systems (DFID, [Sa]; Schmidt & McKnight, 1998). South Africa's participation in all the TIMSS cycles, especially in mathematics, has shown low achievement relative to other participating countries (Howie, 2004). Some studies have focused on South Africa's participation and related factors (Howie, 2003; 2004; Leung, 2005; Wang, Osterlind & Bergin, 2012). However, little has been done to address the concerns raised by these results, as South Africa's performance remained lower than the international average in the 2011 TIMSS Grade 8 mathematics (DBE, 2014a).

A report on the 2011 TIMSS Grade 8 mathematics revealed a significant improvement in South Africa's achievement; however, South Africa was positioned forty-fourth out of forty-five participating countries. The situation is concerning considering that South Africa administered the Grade 8 items to Grade 9 learners, while the majority of countries used Grade 8 learners. South Africa scored an average of 352, well below the TIMSS centre point of 500 (Human Sciences Research Council [HSRC], 2013). These results indicate that South Africa continues to perform poorly in mathematics when benchmarked internationally.

Figure 2.1 illustrates learners' performance in relation to the economic status of their homes. In the 2011 TIMSS Grade 8, 32% of learners attended schools where more learners came from affluent homes, and these learners had the highest achievement. In contrast, 36% of learners attended schools where more learners came from disadvantaged homes, and these learners had the lowest achievement (International Association for the Evaluation of Educational Achievement [IEA], 2013). These results confirm that learners from affluent homes achieved better than learners from disadvantaged homes.

Figure 2.1: International averages for student economic background (IEA, 2013: 14)

In their study of four countries and their mathematics achievement levels in the TIMSS 2003, Wang et al. (2012) identified two categories of social contextual factors: school climate and social-familial influences. The results of this study show that, in South Africa, schools that were better resourced and well managed showed high mathematics achievement. Children of parents with higher educational levels also showed high achievement. These results were consistent across the four countries: South Africa, Singapore, the United States of America and Russia.

South Africa's performance in mathematics systemic assessments remains low compared to these countries (HSRC, 2013; DBE, 2013b). The most visible change in South Africa in recent years has been to the curricula: firstly C2005, then the RNCS, followed by the NCS and most recently the CAPS. This has happened despite warnings from researchers on systemic assessment, such as Leung (2005), that major changes should not be implemented in South Africa before identifying the factors that might have negatively affected teaching and learning.

A comparative study by Reddy (2006) explains that the TIMSS study requires a minimum of 150 schools in each participating country, with at least one whole class participating per school. South Africa had 225 schools that were randomly selected and stratified per province. The results of the 1995 TIMSS showed that South African learners achieved a national average of 275 points out of 800 in the mathematics test, while the international average was 487. The top province was the Western Cape with 381 points (still below the international average); second and third were Gauteng and the Northern Cape, both achieving 318 points; and Limpopo was the lowest with 226 points.
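For readers unfamiliar with stratified sampling, the per-province selection that Reddy (2006) describes can be sketched in a few lines of code. The snippet below is purely illustrative and is not the actual TIMSS sampling design: the school names, the per-province sample size and the sample_per_province helper are all hypothetical.

```python
import random

# Hypothetical school lists per province (names invented for illustration only).
schools_by_province = {
    "Western Cape": [f"WC school {i}" for i in range(1, 41)],
    "Gauteng": [f"GP school {i}" for i in range(1, 41)],
    "Limpopo": [f"LP school {i}" for i in range(1, 41)],
    # ... the remaining provinces would be listed in the same way.
}

def sample_per_province(schools_by_province, n_per_stratum, seed=0):
    """Randomly draw n_per_stratum schools from each province (stratum)."""
    rng = random.Random(seed)
    sample = {}
    for province, schools in schools_by_province.items():
        sample[province] = rng.sample(schools, min(n_per_stratum, len(schools)))
    return sample

# Draw, say, 25 schools per province; the real TIMSS sample sizes differ.
selected = sample_per_province(schools_by_province, n_per_stratum=25)
for province, schools in selected.items():
    print(province, len(schools))
```

Treating each province as a stratum in this way guarantees that every province is represented in the sample, which is the point of stratification in the TIMSS design described above.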

The results of review studies by Howie (2003; 2004) on both the 1995 and 1999 TIMSS revealed that South African learners performed worse than learners in other participating countries, both developing and developed. The studies also revealed the following results about South Africa's participation: (1) Learners with Afrikaans or English as a home language performed better than learners with an African home language. (2) Learners who believed they were mentally strong in mathematics performed better. (3) Learners in rural schools performed worse than learners in urban schools. (4) Learners whose teachers used traditional methods of teaching performed better than learners whose teachers used methods of the reformed curriculum. (5) Learners in large classes performed worse than those in small classes (large classes being those with an average of 50 learners per class).


A comparative study by Leung (2005), which focused on the possibility that mathematics achievement can be attributed to classroom practices, showed that South African learners performed worse in the 2003 TIMSS than learners in all other participating countries. The study revealed the following findings: (1) In high-achieving countries such as those in East Asia, learners did not enjoy mathematics, owing to traditional methods of teaching by qualified teachers who argued that their teaching presented clear and simple procedures for pedagogical and efficiency reasons, at the expense of rich mathematical concepts. (2) The quantitative results of the 2003 TIMSS video study showed a negative correlation between learner achievement and enjoyment of mathematics in East Asia. (3) The qualitative results showed advanced mathematics learning practices and relevant reasoning, without the compromises that could see more learners accessing mathematics at the expense of advanced mathematics. (4) South African learners scored well on enjoyment of and self-confidence in mathematics, although this showed no correlation with their achievement, which was low; if change is made, this positive attitude must be maintained. In his studies, Leung (2005; 2014) warned that the poor achievement by South African learners did not call for a total revamp of the education system, but rather for a focus on the diverse cultural factors that may have negatively affected teaching, learning and learners' achievements.
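The negative correlation reported in finding (2) can be illustrated with a short, purely hypothetical calculation; the enjoyment and achievement values below are invented and are not TIMSS data. The sketch merely shows how a Pearson correlation coefficient between country-level enjoyment and achievement scores would be computed.

```python
from math import sqrt

# Invented country-level values for illustration only (not TIMSS data):
# mean enjoyment-of-mathematics index and mean achievement score.
enjoyment   = [9.1, 8.7, 6.2, 5.8, 5.1]
achievement = [350, 380, 560, 590, 605]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(round(pearson_r(enjoyment, achievement), 2))  # close to -1: a strong negative correlation
```

A coefficient near -1, as in this invented example, is what a finding of higher enjoyment accompanying lower achievement would look like numerically.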

The observation of this researcher is that such results have been consistent across all studies on international systemic assessment. The steady increase in achievement in South Africa's 2011 Grade 9 TIMSS participation may be a sign of lessons learnt from previous participation; however, achievement still remains below the TIMSS international average (HSRC, 2013). A need for studies that look at international systemic assessment through a different lens is evident if one considers what these assessments were initially intended to do. Dunne et al. (2012) pointed out two valid reasons why systemic assessments are not effective in educational reform. The first point is:

“A fairly recent expectation is that the results of systemic assessment be made available to parents. This new access to information may be well intentioned, but the form of the information is problematic, precisely because the data from a single and necessarily limited instrument are so fragmentary and imprecise. Systemic assessment is generally not fine-grained enough to report to teachers, or parents, the results of individual learners, as if these single test performance results, ascertained from an instrument of about an hour's duration, are adequate summative insight into a year's progress in the classroom.” (Dunne et al., 2012: 3).

This is an indication that the structure currently used in systemic assessment is flawed and does not necessarily serve all of the initial intentions of systemic assessments. The second reason is illustrated below:

“On the basis of the systemic test score alone, a learner or parent is given a qualitative description that, however well intentioned, is simply arbitrary, invalid and possibly fraudulent, until other evidence justifies the descriptions offered. It is arguable that such descriptions are generally damaging, but especially when test design has not been informed at all by any criteria for item construction and selection that might relate to either the cut-points and the preferred 10% intervals or the objectives chosen.” (Dunne et al., 2012: 3).

In this instance, Dunne et al. (2012) clearly show that the information contained in systemic assessment is insufficient: it is aimed at reporting on national or international achievement, which is vast, and yet it is narrowed down to aggregated scores.

The need for a coherent means of dealing with information in systemic assessment is rather obvious. Kellaghan et al. (2009) argue that the disadvantages of using international assessments are that: (1) the test is used in more than one country; (2) its content may not be representative of the curriculum of any single country; and (3) it does not pay enough attention to the contexts of individual participating countries, and the technology used cannot adapt fully to the diverse local cultural and contextual educational complexities of all participating countries. This is an indication that, although South Africa has participated in international systemic assessments, there is still a need to conduct national systemic assessments that respond to the South African context.