
Numerical data: South African tertiary education

Factors contributing to success or failure

From the literature, a number of features arise that have been significant influences on learners' academic achievements. Fraser and Killen (2005) and, separately but evidently using the same methodology, Ngidi (2007), in studies conducted at three South African universities – one historically white, one historically black and one offering distance education – found a measure of agreement on factors contributing to students' academic success (see Appendix H). The top ten items identified by students and lecturers at the two contact universities had six factors in common, all of which had to do with motivation and application. While the remainder of the students' factors were related to similar aspects, those of the lecturers included one item related to cognitive skills, namely the ability to reason logically. General academic ability was ranked relatively low: 33rd by students and 29th by lecturers (out of 34 items).

Only three factors relating to failure were common to the polls of both the students and the lecturers, but students and lecturers ranked them differently. These factors appear to relate to the quantity or quality of students' application to their studies. The students included in their top ten items two aspects that might relate to cognitive ability: inability to perform well and inability to distinguish between important and unimportant information. The lecturers included two cognitive factors in their top ten items: failure to reach the required depth of understanding and inability to use higher order thinking skills. Lack of academic ability was ranked 36th by students and 16th by lecturers.

The results of these studies imply some disagreement between learners and teachers. Broader aspects of students' demographic background may be significant influences on their engagement with whatever pedagogy they are exposed to, and thus on their academic achievement.

Zeleza and Olukoshi (2004) commented on the greater proportion of young people in the populations of African countries and the concomitant problems of 'massification', with insufficient underpinning of learning and insufficient financial support in Africa's universities, which continue to suffer a 'brain drain' of their academic staff.

Massification is not a problem unique to the developing world, but lack of institutional support and loss of teaching staff are significant problems.

Medical education

In a Human Sciences Research Council (HSRC) study of the profession and professional education of doctors in South Africa (Breier & Wildschut, 2006), the observation was made that over the period 1999–2003, despite an increase in enrolment of medical students, there was a decrease in the number of graduations. Over the same period, enrolment of African and 'Coloured' students at the country's medical schools increased, but the success rate of Africans diminished nationally, although that of Coloureds increased (granted, off a small base). At UKZN, whose student demographics most nearly correspond to the national statistics, both Africans and Coloureds under-achieved, while Indians and Whites achieved better graduation rates compared to the country as a whole. The writers of the report made the point that, particularly for students from disadvantaged backgrounds, physical access to higher education does not necessarily equate to epistemological access.

Demographic analysis – NRMSM students

In this chapter I analyse numerical data related to factors that might be expected to influence perceptions and experiences of the pedagogy under scrutiny. For lack of a direct measure of students' engagement with PBL, I have used as a surrogate their test results over the first three years of the MBChB programme – the years in which PBL is employed in the form of 'problems' set in a clinical context that are dealt with by way of collaborative learning in small groups. The university's student record system30 and medical school records both contain information that sheds light – albeit indirectly – on learners' engagement with the pedagogy.

I analyse data on the 202 students who started in 1st year in 2007; of these, 182 passed through to 3rd year in 2009. The analysis of numerical and categorical data takes three forms:

1. Expression of relationships in visual form as graphs, plus statistical comparisons using a general linear model (GLM) in SPSS®

2. Students' and teachers' comments on graphs

3. Statistical analysis of learners' marks using SPSS® in the form of a generalised estimating equation (GEE)

The graphical and GLM analyses view students' achievements in the PBL milieu through different facets individually. The question arises: since all of these facets apply simultaneously to any given learner, what are the interactions between the different demographic factors? In order to explore these matters, the dataset was restructured so as to allow exploration of interrelationships; for this, the GEE was used.
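The restructuring described here is, in essence, a wide-to-long reshape: a record system typically holds one row per student with a column per test, whereas a repeated-measures model such as the GEE needs one row per student per test. A minimal sketch in Python/pandas on invented data (the column names and values are illustrative only, not those of the actual record system):

```python
import pandas as pd

# Hypothetical wide-format records: one row per student, one column per theme test
wide = pd.DataFrame({
    "student": ["S1", "S2"],
    "sex": ["F", "M"],
    "test_1": [62.0, 55.5],
    "test_2": [66.0, 58.0],
    "test_3": [64.5, 61.0],
})

# Reshape to long format: one row per student per test, so that repeated
# measures can be modelled within each subject
long = wide.melt(id_vars=["student", "sex"], var_name="test", value_name="mark")
long["test"] = long["test"].str.replace("test_", "", regex=False).astype(int)
print(long.sort_values(["student", "test"]).to_string(index=False))
```

Each student's demographic attributes (here only `sex`) are repeated on every row, which is what allows a model to relate the within-subject series of marks to between-subject factors.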

The GEE relates the dependent variable (theme test marks) to the factors (e.g. sex or 'race') and covariates (non-categorical features such as matric points score and student's age) that might influence it, incorporating each aspect as it is added to the calculation. Here, I am interested in influences over time on not one but several test marks. The GEE enables me to capture a wider view of each factor compared to other factors, and to see effects over time. Using the GEE function in SPSS®, I added factors in succession in order to see which of them would have an independent influence on students' test results. The GEE assumes that cases are dependent within subjects (e.g. an individual student over all 18 tests) and independent between subjects (each student is distinct from all the others as far as assessment marks are concerned). The procedure estimates correlations to account for measures that are mutually dependent within each subject.

Three caveats arise, relevant to the use of these modes of analysis (1, 2 and 3 above):

• Graphical representation and statistical testing of this cohort's test results according to various criteria appear to indicate certain relationships. One could surmise intuitively that some of these relationships are interlinked, and when they are subjected to GEE analysis – which tests for independent influences on test results – it can be seen that some factors fall away, since they are in fact dependent upon other factors.

• Despite the complexity of the numerical analysis, the statistical procedures can describe only which factors were significant; I thus turn again to my respondents to establish why these factors might have been influential.

• Since the students interviewed had not completed their 3rd year when I interviewed them, the complete set of their own test results was not available at that time. The graphs I used as stimulus for comment on numerical and demographic relationships were those from a study on a previous cohort of students31, and thus the numerical values and their relationships were not those of the cohort under study32. However, I believe that the comments made are still relevant.

31 Unpublished audit, 2004

Bearing these caveats in mind, I present the descriptive statistics, graphical relationships, respondents' comments, and results of the multifactorial GEE analysis for the ten demographic factors that I explored. The factors are presented in order of their relative influence according to the GEE analysis33.

High school

The high schools previously attended by 127 of the initial cohort of 202 students could be classified according to their quintiles34. There were 9 students from quintile 1 (Q1) schools, 4 from Q2, 8 from Q3, 13 from Q4 and 93 from Q535. For the purpose of comparison, I have added, oxymoronically, a 'sixth quintile': data on the 21 students from independent (i.e. non-state) schools.

I am assuming that the government's categorisation of schools into socioeconomically based quintiles provides an index of the quality and quantity of the resources available to those schools. I am aware that assigning a particular school to a particular quintile does not automatically imply that the school is equivalent to all other schools in that quintile; indeed, it has been shown that some schools in straitened circumstances can deliver good quality teaching while others in similar positions cannot (Christie, Butler, & Potterton, 2007; Chutgar & Kanjee, 2009).

32 The graphs used as stimuli for discussion are displayed in Appendix E.

33 The table showing the results of the GEE analysis, the relative weightings, and the degree of statistical significance of the different factors forms Appendix I.

34 An indication of the socioeconomic status of the community surrounding the school – used by the government in calculating differential funding of schools based on "income, unemployment rates and the level of education of the community" (http://www.create-rpc.org/pdf_documents/Policy_Brief_7.pdf). Thus Q1 schools in the lowest socio-economic communities receive more funding per capita than Q2, etc. Q5 schools, however, probably all levy substantial contributions from scholars' parents, and 'Q6' schools generally rely entirely on parents' contributions.

The graph below (Figure 5.1a) confirms the statistics, showing that students from Q1 high schools do significantly (p < 0.001) worse than the rest (in the GLM analysis), which are indistinguishable from one another. Two interesting observations may (tentatively) be made. Firstly, having attended an independent (private) school does not confer on one a particular academic advantage, possibly because resources at such schools are used for extracurricular activities as well as for directly academic pursuits. Secondly, although Q2 schools are not statistically distinct from Q3 to Q6 schools, the Q2 line on the graph tends to lie above the others. (There were only four Q2 students and these might simply have been exceptional. All were Zulu-speaking females, from different schools; two had prior experience in tertiary education – see below for possible significance.)
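The GLM comparison reported here (Q1 significantly below the rest at p < 0.001) was done in SPSS®; for illustration only, an equivalent one-way general linear model can be sketched in Python with statsmodels. The group sizes match those given above, but the marks are simulated, so the numbers below are not the cohort's actual results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated aggregate mark per student, using the group sizes from the text
# (9, 4, 8, 13, 93 students in Q1-Q5, plus 21 from independent schools, "Q6")
rng = np.random.default_rng(1)
quintiles = (["Q1"] * 9 + ["Q2"] * 4 + ["Q3"] * 8 +
             ["Q4"] * 13 + ["Q5"] * 93 + ["Q6"] * 21)
means = {"Q1": 52, "Q2": 65, "Q3": 60, "Q4": 60, "Q5": 61, "Q6": 60}
df = pd.DataFrame({"quintile": quintiles})
df["mark"] = [rng.normal(means[q], 6.0) for q in df["quintile"]]

# One-way GLM: does mean mark differ between school quintiles?
model = smf.ols("mark ~ C(quintile)", data=df).fit()
print(anova_lm(model))
```

The F test only establishes that some difference between quintiles exists; the specific finding that Q1 differs while Q2–Q6 are indistinguishable would come from pairwise (post hoc) comparisons.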

Figure 5.1a Aggregate test marks over three years according to students' high school of origin, classified by quintile. [Line graph: y-axis, mean group marks (%), 30–80; x-axis, tests 1.1–3.6 over three years; Q1–Q6, quintiles as described in the text.]