CHAPTER FIVE

FINDINGS

INTRODUCTION

In this chapter I report on the findings of the field work in the two sites, UND and MLST. I open this chapter with an explanation of terms used so that their meaning or use is clear and unambiguous for the reader. In the first section of this chapter I describe how I worked with the data within an action research study. In this section, I explain how the direction of my research changed from its initial perspective. I then go on to analyse the questionnaires and the focus group discussions, focusing in particular on the themes and categories with which I worked. In the next section, I discuss the group oral assessments. The weighting of marks for the mixed-mode of assessment is reviewed before the penultimate section of the chapter which highlights the participants' recommendations for improving or enhancing the design and administration of the oral and the mixed-mode assessments. The final section summarises my findings.

Explanation of Terms Used

Please note the following clarifications regarding the use of UND and MLST, EFL and ESL, and pseudonyms. UND refers to the first round of assessments, conducted at the University of Natal, Durban (now the University of KwaZulu-Natal), and MLST refers to the second round of assessments, which were conducted at the ML Sultan Technikon, Durban (now the Durban Institute of Technology), after adjustments were made to the procedure.

The classifications EFL and ESL were determined from the demographic information on the questionnaires and from the respondents themselves. All the participants in my project were adults who were aware of their proficiency in the medium of English and in the other languages they speak. Students and assessors classified themselves as either EFL or ESL; no proficiency testing was therefore done to confirm or deny these classifications.

Pseudonyms have been used only in cases where participants had to be identified or referred to individually. Pseudonyms are used to protect the identity of participants discussed or quoted in this chapter.

RESEARCH METHOD AND DIRECTION

As in most qualitative research, my research was based not on a fixed theoretical hypothesis, but on "the recognition that existing practice" in assessment "falls short of aspirations" (McNiff 1988: 74). As such, my study followed an action-reflection methodology (McNiff 1988: 73) of identifying a problem, imagining a solution, implementing the solution, observing the effects, evaluating the outcomes, modifying actions and ideas in light of the evaluation, [and] re-planning for the next action step.

"In action research study it is important to keep things in perspective and to remember that plans may well change as other issues are unearthed. Certainly the perspective may well alter" (McNiff 1988: 82). So, although I began my research with guiding questions the direction I eventually took came about as a result of working with the data, the assessment sessions and feedback from the participants. Hammersley and Atkinson (1990: 24) support such a change in their comment that "the strategy and even direction of the research can be changed relatively easily, in line with changing assessments of what is required by the process". Consequently, this chapter discusses issues surrounding the oral assessments conducted and abandons discussion or analyses of the questions that dealt with issues which later became redundant to my study, for obvious reasons. My perspective on the weighting of the oral and written marks was certainly "altered". Although the questionnaires and the focus group discussions (and subsequently the analyses thereof in this chapter) made reference to a 50/50 split in the oral/written marks, the perspective that emerged after the assessments and the collaboration with the participants led to a different understanding on the calculation of marks as explained later in this chapter (under the heading: weighting of marks).

In keeping with the mixed-mode of assessment, participant responses were sought in writing (via questionnaires) and orally (via the focus group discussions). The questionnaires provided me with biographical data as well as candidates' perceptions of the assessments conducted. The focus group discussions allowed me to discuss issues related to the assessment sessions with the participants more candidly and in greater depth.

The purpose of seeking these responses was to obtain feedback that would enable me to enhance, and to make the necessary changes to, the design and administration of the assessments. Because the questionnaire contained some open-ended questions and other questions which required participants to substantiate their responses, and because the focus group responses were by their very nature subjective, the data gathered was content-analysed. Content analysis entails "summarizing, standardizing and comparing" (Smith 1975: 147). Verma and Bagley (1975: 247) caution that whenever results have to be content-analysed, there "is a possible source of bias". Responses in this study are therefore presented verbatim, without any additions or deletions by the assessors or myself. These verbatim responses are presented in italics so that they may be easily distinguished as direct quotes from the respondents.

QUESTIONNAIRE AND FOCUS GROUP ANALYSES: ASSESSMENTS

Of the one hundred and sixty-one students surveyed at UND, only seventy-eight unspoilt questionnaires were received; sixty-five unspoilt questionnaires were received from MLST. The number of returns was also affected by the fact that some students did oral assessments twice (that is, they did oral assessments for more than one subject) but answered the questionnaire only once. The students at UND were asked to complete the questionnaires immediately after their oral assessment session: they had to fill in the questionnaire and deposit it into a box provided before they left the exam room. Analysis of these questionnaires revealed largely negative sentiments about the students' state of mind.

Students were unaware of how they had performed in the oral and were in a state of anxiety about their performance and their marks (see Alpert and Haber 1960). As one student said (in the focus group discussion), "we had just finished the oral and we still had a very nervous feeling in our throat". To which another added, "yes, we didn't even know how we did. We didn't want to say it went well, in case we didn't do well, then we would look really foolish".

For the MLST assessments, then, students were asked to complete the questionnaires and to hand them in to their lecturers within one week of the assessment, giving them time to reflect calmly on their experience.
