
BACKGROUND AND RATIONALE FOR STUDY

5.5 Research Instruments


are referred to as Disciplinary Specialists, abbreviated as DSs50. In this study, the RPs who taught the academic literacy module are referred to as Academic Literacy Specialists, abbreviated as ALSs51. The RPs’ academic qualifications were in one of the following fields of tertiary study: pure sciences, social sciences, education and English literature. To ensure participant anonymity, I assigned a pseudonym to each RP selected for this study. Since all the RPs taught one of the modules offered in the FP, I was keen to identify similarities and differences among their views in response to the research problem and the research questions.


In qualitative research, each idea, interpretation, and plan is filtered through your eyes, through your mind, and through your point of view ... You will take the role of constructing and subsequently interpreting the reality of the person being interviewed, but your own lens is critical ... You, as the researcher, serve as the filter through which information is gathered, processed, and organized (117).

Conceptualising the interview is particularly relevant in this study, especially in the way it relates to the interpretive paradigm, which involves an understanding of the subjective world of human experience. Semi-structured interviews were used as an instrument in this study in order to gather detailed data from the RPs “to understand beliefs, perceptions or accounts of the research focus” (de Vos et al. 2005: 296), viz. the acquisition of discipline-specific literacies in science in the FP. After I had decided that the semi-structured interview was a suitable research instrument that would yield the data needed to understand the phenomenon, I formulated a set of questions that I intended to pose to the RPs (the interviewees) (see Appendix 3 for the questions posed to the RPs).

I then piloted the questions among colleagues, many of whom were themselves involved in postgraduate research. The main purpose of piloting was to assess whether the questions would yield the kind of data required for this study and to ensure that they were clear, unambiguous, valid and relevant.

Amendments were made to questions that contained specific terminology, e.g. ‘literacies’ and ‘genres’, which was clear to me but opaque to others; concepts peculiar to this study needed to be clarified in the interview. Questions that conveyed bias or could be construed as leading, especially where they might influence the RPs’ responses, had to be rephrased.52

I interviewed all the RPs myself, mainly at a time and on a day convenient to them, so that they were not unduly inconvenienced in their other activities. By personal choice, all the interviews were conducted in the RPs’ offices at UKZN; the purpose was to ensure that they were comfortable in their own personal space. Before commencing each interview, I informed the RP of its purpose and of the need to have it recorded. With the RPs’ consent, the interviews were recorded by means of a digital voice recorder, which enabled verbatim transcription of the verbal data.

52 Examples of initial questions that had to be rephrased: the following question was changed as it conveyed some degree of bias: “As an ALS, do you think that the DSs expect the teaching of literacies needed for science to be your task?” Questions that did not allow for elaboration, such as “Are you content with the changes made to the module?”, were rephrased, e.g. to “What change/s were made to the module?” In other instances, follow-up questions were included to allow further elaboration, e.g. “Why was/were this/these change/s made?” or “Have this/these change/s been effective?”

An added advantage of conducting the interviews myself was that I became thoroughly familiar with the data. The interview process was time-consuming, and interviews had to be scheduled well in advance in consideration of the RPs’ own teaching and research commitments. The interviews lasted from forty minutes to an hour, which meant an enormous amount of time spent transcribing the verbal data.

The interview questions guided, rather than dictated, the way in which data were collected. I tried to keep a logical sequence in the interview by arranging questions from the simple to the complex, and from the broad to the more specific. Participants were also given the opportunity to provide further data that the interview questions might have failed to elicit. I wanted to make the interview process flexible and exploratory so that the RPs would feel more comfortable and less pressured, and I also intended to give them an opportunity to determine how the interview proceeded. This also cast me, as a researcher, in a less controlling light.

The interview schedule (Appendix 3) included open-ended questions and, in some instances, I had to probe the RPs to tease out meanings and clarify certain issues, which meant that I had to be particularly attentive and listen actively. Where the RPs were not particularly articulate or forthcoming with data, I had to rely on the strategy of “probes” to “stimulate a respondent to produce more information” (Bernard, 2000: 196).

Probing enables the interviewer to ask respondents to “extend, elaborate, add to, provide detail for, clarify or qualify their response thereby addressing richness, depth of response, comprehensiveness and honesty” (Cohen et al. 2011: 420), qualities that characterize successful interviewing. For example, when I explored whether the DSs had any notion of the nature of SCOM and its role in the FP, especially in respect of science discipline-specific literacies, one of the DSs dismissed the question by stating that he had absolutely no understanding of the nature and/or efficacy of SCOM. Through probing, I was able to elicit a verbal response, specifically in respect of the existence and nature of interdisciplinary engagement within the FP. I also used the technique of ‘funnelling’ (de Vos et al. 2005: 297), especially when I had to elicit not only the RP’s general views but also his/her response to more specific concerns. In this study, for example, I explored the RPs’ views of the rationale of the FP in general and of the students they taught and, more specifically, the way/s in which the discipline that he or she taught satisfied the philosophy of the FP and the student profile in the FP.

Observation was another research instrument used to collect data in this study.53 This entailed direct, overt observation that I personally undertook at lectures, tutorials, laboratory practicals and field trips. Once again, sole observation by the researcher allowed for consistency in terms of the purposes of such observation. My role in each of these natural settings was made clear to the FP students: I was undertaking research at higher education level. The appropriate ethical requirements for students’ consent to being observed had been met54. In the initial observation sessions of the lectures, tutorials and laboratory practicals that I attended, my presence was quite noticeable to the students, many of whom constantly glanced in my direction as the lessons proceeded. Over time, in fact by the third observation session, I had become a familiar face within the context of the science ‘classes’ and my presence was more easily accepted.

Observation, as a research instrument, yielded data within an interactional setting in which the actions and behaviour of the RPs in context were easily accessible and available. Data from observation were merged with those obtained through the semi-structured interviews and from documentary evidence. In relation to this study, observation of lessons, tutorials, laboratory practicals and field trips was a necessary facet of the study. It was through observation in these natural settings that I was able to collect data on the critical research questions formulated for this study: the perceived challenges that emerge in the use of the language of science and the discipline-specific literacies in the foundation modules in the foundation programme; and whether the DSs assisted the FP students in the acquisition of the discipline-specific literacies required for science discourse. I looked for evidence of a focus (or the absence thereof) on discipline-specific literacies in science and the ways in which the DSs, in particular, alerted the FP students to the language in science.

53 See Appendix 4 for a sample of an observation schedule.

54 See Appendix 5 for Student Consent to Participate in Research.


I used semi-structured observation to gather data for this study. This type of observation is “hypothesis-generating; the researcher has an agenda of issues but will gather data to illuminate these issues in a far less predetermined or systematic manner” (Cohen et al. 2011: 457). I found that with this type of observation, I had to gather key elements as they emerged or flowed from the situation, which I later linked to or compared with other elements that emerged from the other research instruments used (viz. semi-structured interviews and documentary evidence). For Cohen et al. (2001), this enables “the elements of the situation [to] speak for themselves” (305). Observation necessitated comprehensive field notes, which formed part of my memo writing, the purpose of which has already been outlined earlier in this Chapter.

On the advantages of observation, Cohen et al. (2001) note that “it affords the researcher the opportunity to gather ‘live’ data from ‘live’ situations ... in situ rather than at second hand” (305). Observation also allows the researcher “to see things that might otherwise be unconsciously missed and to move beyond perception-based data (e.g. opinions in interviews)” (Cohen et al. 2011: 456). On the significance of observation and its contribution to credibility, Conrad and Serlin (2006) state: “Observation can be an important part of the empirical process of triangulating what people say they do (as in interviews) and what they actually do (as in their observation of their behaviour)” (381). This means matching the responses given in interviews to observed behaviour or, what Heck (2006) calls, the provision of “evidence about the extent to which something is in fact being implemented” (381). Furthermore, “observational data enables the researcher to discover things that participants might not freely talk about in interview situations” (Cohen et al. 2011: 456). These points tie up with the earlier reference to how triangulation can contribute to ensuring the validity and reliability of findings.

A limitation of observation was the difficulty of simultaneously making comprehensive field notes and actively observing. Another limitation is “reactivity where participants may change their behaviour if they know that they are being watched” (Cohen et al. 2001: 305). Although I had no reliable means of controlling or avoiding this, I attempted to encourage natural behaviour by asking the RPs to ‘pretend I was not there’. Having informed the RPs of my intention to observe, and having had them voluntarily sign consent55 to allow me to do so, I sometimes slipped unobtrusively in and out of the lecture/laboratory venue.

The third instrument for data collection used in this study was documentary evidence. In the context of this study, primary documentary evidence comprised the discipline-based course manuals used in the foundation modules in science and SCOM offered within the FP; the FP students’ laboratory practical workbooks, laboratory reports and field reports; and test scripts. In addition, monthly discipline-based reports compiled by course co-ordinators (which offered comments on discipline-based issues in the FP) were read and, where deemed necessary, incorporated into the study.

In this study, I examined the module course manuals in the FP to ascertain how the course content of the modules had been amended. These had to be read in conjunction with the RPs’ comments on any amendments made to the discipline-based course manuals. The course content yielded data with regard to the language of science, viz. scientific discourse, science genres and the scientific literacy required for the pursuit of tertiary studies in science, particularly for students from educationally disadvantaged schooling backgrounds for whom the LoLT is an additional language. The profile of students enrolled in the FP in science at UKZN has already been presented in Chapter 2 of this study.

In analysing students’ writing and answers based on their work (i.e. documentary evidence), I looked at the discipline-specific literacies in science and the language of science that might have compromised students’ understanding of science, and thus their participation in science discourse. Using inductive data analysis, I looked specifically at the elements, patterns and categories involving discipline-specific literacies (as a major criterion) that emerged from the written work, and compared and contrasted them with the data on discipline-specific literacies that had emanated from the semi-structured interviews and observation sessions. This was a type of diagnostic analysis that set out to explore the particular strengths and perceived challenges with the science discipline-specific literacies that might have featured in students’ work. In doing so, I also had to explore the nature of the responses from the DSs with regard to the students’ use of the discipline-specific literacies needed to learn science. Written tasks accompanied by a rubric or marking criteria/guide were examined for evidence of any focus on the assessment of language usage in science (see Appendix 7). Documentary evidence formed part of the triangulation of data with the other two forms of data collection, i.e. the semi-structured interviews and observation, thus contributing to validity. This enabled me to gain a holistic picture of the research problem.

55 See Appendix 6 for Consent Form for DSs.

Documentary evidence required careful analysis and interpretation. In this study, data collection from documentary evidence was labour-intensive: I spent many hours scanning and/or photocopying volumes of students’ work, and all the documentary evidence used in this study required me to acquire the corresponding memoranda. I had to work at a feverish pace to meet specific deadlines, especially those for each of the science modules in the FP. For example, laboratory practical workbooks were used each week and had to be returned promptly, as did tests, which had to be reviewed within a week of being written and marked. Access to students’ work was also dependent on the students’ own personal and academic schedules.

The use of multiple data sources in this study yielded copious data, which I analysed inductively. This meant making comparisons within the data of the RPs’ views, actions, responses, accounts and experiences. I coded the emerging data as I collected them in order to gain a firmer understanding of the research problem. Cohen et al. (2011) define a code as “simply a name or label that the researcher gives to a piece of text that contains an idea or a piece of information” (559). Strauss and Corbin (1990) define coding as “the process of breaking down segments of text data into smaller units (based on whatever criteria are relevant), and then examining, comparing, conceptualizing and categorizing the data” (61). Strauss and Corbin (1990) divide coding into three stages: open coding, axial coding and selective coding56.

I began the data analysis by first implementing open coding, whereby I read the text/s “reflectively to identify relevant categories” (Strauss and Corbin, 1990: 69). With this type of coding, I went through the text in stages: line-by-line, phrase-by-phrase, sentence-by-sentence and paragraph-by-paragraph. Open coding generated categories.

56 This type of coding is commonly used in Grounded Theory (Glaser and Strauss, 1967). I have used the elements of open coding, axial coding and selective coding here as a way of assisting with the categorising of the data gathered in this study. For information on this type of coding, see Cohen et al. 2011: 559-563 and Charmaz, 2000.


Open coding enabled me to “remain attuned to [the] [participants’] views of their realities” (Charmaz, 2000: 515), letting me make meaning of the data by asking questions such as “What is this about?” or “What is being referred to here?” I did this by grouping and naming the general categories that arose from the data, e.g. ‘the issue of schooling’.

Thereafter, I implemented axial coding, which meant that the categories had to be “refined, developed and interconnected” (Strauss and Corbin, 1990: 69). With this type of coding, the general categories that had emerged from open coding were recombined into larger categories; I had to look for connections and interconnections between categories. For example, to the open category of ‘students’ schooling’ I linked the categories of ‘teaching methodology’ and ‘learning strategies’. The analysis was thus becoming more interpretive. I then worked through the stage of selective coding, which Strauss and Corbin (1990) define as “identifying the core category around which all the other categories that have been identified and created are integrated” (116), in order to acquire a deep understanding of what they identify as “the main story line” (117). Interpretation of the data enabled me to find patterns and themes, providing the evidence needed to draw conclusions with regard to the research focus of the study.
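Although the coding in this study was carried out manually on the interview transcripts, the three stages can be represented schematically. The sketch below is purely illustrative and forms no part of the study’s actual method or data: the segment texts, code labels and category names are all invented for the example, and the “core category” is chosen here by a simple count, a deliberate simplification of the interpretive judgement described above.

```python
# Illustrative sketch of the three coding stages (open, axial, selective).
# All texts, codes and category names below are hypothetical examples.
from collections import defaultdict

# Units of text, each assigned an open code by the researcher.
segments = {
    "Learners came from under-resourced schools.": "schooling",
    "Lecturers relied on rote demonstration.": "teaching methodology",
    "Students memorised definitions for tests.": "learning strategies",
}

def open_coding(segments):
    """Open coding: group text units under the codes assigned to them."""
    categories = defaultdict(list)
    for text, code in segments.items():
        categories[code].append(text)
    return dict(categories)

# Axial coding: the researcher's mapping of related open categories
# onto a broader, interconnecting category.
axial_map = {
    "students' schooling": ["schooling", "teaching methodology",
                            "learning strategies"],
}

def axial_coding(categories, axial_map):
    """Axial coding: recombine open categories into larger categories."""
    return {
        broad: {c: categories[c] for c in related if c in categories}
        for broad, related in axial_map.items()
    }

def selective_coding(axial):
    """Selective coding (crudely approximated): pick the category that
    integrates the most coded segments as the 'core category'."""
    return max(axial, key=lambda k: sum(len(v) for v in axial[k].values()))

open_cats = open_coding(segments)
axial_cats = axial_coding(open_cats, axial_map)
core = selective_coding(axial_cats)
print(core)  # prints: students' schooling
```

In practice, of course, the movement between these stages was iterative and interpretive rather than mechanical; the sketch only shows how open codes feed axial categories, which in turn feed the selection of a core category.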

All three instruments of data collection used in this study were essential in answering the critical questions of the study, which focused on the discipline-specific literacies required for science, viz. reading, writing, talking and doing science. These research instruments also assisted in isolating any perceived challenges that these literacies could present, especially for the students who formed the FP cohort, i.e. students from disadvantaged schooling backgrounds (who gained entry into higher education through access programmes) and for whom English, the LoLT at UKZN, is not their home language. (At the time of data collection for this study, the student distribution by home language in the FP was: isiZulu: 86.04%; Xhosa: 5.81%; Swati: 2.32%; English: 1.93%; Sotho: 1.93%; Tsonga: 0.77%; Other Black Language: 0.77%; Ndebele: 0.38%.) Another significant component of the research problem explored was the ways in which discipline-specific literacies are conveyed in the foundation modules in science in the FP, i.e. biology, chemistry, mathematics and physics.
