

5.9 Working with data

5.9.1 Thematic Analysis

Braun and Clarke (2006) argue that although thematic analysis is widely used in analysing qualitative data, it is poorly demarcated and rarely acknowledged as a method in its own right. In fact, it is debatable whether thematic analysis is a method on its own or not (Boyatzis, 1998; Braun & Clarke, 2006; Roulston, 2001). Scholars who do not recognise it as a method mainly argue that thematic analysis merely provides core skills, such as “thematising meanings” (Holloway & Todres, 2003, p. 347) and the process of thematic coding (Ryan & Bernard, 2000). As such, thematic analysis can be used in many forms of qualitative data analysis and is considered the foundation for qualitative methods that search for patterns or themes, such as conversation analysis, interpretative phenomenological analysis, discourse analysis, and narrative analysis (Braun & Clarke, 2006). On this view, it cannot be regarded as a method in its own right (Holloway & Todres, 2003; Ryan & Bernard, 2000).

While Ryan and Bernard (2000) and Holloway and Todres (2003) do not regard thematic analysis as a method, other scholars argue that it “should be considered as a method in its own right” (Braun & Clarke, 2006, p. 78). They observe that the only difference between thematic analysis and other qualitative methods is that these other methods stem from or are tied to a particular epistemological or theoretical position, while thematic analysis is not (Braun & Clarke, 2006). Despite that difference, thematic analysis has a clear theory and procedure for analysing qualitative data. As such, it can and should be considered a method of qualitative data analysis (Braun & Clarke, 2006). Seidel (1998) contends that qualitative data analysis is as much an art as it is a science, since the data generated have to respond to the research question and be presented in a way that makes sense. Data analysis is therefore not simply a matter of revealing structures and patterns but a creative and personal process, guided by a rigorous analytical procedure (Seidel, 1998).

Thematic analysis is also described as “an inductive, thematic analysis” (Roulston et al., 2003): it is inductive because themes have to be generated from the data. In this study, themes have been generated from women teachers’ experiences of teaching sexuality education in rural schools. In conducting the inductive thematic analysis, I rigorously followed the analytical procedure proposed by Braun and Clarke (2006) and Marshall and Rossman (2006). As noted in the debate surrounding thematic analysis, these phases are generic to all qualitative data analysis (Holloway & Todres, 2003; Ryan & Bernard, 2000). Whether or not thematic analysis is acknowledged as a method in its own right therefore does not affect the procedure, and it is this generic procedure that I followed in conducting the thematic analysis of the study’s qualitative data.

5.9.1.1 Familiarising myself with data

There are five main “field texts” (Clandinin & Connelly, 1994, p. 418) used in this study. These are written data from the drawings, the photo-voice stories, and the field notes; transcripts of the audio-taped focus group discussions with the participants; and written data from the memory work exercise. Ely et al. (1991) argue that having audio-taped and transcribed conversations allows researchers to reflect on events and experiences so that they can supplement the details. Riessman (2002) also observes that the process of transcription is an excellent way to begin familiarising oneself with the data. Commenting on the importance of transcribing as an aspect of thematic analysis, Bird (2005, p. 227) argues that transcription should be regarded as “a key phase of data analysis”. It is further argued that transcribing interviews is an interpretative act in which meanings are created, and that it involves making decisions about correcting the language during translation, thus changing the voice and expression of the participant (Braun & Clarke, 2006; Crotty, 1998; Gay, 1992; Lapadat & Lindsay, 1999).

After transcribing and typing the data, I edited the transcripts by checking them against the recorded tapes. The aim was to ensure that the transcripts retained the information from the verbal accounts of the interviews in a way that was true to the original accounts. Transcribing and typing the memory work, drawings and photo-voice data were important tasks in familiarising myself with the data and helped me gain a deeper understanding of it (Braun & Clarke, 2006). Following the transcribing and typing, another aspect of familiarising myself with the data was what Marshall and Rossman (2006) describe as immersion in the data, which involved reading and re-reading the data. Thus, I read and re-read the data corpus, consisting of the transcriptions, the typed copies of the drawings and photo-voice data, and the memory accounts. This enabled me to become intimately familiar with the data. The data sets were taken back to the participants for member checking, to ensure that they were comfortable with what was recorded and to allow them to change their statements if they felt the need. None of the women teachers changed their documented statements.

5.9.1.2 Generating Codes

After familiarising myself with the qualitative data, I moved to the next phase: data coding. Coding involves transforming raw data for the purposes of analysis. Drawing on my familiarisation with the data, I started with open-coding. This is described as manifest content analysis (Gay, 1992; Sarantakos, 2005), in which the data are opened up for ideas, themes, categories, or patterns emerging from the manifest content. Open-coding is conducted “to identify first-order concepts and substantive codes” (Sarantakos, 2005, p. 349). I coded the data by using highlighters and writing notes in and on the margins of the text to mark ideas. Coding involved reading and re-reading, coding and re-coding the data. This was done to identify segments of the data that reflected ideas about the understandings and experiences of teaching sexuality education in rural Lesotho schools in the age of HIV and AIDS.

The importance of generating codes as an aspect of thematic analysis is acknowledged by Braun and Clarke (2006), who observe that codes identify data features that are interesting to the analyst. In addition, Boyatzis (1998, p. 63) describes codes as “the most basic segment, or element, of the raw data or information that can be assessed in a meaningful way regarding the phenomenon.” Miles and Huberman (1994) acknowledge coding as an important part of data analysis, whilst Tuckett (2005) argues that coding helps the analyst to organise the data into meaningful groups. Coding is thus a critical aspect of thematic analysis, since it ultimately leads to the development of themes in the next phase of the analysis (Braun & Clarke, 2006).

Commenting on codes, Seidel (1998, p. 14) differentiates between codes as “heuristic tools and codes as objectivist, transparent representations of facts.” Heuristic codes are used as tools to smooth the progress of further investigation of the data. As Seidel (1998, p. 14) observes, in “a heuristic approach, code words are primarily flags or signposts that point to things in the data.” Objectivist codes, on the other hand, are condensed representations “of the facts described in the data and can be treated as surrogates for the text, and the analysis can focus on the codes instead of the text itself” (Seidel, 1998, p. 14). In this study, codes have been used as heuristic tools pointing to the understandings and experiences of teaching sexuality education in rural schools. The codes have thus helped me to organise the data and to develop themes from it.

Braun and Clarke (2006, p. 89) also discuss two types of coding, namely data-driven coding and theory-driven coding. The former leads to the development of themes that “depend on the data” and can also be described as inductive or grounded coding. Theory-driven coding is done “with specific questions in mind that you wish to code around” (Braun & Clarke, 2006, p. 89), and can be described as deductive, theoretical, or a priori coding. The coding employed in this study was data-driven, inductive or grounded coding, in the sense that the codes were generated from, and not imposed on, the data.

It can be argued that coding was one phase of the research process where my role as researcher, analyst, co-producer and manipulator of knowledge was critical, because what was coded as an interesting feature of the data depended on my personal interests and creativity. The data set was coded inductively, by generating codes that had relevance to the key research questions within my subjective and bounded perspective. Thus, the same data could have been coded differently by different analysts (Bruner, 1996; Eisner, 1997; Hamilton & Pinnegar, 2006; Lyons & LaBoskey, 2002; Richardson, 2003; van Manen, 1990). Clandinin and Connelly (2000) therefore warn that researchers should be mindful that how they conceive and enact their roles will influence the research process. They maintain that researchers must strive to be open and self-reflective about their roles when conducting research and composing research texts.

Firstly, I tried to code the data for as many potentially interesting features as possible. Secondly, I coded the data inclusively, keeping a little of the relevant surrounding data as a way of remaining true to the context of the data and avoiding the common criticism that coding loses the context (Bryman, 2001; Gay, 1992). Thirdly, data extracts were coded in as many different ways as possible, so that some parts of the text remained un-coded while others were coded several times. At the end of this phase, all data extracts were pulled together within each code.

5.9.1.3 Generating Initial Themes

Generating themes can be likened to what Sarantakos (2005, p. 350) describes as the second level of coding, “axial coding”. At this level, the codes generated under open-coding were interconnected to construct higher-order concepts called themes. Sarantakos (2005) also describes this phase as theoretical coding or latent content analysis. He considers it a more advanced level of coding than open-coding, since it involves interconnecting “first-order concepts to construct higher-order concepts” (Sarantakos, 2005, p. 350). Whilst open-coding merely opens the data to theoretical possibilities, axial-coding finds relationships between the first-order codes in order to reach a higher level of abstraction. In this study, this task corresponds to generating initial themes, since it involves identifying relationships between and among the generated codes in order to arrive at themes on the social phenomenon being studied.

The generation of initial themes was based on the generated codes, which led to the development of a thematic sketch of the experiences of teaching sexuality education in rural schools. Leininger (1985, p. 60) argues that themes can be identified through “bringing together components or fragments of ideas or experiences, which often are meaningless when viewed alone.” In keeping with my key research question, I engaged with the codes to generate themes reflecting women teachers’ experiences of teaching sexuality education in rural schools.

Patton (2002, pp. 457-458) differentiates between themes as “indigenous typologies” and themes as “analyst-constructed typologies”. Indigenous typologies are those themes created, expressed and used by the research participants, whilst analyst-constructed typologies are themes created by the researcher and grounded in the data, but not necessarily used by the research participants themselves (Patton, 2002). Using the notion of themes as analyst-constructed typologies, I constructed an initial thematic map of women teachers’ experiences of teaching sexuality education in rural schools.

Patton (2002), however, warns that the use of analyst-constructed typologies runs the risk of imposing a world of meaning on the participants that reflects the analyst’s world better than that of the research participants. To mitigate this limitation, for each theme I used data extracts with enough detail to remain true to the context of the study and the perspectives of the research participants. As Terre-Blanche et al. (2006, p. 321) point out, the “key to doing a good interpretive analysis is to stay close to the data, to interpret it from a position of empathic understanding.”

5.9.1.4 Reviewing Themes

Reviewing themes involves the refinement of the initial thematic map (Braun & Clarke, 2006). Through this task, some of the themes generated for the initial thematic map could no longer stand as main themes and had to be collapsed into sub-themes. The guiding principles followed here were Patton’s (2002) twin constructs of ‘internal homogeneity’ and ‘external heterogeneity’ of themes. These constructs respectively denote that data “within the themes should cohere together meaningfully, while there should be identifiable distinctions between themes” (Braun & Clarke, 2006, p. 91).

I reviewed the themes at two levels. First, I considered the validity of each theme in relation to my data; second, I checked whether the generated thematic framework accurately reflected “the meanings evident in the data set as a whole” (Braun & Clarke, 2006, p. 91). This was done by re-reading my entire data set to check for coherence between the themes and the data sets and to identify any data relevant to the themes that might have been missed during the earlier coding phase.

5.9.1.5 Defining and Naming Themes

After generating and reviewing a satisfactory list of themes from the data, I defined and named them in the form in which they would be presented as research findings. I also analysed the data within the themes to ensure the internal homogeneity and external heterogeneity of the themes. Sarantakos (2005) describes this refining of themes as selective coding, which denotes the selection of higher-order themes with theoretical saturation and high explanatory power.