4.9 Data analysis
Data analysis entails thoroughly reading the gathered data, repeating the process until the data is clearly understood, coupled with breaking the information up and putting it back together again (Terre Blanche et al., 2002). Data analysis also entails closer scrutiny of the gathered data in relation to theoretical notions (Eisenhardt, 1989). The data is then organised and transformed into manageable chunks by synthesising items, searching for patterns and deducing what is valuable and what is to be learnt (Bogdan & Biklen, 1992; Leedy, 1993). In addition, Cohen, Manion and Morrison (2000, p. 147) argue that “data analysis entails accounting for and explaining the data, in a nutshell, making sense of the data in terms of the participants’ definitions of the situation, themes and regularities”.
Strauss and Corbin (1997) suggest that the analysis process entails arranging the information gathered and creating clarity where the information is ambiguous. It takes a lot of time to reach the stage where the researcher can make sense of the data collected; however, it remains an interesting process. The analysis yields mainly general statements and assists in building grounded theory.
The data was analysed using both inductive and deductive methods.
Firstly, inductive analysis involves reading the data, with theory being developed from the observation of empirical reality (Welman et al., 2005). Through this approach hypotheses and theories are generated (Welman et al., 2005). In addition, Leedy and Ormrod (2005, p. 32) argue that people use specific occurrences to draw conclusions about an entire population. Inductive analysis can therefore be viewed as a creative reasoning mode through which the researcher adds to scientific knowledge (de Vos et al., 2011). Themes were drawn from the interviews held with the teachers, the subject advisors and the subject head.
Deductive analysis moves from the general to the specific. Through this approach the researcher tests whether an expected pattern actually occurs (de Vos et al., 2005).
“From a general theoretical understanding the researcher derives an expectation and, finally, a testable hypothesis” (Babbie, 2007, p. 46). The deductive process involved using the key concepts from the Communities of Practice theory, namely, joint enterprise, shared repertoire and mutual engagement to analyse the data.
Mutual engagement entails a community whose members, with their diverse expertise and knowledge, engage with one another (Wenger, 1998). According to Wenger (1998, p. 72), “practices exist because people are engaged in actions whose meanings they negotiate with one another”. This dimension helped me to understand how teachers negotiate strategies for teaching problematic topics to learners and, in the process, how they relate to each other and to the subject advisors and the subject head. What the participants know, and how they do what they know, is reflected when they mutually engage with each other.
Joint enterprise entails focusing on what brings the community members together and is informed by three key features: firstly, the enterprise results from a negotiated collective responsibility which is mutually agreed; secondly, the participants are key in ensuring that they define the process and own it; thirdly, relations of mutual accountability are created (Wenger, 1998). This dimension assisted me in understanding what brought the Life Sciences teachers together.
The shared repertoire focuses on the resources required to facilitate the mutual engagement and the enterprise that bring the teachers together. Included in the repertoire of a community are routines, words, tools, ways of doing things, stories, gestures, symbols, genres, actions or concepts (Wenger, 1998, p. 83). According to Wenger (1998, p. 83), “the repertoire includes the discourse by which they express their meaningful statements about the world as well as the styles by which they express the forms of membership and their identities as members”.
Several steps were undertaken to analyse the data; this was done to ensure credibility. The data in this study was analysed using a combination of data analysis techniques from several authors, as shown in Figure 5.1.
Figure 7: The 11 steps of analysing semi-structured interview data (synthesised from Creswell, 2014)
Step 1: Organising and preparing data for analysis [from step 1 of Creswell (2014:198), step 1 of Denscombe (2014:247) and step 1 of Hesse-Biber and Leavy (2011:302)]
Step 2: Choosing of scripts [from step 2 of Tesch (1990, in Creswell, 2009:186)]
Step 3: Reading of scripts [from step 2 of Creswell (2014:197), step 2 of Denscombe (2014:247), step 2 of Hesse-Biber and Leavy (2011:305) and step 1 of Tesch (1990, in Creswell, 2009:186)]
Step 4: Identification of themes [from step 3 of Tesch (1990, in Creswell, 2009:186)]
Step 5: Coding the data [from step 3 of Creswell (2014:197), step 3 of Hesse-Biber and Leavy (2011:305) and step 4 of Tesch (1990, in Creswell, 2009:186)]
Step 6: Grouping descriptive words into themes [from step 4 of Creswell (2014:197) and step 5 of Tesch (1990, in Creswell, 2009:186)]
Step 7: Abbreviation of categories [from step 6 of Tesch (1990, in Creswell, 2009:186)]
Step 8: Advancing how the description of themes will be represented in the qualitative narrative [from step 5 of Creswell (2014:200)]
Step 9: Performing preliminary analysis [from step 7 of Tesch (1990, in Creswell, 2009:186)]
Step 10: Recording existing data [from step 8 of Tesch (1990, in Creswell, 2009:186)]
Step 11: Interpreting qualitative data [from step 6 of Creswell (2014:200), step 4 of Denscombe (2014:247) and step 4 of Hesse-Biber and Leavy (2011:315)]
The steps of data analysis I used are as follows:
Organising and preparing data for analysis: I organised the data and inductively came up with themes. Transcribing the recorded interviews, coding the field notes and the notes I took during the observations prepared the data for analysis.
Choosing of scripts: I took the scripts from the file and read them to establish the respondents’ thought processes in response to the similar questions that I had asked all the respondents.
Reading of scripts: At this stage I was able to get a general sense of what the Life Sciences cluster entails through reading the responses. I also looked at the overall depth of the responses. I read these scripts together with the field notes I took while conducting the interviews.
Identification of themes: From reading all the scripts, I listed the themes that I inductively drew from the analysis of the respondents’ responses.
Grouping descriptive words into themes: To arrive at the themes that I identified, I used descriptive terms to assist me in grouping the data.
Advancing how the description of themes will be represented in the qualitative narrative: According to Creswell (2014), the narrative of the data is meant to convey the findings. The themes were used to narrate the analysis of the data, and this was complemented by the texts of the responses and the notes I took at the workshops.
Recording existing data: The transcribed scripts were produced from the recorded data.
Interpreting qualitative data: The data was interpreted using the learning dimensions from Wenger’s (1998) CoP conceptual framework. These interpretations were presented as findings.
In data analysis, I answered the study’s research questions (using the three dimensions of the Communities of Practice), namely:
• What are the activities that take place in the Life Sciences cluster?
• In what ways do these activities support teacher learning?
• To what extent does the Life Sciences cluster contribute to the professional learning of the teachers?
“Almost invariably, one crucial step in content analysis is to tabulate the frequency of each characteristic found in the material being studied. Thus, content analysis is qualitative as well as quantitative” (Leedy & Ormrod, 1989, p. 143). Mouton (1996) suggests two steps that are involved in data analysis: firstly, reducing the wealth of data that one has collected or has available to a manageable proportion; secondly, identifying patterns and themes in that data.
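The frequency tabulation that Leedy and Ormrod describe can be sketched in a few lines of Python. This is purely an illustration of the counting step; the theme labels below are invented for the example and are not drawn from the study’s data:

```python
from collections import Counter

# Hypothetical coded interview segments: each transcript segment has been
# assigned a theme label during the coding step (labels invented here).
coded_segments = [
    "mutual engagement", "shared repertoire", "mutual engagement",
    "joint enterprise", "mutual engagement", "shared repertoire",
]

# Tabulate how often each theme occurs across the transcripts,
# turning the qualitative codes into simple frequency counts.
theme_frequencies = Counter(coded_segments)

for theme, count in theme_frequencies.most_common():
    print(f"{theme}: {count}")
```

A tabulation like this complements, rather than replaces, the interpretive reading of the transcripts: the counts show which themes recur, while the qualitative analysis explains what those themes mean to the participants.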
Cohen, Manion and Morrison (2000, p. 147) suggest three principles, namely “Completeness: a check that there is an answer for every question; Accuracy: a check that all questions are, as far as possible, answered accurately; and Uniformity: a check to ensure that all respondents interpreted the questions uniformly”.