4.3 Research Methodology

4.3.3 Data Analysis

Building on this theme, the third issue of investigation, SQA awareness, is directed at exploring whether (and to what extent) SQA is known among Scrum teams.

The fourth theme, SQA practices, focuses on understanding the patterns that emerge about how SQA is (or is not) being implemented in Scrum teams.

Irrespective of the status of SQA implementation, the fifth theme is designed to explore the success rate of Scrum projects, with inferences related to quality assurance practices.

Questions pertaining to explanations of the status quo are captured under the sixth theme, which refers to explanations of the success/failure rate of Scrum projects in each of the five teams.

The sources of data for the remaining five themes are the team leaders and Scrum masters from each of the Scrum teams at EOH Microsoft Coastal (EOHMC), ScrumSense, Old Mutual South Africa, Saratoga and Truworths in Cape Town, selected for similar reasons to those outlined under the first theme.

Qualitative analytical methods were then used to analyse the data obtained from these themes of focus.

Methods used under qualitative analysis include ethnographic analysis, narrative analysis, phenomenological analysis, interpretive analysis, discourse analysis, grounded theory analysis, constant comparative analysis and content analysis (Morrill et al., 2000, p.521; Mlitwa, 2011; Onwuegbuzie, Leech & Collins, 2012).

4.3.3.1 Content Analysis

Content analysis is a systematic qualitative data analysis method of studying and analysing communication (Kawulich, 2004) and social life, by interpreting words and images from documents, film, art, music and other cultural products or media (Crossman, 2014). In this technique, procedures are used to make valid inferences from text. These inferences incorporate the message sender, the message itself and the audience (ibid). Content analysis determines the objective, meaning or effect of any type of communication. It works best for interpretive research studies that have textual data (Kondracki, Wellman & Amundson, 2003) obtained from field notes, open-ended questions, focus groups, observations and open-ended interview responses (Mlitwa, 2011). It is mostly used for understanding social themes, cultural change, changing trends in the theoretical content of different disciplines, verification of authorship, changes in mass media content and the nature of news coverage of social issues or problems (Hsieh & Shannon, 2005).

In essence, the current study deals with large volumes of contextual and subjective qualitative data obtained from direct (but non-participant) observations and open-ended interviews. For this reason, content analysis was considered the most appropriate technique for analysing the textual qualitative data in this study. However, the concept of trustworthiness is critical, not only in the sampling process and the design of questions, but also in the analysis and interpretation of data. In qualitative studies, trustworthiness often relates to the dependability and credibility of the findings, as determined by the unbiased researcher's degree of confidentiality (Cameron, 2011). The concepts of validity and reliability are key to the phenomenon of trustworthiness in research. In this instance, validity refers to the extent to which the research instrument is appropriate to the measurement process (Mlitwa, 2011). The question usually asked in this respect is whether the tool is measuring what it set out to measure (Babbie & Mouton, 2001). Reliability, on the other hand, pertains to consistency in the tool and processes of measurement in research. For example, it is concerned with whether the use of a particular tool can yield similar results in a similar process in a different environment and time. The method of analysis, therefore, should also be in line with the research approach and related data collection methods if the findings are to reflect an accurate interpretation of a research outcome. Thus, it is important that data analysis methods be used consistently throughout the analytical process (Zhang & Wildemuth, 2005). In effect, Illinois University (2014) argues that when analysing qualitative data, aspects such as stability, reproducibility and accuracy should be considered.

In interpretive studies, therefore, content analysis is applied in line with the interpretive tradition of data analysis, including the hermeneutic principle, with emphasis on unpacking possible interpretations of data that have multiple meanings (Mlitwa, 2011). The process also incorporates theme identification, coding and, ultimately, the iterative translation and interpretation of data from these themes. Since the study works with interpretive data that has multiple possible meanings, the hermeneutic aspect enriches the selected analytical technique.

In the current study, coding relates to organizing data into categories and sorting it according to the issue of investigation. Within each code there are sub-categories. For example, testing can be divided into types such as performance, usability and functional testing, which all fall under one theme (SQA processes), as sketched below.
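
To make the relationship between themes, codes and sub-categories concrete, the following short Python sketch shows one possible way of representing such a coding scheme. It is only an illustration; the theme, code and sub-category names are hypothetical and are not the study's actual instruments.

    # Hypothetical coding scheme: themes map to codes, and codes may have
    # finer sub-categories (e.g. types of testing under "SQA processes").
    coding_scheme = {
        "SQA processes": {
            "testing": ["performance testing", "usability testing", "functional testing"],
            "code review": [],
        },
        "SQA awareness": {
            "training": [],
            "terminology": [],
        },
    }

    # Flatten the scheme into (theme, code, sub-category) triples so that
    # every coded chunk can be checked against one explicit list.
    for theme, codes in coding_scheme.items():
        for code, subcategories in codes.items():
            for sub in (subcategories or [None]):
                print(theme, code, sub)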

The first step in this study, therefore, was to go through the research notes collected from the interviews and direct observations. The text data was broken down into smaller chunks ready for coding. Coding refers to a process within data analysis, occurring prior to data interpretation, in which data is re-organized by assigning a unique code to each theme reflected in the data chunks. A chunk of coded data can be a sentence, a phrase or a word. In the current study, these chunks are obtained from the interview transcripts and direct observation notes (Fereday & Muir-Cochrane, 2006). Similar chunks (by theme) are grouped under the same code.

This enabled the researcher to examine and compare the frequencies of the codes.
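
The chunking, coding and frequency-comparison steps described above can be illustrated with a minimal Python sketch. It is a simplified illustration under assumed keyword-to-code mappings and a hypothetical example note; it is not the actual procedure or data used in this study, where codes were assigned through manual analysis.

    from collections import Counter

    # Hypothetical keyword-to-code mapping used only for this illustration.
    CODE_KEYWORDS = {
        "SQA_AWARENESS": ["quality assurance", "sqa"],
        "TESTING": ["unit test", "functional test", "usability test"],
        "SCRUM_PRACTICE": ["sprint", "stand-up", "retrospective"],
    }

    def code_chunks(text):
        """Split text into sentence-level chunks and assign matching codes.

        A chunk may receive more than one code if it touches several themes.
        """
        chunks = [c.strip() for c in text.split(".") if c.strip()]
        coded = []
        for chunk in chunks:
            lowered = chunk.lower()
            for code, keywords in CODE_KEYWORDS.items():
                if any(k in lowered for k in keywords):
                    coded.append((code, chunk))
        return coded

    # Hypothetical field note, used only to show the mechanics of the process.
    notes = ("The team runs a unit test suite in every sprint. "
             "Few members could explain what quality assurance means. "
             "The retrospective focused on missed deadlines.")

    coded = code_chunks(notes)
    frequencies = Counter(code for code, _ in coded)
    print(frequencies)  # counts of each code across the coded chunks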
