A core activity in qualitative data analysis is the coding phase. Huberman et al. (2013, p. 72) emphasise the significance of coding by suggesting that qualitative analysis has commenced once a researcher engages in the activity of coding. Various explanations have been offered to elucidate the purpose of coding (e.g. Huberman et al., 2013; Saldana, 2009). From these explanations it becomes apparent that coding is part of the data reduction or data condensation process, enabling the researcher to succinctly capture the essence of the volumes of data that are typically gathered in a qualitative study. According to Huberman et al. (2013, p. 71), codes are labels that are used to categorise data and to convey its essence by means of meaningful, descriptive names. Codes are also a convenient strategy that enables the researcher to quickly retrieve and reference ‘chunks of textual transcripts’, setting the stage for further analysis and the development of a construct or a theory. The process of coding is not an exact science; it is largely heuristic, relying on the researcher’s intuition, careful reading and reflection in order to obtain an intimate understanding of the message conveyed in the textual transcripts. Saldana (2009) makes reference to 25 approaches that may be followed for First Cycle coding. From this list, the most relevant choices for the current study are the following:
Descriptive coding: The use of a noun or a short expression to succinctly capture the essence of a passage of text. The set of expressions eventually provides an inventory of topics that serves as an abstraction of the raw data;
In Vivo coding: A popular coding strategy that has the same objective as descriptive coding, but differs technically in that it uses short phrases taken from the participant’s own language as initial codes;
Process coding: Entails the use of gerunds (verb forms that serve as nouns) to represent action or interaction sequences in the text.
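Although coding in the current study is performed with QDAS rather than programmatically, the distinction between the three approaches can be sketched as a minimal, hypothetical data structure. The transcript excerpt and all code labels below are invented purely for illustration and are not drawn from the study's data:

```python
# Purely illustrative sketch: the excerpt and code labels are invented
# and do not come from the study's data.
excerpt = ("We kept going back and forth between the plan and what the "
           "client actually wanted, so we just started releasing smaller pieces.")

coded_segment = {
    "excerpt": excerpt,
    # Descriptive coding: a noun phrase naming the topic of the passage
    "descriptive": "requirements volatility",
    # In Vivo coding: a short phrase lifted verbatim from the participant
    "in_vivo": "releasing smaller pieces",
    # Process coding: a gerund capturing the action in the passage
    "process": "adapting to change",
}

# Codes double as retrieval labels: gathering every segment that carries a
# given code is what sets the stage for further analysis.
segments = [coded_segment]
matches = [s["excerpt"] for s in segments
           if s["descriptive"] == "requirements volatility"]
```

The sketch also shows why an In Vivo code is "technically different": it must occur verbatim in the participant's own words, whereas descriptive and process codes are the researcher's abstractions.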
The First Cycle coding approach is usually quite time consuming, but it adds structure to the qualitative data. Corbin and Strauss (2014, p. 25) offer some respite by cautioning against an obsession with too much structure, emphasising the need to maintain an element of flexibility and fluidity so that the researcher’s intuition and insight are not ignored.
5.3.1 The Use of Qualitative Data Analysis Software
Qualitative data analysis software (QDAS) is used as a supplementary qualitative data analysis tool. It is not meant to replace the ‘time-honoured’ tradition of manually examining data to establish relationships and patterns. However, software tools such as NVivo, which was developed to support the qualitative researcher, serve as an ideal mechanism to “manage” the data, thereby enabling the researcher to focus on the meaning conveyed by the data (Bazeley & Jackson, 2013, p. 2). The current study adopts a strategy of using NVivo 11 Professional for the qualitative data analysis, based on the premise that NVivo will provide an enhanced capacity for recording, sorting, matching and linking qualitative data while maintaining access to the source data and the contexts from which the data have come.
The Initial Mind Map
At the outset, Bazeley and Jackson (2013, p. 28) advise that the qualitative researcher should develop an initial concept/mind map that documents assumptions and also clarifies the conceptual framework that underpins the study.
From a QDAS perspective this is quite beneficial, because it allows comparisons to be made between emerging concepts and the initial pre-conceptual constructs introduced by the researcher at the start of the analysis process. For the current study, preference is given to a graphical version of such a pre-conceptual map, as illustrated in Figure 5.2.
Figure 5.2: A Pre-Conceptual Mind Map
The main constructs of the pre-conceptual mind map illustrated in Figure 5.2 are the pre-conceived heuristics of software development (largely emanating from the literature review) that the researcher uses to add structure to the engagement with the software practitioner. The researcher and practitioner perspectives are guided by socio-technical elements, represented by organisational culture on the one hand and software development methodological rigour, with its strong technical orientation, on the other. The mind map has to make reference to the traditional approach to software development, epitomised by the Waterfall approach, so that a comparison standard can be established. It also has to refer to agile software development methodology, because agile epitomises current software development practice and is a core aspect of the current study. Based on the outcome of the literature review, the hybridisation of agile methodology plays a prominent role in the actual implementation of the methodology. Together, the traditional, modern and hybrid approaches to software development provide the terms of reference that may be used to optimise the insight obtained from the engagement with the software practitioner, paving the way for a synthesis phase that produces a model/framework that enhances existing software process models.
5.3.2 Initial Coding
Initial coding is a technique within Saldana’s (2009) First Cycle coding methodology in which the researcher breaks down the ‘mass’ of qualitative data into manageable parts. This is referred to as conceptualising the raw data at a higher level of abstraction, which represents a meaningful form of data reduction. Creswell (2013) warns of the challenges associated with qualitative data analysis because of the volume of data that needs to be analysed. In order to manage this process, Creswell eloquently suggests that qualitative data analysis conforms to a general contour referred to as a “data analysis spiral” (Creswell, 2013, p. 182). The spiral analogy conveys a methodology in which the researcher moves iteratively between the phases of data collection, data capture, data analysis and reporting.
This approach is used to decipher the complexity that is usually found within the