INTRODUCTION
In recent years, “qualitative research” has rapidly gained ground in medical research as a tool that is simple and inexpensive but highly effective. It deals essentially with the “why” and “how” of a problem or situation. Unlike quantitative research, in which the data address how often and how much of the point in question and are therefore quantifiable, the data in qualitative research are not quantifiable and must instead be analyzed thoroughly (for example, by content and thematic analysis), ultimately giving the answer to the research question.1 Quantitative research requires large amounts of numerical data from many subjects, begins with hypothesis generation, and relies on often complex statistical methods to establish the level of significance, yielding a delicate and exact numerical picture.2 Qualitative research, on the contrary, requires detailed descriptive data from a small number of subjects, needs no initial hypothesis generation and practically no statistical methods, and yields a broad, big-picture view of the situation, in many cases leading to the generation of a final theory and suggestions for the actions needed.
One interesting feature of data collection and analysis in qualitative research is “constant comparison.” This method is unique to qualitative research; nothing comparable exists in quantitative studies, where everything is pre-programmed and revolves around a “null hypothesis.” Constant comparison is an ongoing process: while data collection is under way, analysis of the data continues simultaneously.2,3 Moreover, because of the knowledge continually gained through constant comparison, we come to know when to change our data collection techniques or subject choice, and also when to stop recruiting subjects and collecting data any further.2
We also have COREQ (Consolidated Criteria for Reporting Qualitative Research), a 32-item checklist for reporting studies based on interviews and focus group discussions.
The purpose of qualitative research is often to delve into complex phenomena of clinical and social significance, as experienced by doctors and other healthcare providers, by patients and their relatives, by policymakers, and by the general public.1,4
Qualitative research usually involves inductive reasoning, but not exclusively: in grounded theory, deductive reasoning also follows. A theory is generated from the study and is said to be “grounded” in the data.2,5
QUALITATIVE DATA ANALYSIS
Unlike in quantitative research, data analysis in qualitative research is an ongoing process through and through. It starts at the very beginning of data collection and continues side by side as the process goes on. Simultaneously, we also decide when to change our methodology, how and when to stop data collection, and many other facets of the research process. This is called “interim analysis.”1,2,5 Reflective notes, called “memos,” are recorded alongside as interim analysis continues. Once video and audio recordings are completed (often in the vernacular language), they are converted into English transcripts, the written versions of the interviews or conversations.3,6 Each transcript is then subjected to the selection of “codes,” which are short descriptive names applied to thoughts that come up repeatedly in a transcript.2,7,8 Several codes together make a broader “category.” The code is therefore the fundamental unit of data analysis, and the category is a bigger unit.
The analysis of transcripts by the above method (Table 1) is called “content analysis,” which is finally converted into “thematic analysis,” giving the meaning of the whole research and the answers to the original research question(s). Themes are broad categories of information; they can describe a setting and what occurred.
Table 1
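The coding workflow just described, in which codes are attached to transcript segments, rolled up into categories, and tallied, can be sketched in a few lines of Python. All codes, categories, and transcript excerpts below are hypothetical illustrations, not data from any actual study, and real projects would typically use dedicated software for this step.

```python
# A minimal sketch of content analysis: descriptive "codes" are attached
# to transcript segments, related codes are grouped into broader
# "categories", and frequencies are tallied. All names are hypothetical.
from collections import Counter

# Each transcript segment has already been tagged with one or more codes.
coded_segments = [
    ("I was afraid the treatment would not work", ["fear", "doubt"]),
    ("Nobody explained the side effects to me",   ["poor communication"]),
    ("I trusted my doctor completely",            ["trust"]),
    ("I kept worrying about the cost",            ["fear", "cost"]),
]

# Several codes together form a broader category.
categories = {
    "emotional response":    {"fear", "doubt", "trust"},
    "health-system factors": {"poor communication", "cost"},
}

# Tally how often each code appears across the transcript.
code_counts = Counter(code for _, codes in coded_segments for code in codes)

# Roll the code counts up into category counts.
category_counts = {
    name: sum(code_counts[c] for c in codes)
    for name, codes in categories.items()
}

print(code_counts)
print(category_counts)
```

In practice, the recurring, heavily populated categories are what the researcher then interprets as candidate themes during thematic analysis.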
SOFTWARE USED TO ANALYZE QUALITATIVE DATA
The software commonly used for such data analysis in qualitative research includes ATLAS.ti, Ethnograph, NUD*IST, and NVivo.2,9
“Qualitative data analysis is a process that requires astute questioning, a relentless search for answers, active observation, and accurate recall. It is a process of piecing together data, of making the invisible obvious, of recognizing the significant from the insignificant, of linking seemingly unrelated facts logically, of fitting categories one with another, and of attributing consequences to antecedents. It is a process of conjecture and verification, of correction and modification, of suggestion and defense. It is a creative process of organizing data so that the analytic scheme will appear obvious.”