MEDICAL EDUCATION
Year: 2013 | Volume: 1 | Issue: 1 | Page: 67-72
Use of qualitative analysis to supplement program evaluation of a faculty development program
Thomas V Chacko
Department of Community Medicine and Medical Education, PSG Institute of Medical Sciences and Research, Coimbatore, Tamil Nadu, India
Date of Web Publication: 21-Jun-2013
Correspondence Address: Thomas V Chacko, Professor and Head, Community Medicine and Medical Education, PSG Institute of Medical Sciences and Research, Coimbatore - 641 004, Tamil Nadu, India
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/2321-4848.113581
Introduction: For program evaluation of workshops, a structured questionnaire is generally used to measure learner satisfaction and learning. It yields results in quantitative terms, and any open-ended questions produce a long list of responses that the evaluator finds difficult to use. Hence, we used qualitative research methods to supplement it for a holistic understanding of the quality and extent of participants' learning.
Materials and Methods: A faculty development program was used opportunistically to gather data through open-ended questions designed to elicit "thick descriptions" of the participants' learning. Using the Grounded Theory approach, in which the data drive further analysis, data analysis was done by faculty in three groups using inductive reasoning, leading to the emergence of themes and categories of responses. Validity of the findings was ensured through the commonalities in the themes that emerged independently in the three groups, as well as through member checking to seek the participants' agreement with the groups' observations of the primary data.
Results: Themes and categories of new learning in terms of knowledge, skills, attitudes, and processes emerged, giving a better understanding of the extent of the learning and its applicability to future professional tasks. The open-ended questions encouraging participants to give "thick descriptions" of their learning yielded many quotable quotes that indicate the quality of their learning.
Conclusion: The use of open-ended questions and qualitative analysis based on the Grounded Theory approach supports a holistic understanding of the focus and quality of learning that is not usually possible through quantitative methods. Hence, when understanding the extent and quality of learning is important, it is recommended that the traditional method of program evaluation be supplemented with qualitative research methods, as they are more effective in convincing decision makers.
Keywords: Faculty development, program evaluation, qualitative research, Grounded Theory
How to cite this article: Chacko TV. Use of qualitative analysis to supplement program evaluation of a faculty development program. Arch Med Health Sci 2013;1:67-72
Introduction
The Department of Community Medicine at the PSG Institute of Medical Sciences and Research is mandated with preparing health care professionals for primary health care in rural areas. Towards this goal, students undergo rotations through many of our community-based learning experiences in blocks lasting from a week to a semester. One such one-week rotation is the "Community Orientation Program" for first-year students, which uses the Participatory Rural Appraisal (PRA) approach. [1] In it, students are trained to use various tools for engaging members of the community to identify their own community's assets and problems (focus group discussions, interviews with key informants, pictorial diagrams such as seasonal diagrams and village maps), to prioritize the problems (matrix ranking, Venn diagrams), and then to arrive at an "action plan" for solving the problems using local resources. This rural empowerment exercise, which enables rural people to improve their own situation, is facilitated by first training the students in the classroom setting, where they practice the PRA tools under the supervision of faculty and field staff. This was possible because a couple of the department staff had undergone PRA training and guided the students with the help of other staff.
However, after several years, it was noticed during the students' end-of-rotation presentations of their learning experience that they had not fully understood some of the key principles and tools of PRA. The main reason was that the large batch of 100 students, divided into smaller batches during field work, was supervised partly by staff who had not themselves undergone the PRA experience and so were unable to correctly guide the students in the field. Hence, it was decided to hold a 4-day faculty capacity-building workshop on PRA for the entire staff of the Department of Community Medicine, with the help of an expert visiting faculty member; the workshop included practice in the use of PRA tools in both classroom and field settings.
Generally, for all such workshops, an end-of-workshop program evaluation as described by Cook [2] (Kirkpatrick level 1, i.e., learner reaction/satisfaction, and level 2, i.e., learning in terms of change in knowledge, skills, and attitudes) is elicited through a structured questionnaire that is then subjected to quantitative data analysis. However, the open-ended questions in such feedback formats are rarely analyzed in a scholarly manner using the principles of qualitative research. Besides, fully eliciting the degree of learning and the internalization of the finer nuances of PRA principles and practice gained through the experiential learning approach requires evaluation approaches that capture the feelings and understanding of the participants, which is possible only through qualitative research. This was done by designing a program evaluation questionnaire directed towards probing these elements.
Harris [3] (2002), quoting Cronbach (1974) and Campbell (1974), highlighted the inadequacy of the traditional scientific paradigm for addressing the complex problems of understanding and behavior involved in education research and advocated the use of qualitative research. The focus here is on the quality of a particular activity rather than on how often it occurs or how it would otherwise be evaluated. According to Fraenkel and Wallen [4] (2006), "Research studies that investigate the quality of relationships, activities, situations, or materials are frequently referred to as Qualitative Research."
Whereas a simple narrative descriptive style of reporting the data that emerge from open-ended questions is a simplistic approach to qualitative research, the Grounded Theory approach described by Glaser and Strauss [5] (1967) provides a conceptual framework that lets the data drive the emergence of themes and concepts. Through inductive reasoning and constant comparison, these lead to a rigorous analytical emergence of theories; the researcher stops collecting further explorative data to elicit the hows and whys when theoretical saturation is reached, thereby contributing to a clearer understanding of the issues under study. Harris [6] (2003) highlighted the important contribution that the re-discovery of grounded theory by medical education scholars has made to a better understanding of the field of medical education and the processes that contribute to its development.
Purpose of the study
This paper describes the process and outcomes of this qualitative study, the purpose of which was to gain a holistic understanding of the quality of learning by participants of a faculty development workshop on PRA, conducted for the faculty and field supervisors involved in training the students to use PRA tools in the community.
Study Design
As with most qualitative research studies, this study was opportunistic, conducted within the department's natural setting with purposive sampling of all participants of a faculty development program on PRA aimed at improving their PRA skills for guiding students during their rotation in the department. The data that emerged from the feedback on this faculty development program, given in response to open-ended but probing questions designed to elicit the quality of participants' learning and its usefulness in guiding their students, were analyzed using qualitative research methods based on Grounded Theory, in which the data drive further analysis.
Materials and Methods
The faculty development program on PRA was a 4-day workshop held in the Department of Community Medicine as well as in the field practice village. Twenty faculty and field staff of the Department of Community Medicine, who usually assist the first-year MBBS students in the Community Orientation Program, which uses PRA tools to arrive at a community diagnosis and an "Action Plan for a Community Health and Development Program," underwent this faculty development program. The workshop participants had, on average, more than 5 years of experience after their professional qualification and had been involved in the Community Orientation Program for MBBS students, but had not formally undergone structured training in PRA principles and practice. The 4-day workshop was designed to ensure active learning through participatory experiential learning and reflection on that learning.
Data gathering and analysis
The feedback form on the PRA learning experience was constructed as open-ended questions to elicit "thick descriptions" of the quality of the participants' personal learning experiences, their perspectives on the processes used in the program, and the usefulness of the learning to them in the future as faculty members. The questions included:
- What are the most important things you learned during the workshop that were new for you?
- What knowledge and skills did you develop during this workshop that can be useful to you as a staff/ faculty? List, describe, explain:
- Please discuss the role of this workshop in helping you become more comfortable in problem-solving and facilitating change:
This feedback form was administered after ensuring anonymity of the respondents and obtaining their consent for analysis of the data.
Data analysis was based on grounded theory and the guiding principles for qualitative data analysis described by Allan (2003), [7] including the processes and dilemmas faced by a qualitative researcher using grounded theory, such as identifying themes and coding.
Strategy for Analysis
This was done through group work: groups of 3 to 4 faculty engaged in inductive analysis and creative synthesis of the responses to a question, first independently and then by sharing with each other the themes they perceived as emerging from the data, to arrive at a group consensus on the theme or coding that best captured the worldview expressed in the responses to the open-ended question they analyzed. This was then presented to the larger group. All the themes that emerged from the data were then member-checked to verify whether the participants also agreed with the group observations, thereby improving their validity. A minimal code sketch of this consensus-coding step is given below.
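The following minimal Python sketch illustrates, under assumptions, the kind of consensus step described above: each coder independently proposes theme labels for a response, and only labels proposed by at least a set number of coders are retained as candidate themes for member checking. It is not part of the original study, which coded the responses manually in group work; all response identifiers, coder names, and labels are hypothetical.

```python
# Hypothetical sketch of the group coding step: each coder independently
# tags every free-text response with candidate theme labels, and labels
# proposed by at least `min_coders` coders are kept as consensus themes
# to be presented to the larger group and member-checked.
from collections import Counter

# response_id -> {coder_name: set of candidate theme labels}
independent_coding = {
    "R01": {"coder_A": {"PRA tools", "rapport building"},
            "coder_B": {"PRA tools"},
            "coder_C": {"PRA tools", "field skills"}},
    "R02": {"coder_A": {"attitude change"},
            "coder_B": {"attitude change", "confidence"},
            "coder_C": {"confidence"}},
}

def consensus_themes(coding_for_response, min_coders=2):
    """Keep theme labels proposed independently by at least `min_coders` coders."""
    tally = Counter(label
                    for labels in coding_for_response.values()
                    for label in labels)
    return {label for label, count in tally.items() if count >= min_coders}

for response_id, coding in independent_coding.items():
    agreed = consensus_themes(coding)
    # the agreed labels would still be member-checked with participants
    print(response_id, sorted(agreed))
```

Retaining only labels that more than one coder proposed mirrors the constant comparison and consensus building described above; the agreed labels are candidates, not final themes, until member checking confirms them.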
Results
- New learning by the workshop participants: Usually, after any faculty development program, a program evaluation elicits new learning by the participants as a measure of the program's effectiveness in enhancing knowledge and skills. Unless a structured questionnaire is used, this usually yields a long list of compiled responses, each participant expressing the learning in their own words, which confuses rather than helps in evaluating whether the program achieved its objectives. By doing a qualitative analysis of the responses, it was possible to categorize them into themes and categories so that a more condensed list emerged, giving a much clearer overview of the new learning as perceived by the workshop participants, as shown in [Table 1]. Here, the responses (new learning) were categorized into knowledge, skills, and attitudes. Further analysis of those listed under skills led to their categorization into PRA tools, program management tools, and field-work skills. Those that did not fit into the knowledge-attitude-skill categories were classified under a "process" category labeled "New Experience."
Table 1: Important new learning identified by participants of the PRA workshop - categories/themes that emerged
- Applicability of new learning by participants: As part of the experiential learning model, the learner is encouraged to reflect on and document how the new learning can be put to use in the future professional tasks they will be engaged in. This was also used here for program evaluation, to judge whether the workshop led to learning that the participants perceive they can "apply on the job." This was elicited by asking: What knowledge and skills did you develop during this workshop that can be useful to you as a staff/faculty member? List, describe, and explain. [Table 2] shows how the responses from the workshop participants were categorized into two broad categories - usable knowledge and usable skills. Further examination of the responses categorized under usable skills led to their classification into three subthemes/categories, namely professional competency, field training skills, and competency in PRA tools. The numbers in brackets against some of the items in the table indicate the number of times multiple respondents listed the same item (quantification of qualitative data); a minimal counting sketch of this step follows Table 2. Further scrutiny of these data did not lead to the emergence of any new themes/categories, and so the process stopped at this stage.
Table 2: Usable knowledge and skills identified by participants of the workshop - categories/themes that emerged
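As a rough illustration of the "quantification of qualitative data" mentioned above, the short Python sketch below shows how, once responses have been coded, the number of respondents mentioning each coded item could be tallied to produce bracketed frequencies of the kind shown in Table 2. The coded items here are invented placeholders, not the study's actual data.

```python
# Hypothetical sketch: count how many respondents mention each coded item.
from collections import Counter

# one entry per respondent: the set of coded items found in their answer
coded_responses = [
    {"transect walk", "matrix ranking", "rapport building"},
    {"matrix ranking", "seasonal diagram"},
    {"rapport building", "matrix ranking"},
]

item_counts = Counter(item for response in coded_responses for item in response)
for item, n in item_counts.most_common():
    print(f"{item} ({n})")   # e.g. "matrix ranking (3)"
```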
- Participants' perceptions about the effectiveness of the program in improving their problem-solving and facilitation skills: Since the PRA program is aimed at improving students' problem-solving and change-facilitation skills, one of the main objectives of the faculty development workshop was to improve the faculty's problem-solving and change-facilitation skills using the PRA process. Whether this program objective was achieved was evaluated by asking the question: Please discuss the role of this workshop in helping you become more comfortable in problem-solving and facilitating change.
Analysis of the data [Table 3] led to its categorization into two broad outcomes among the participating faculty, namely gaining confidence and gaining skills. Further examination of the data classified under "Gained Skills" led to categorization of the responses into problem-solving skills and facilitating-change skills. Further attempts at looking for themes within the responses under problem-solving led to two categories/themes emerging.
Table 3: Workshop's role in improving problem-solving and facilitating-change skills - categories and themes that emerged
- Participants' perceptions about their experiential learning in the field: The participants' descriptions of their field experience could be categorized into three categories, as shown in [Table 4]: 1) the experience itself, 2) their learning during the field experience, listed as "Skills," and 3) the outcomes of their field experience.
Table 4: Participants' most positive perceptions about the field experience - categories and themes that emerged
Discussion
From the above observations that emerge from the qualitative analysis of the program feedback data in the form of categories/themes, we can better appreciate that the purpose and objectives of the PRA workshop as a faculty capacity-building effort have been fulfilled and that the participants have understood the underlying principles involved in the PRA process, more so than would have been possible from quantitative analysis of data captured by a structured, closed-ended program evaluation questionnaire, which would have yielded just a long, confusing "laundry list" of what they learned by attending the 4-day workshop.
[Table 1] shows that, using the qualitative research method, the skills listed by workshop participants as newly learned could be meaningfully categorized as a) specific tools used in PRA, b) program management skills, and c) field skills. The workshop has thus been successful in bringing about the desired attitude towards the use of PRA as well as towards field-work in the community.
Constructing open-ended questions and encouraging the participants to give "thick descriptions" of their learning experiences, as is generally done in qualitative research studies, yielded good "quotable quotes," which are much more convincing to the program evaluator (and, through them, to program administrators and academic leaders) about the usefulness of the 4-day workshop than just the numbers and percentages that the usual quantitative methods yield. For example, one of the participants found the hands-on field training as part of a capacity-building workshop a new learning experience: "Got acquainted with the PRA tools, which will help me (guide the students) in the student training and planning and implementing community health and development activities." For some, it was a re-invention of the PRA tools, about which they had only heard in the classroom.
Apart from the emergence of themes regarding the usability of their learning in the actual practice of guiding students, as shown in [Table 2], the following quotes from participants about their learning experience illustrate the value of doing qualitative research as part of program evaluation to supplement the quantitative data often generated by a closed-ended questionnaire. They communicate the effectiveness of the program more convincingly than the numbers and percentages representing improvement in knowledge, skills, etc. that commonly used quantitative methods yield:
"We are made very clear how PRA is conducted, what are the things to be done and not done, how it is to be done very clearly. Each step was clear, and so we can conduct the PRA with the students and make them do (these) things very easily," and "I can use these skills not only for PRA program but also for rapport building necessary for any community outreach program. As a public health physician, these are the integral part of any (public health) program." Another learner said, "Usually, when we need a map (of the village), we usually ask for it from the Union office. But, in this PRA training, did the observations when they did the (transect) walk with us and came forward to draw the map themselves. This helped us in understanding the resources in the village. We got lot of information from the villagers."
Since the PRA process is essentially an empowerment tool for rural people that facilitates improvement in rural health and development through their own efforts, it is important for the facilitators of this process to have good facilitation skills in problem-solving as well as in facilitating change. From the analysis of the participants' feedback about the workshop's role in improving their problem-solving and change-facilitation skills, it becomes clear that they gained problem-solving skills by learning rapport building with people in rural areas, thereby finding it easier to ensure a participatory approach that involves people in solving their own problems using their own resources.
The workshop's favorable learning environment, which encouraged learning by observation as well as through hands-on experiential learning, enabled them to gain the confidence to teach PRA facilitation skills to MBBS students, besides becoming confident in facilitating people in rural areas to help themselves. Some of these sentiments (feelings) come out quite well in their comments:
"As we were taken to the field with experienced faculty, we could observe them and take tips from them on how to communicate, build rapport, and win (their) trust. We were also able to identify the "saboteurs" in the community and how they can be managed," and "We learnt how to identify the problem in the community from the people and while discussing, we came to (know) various ways (of) how to facilitate the changes in the community by themselves…"
Since capacity building of the faculty for PRA has a critical element of field-based training, it was important for the workshop planners to understand the participants' perceptions of their field experience during the workshop. The participants found the field experience very useful, since it was interactive and presented them with an opportunity for hands-on training and practice in the community. This led to improvement in their practical knowledge and skills in using PRA tools, which helped them build rapport with people in the rural areas, which, in turn, helped them in information gathering, problem identification, and finding solutions and communicating these effectively. The field experience enabled them to feel confident that they can now facilitate and correctly guide the students in using the PRA process effectively in the community. This was reflected in their feedback at the end of the training program:
"What we were taught in the class was put into action in the field. This was very helpful so that the whole process was (made) very clear." and "Guidance of experienced faculty from the rural development field helped us immensely in improving our skills in the field. We were given prior briefing and demonstration via role plays on what will happen in a typical village and how to manage such situations. Conducting PRA and applying the various tools were very useful and helped us understand their advantages and disadvantages clearly by hands-on (field) experience."
These quotes from the participants in the form of thick descriptions add face validity to the analysis presented in the form of emerging themes and categories.
Limitations
Although the PRA program feedback form asked open-ended probing questions (why, how) and the respondents were prompted to give details, not many "thick descriptions" of their experiences and learning insights emerged. This could have been due to:
- Lack of time: the form was administered towards the end of the workshop, when participants were keen to "finish" and go home.
- Too many open-ended questions, which resulted only in simple listing.
- Not taking time to check the respondents' understanding of the questions. To me, as the knowledgeable researcher, the questions seemed quite simple, but it was obvious from some responses that the respondents had not understood either the question or its intended purpose. It would also have helped if I had given an example to illustrate the meaning of each question.
The study analyzes only the feedback received on the open-ended questions related to the PRA workshop. After analysis and observation of the emergent themes, no attempt was made to ask further probing questions, as is usually done in qualitative research, to gain an even better understanding of the issue under investigation.
Future Research
There is a need to devise and design appropriate qualitative research studies for all the rotations in the department to gain a full, qualitative understanding of the strengths and weaknesses of the department's programs, as well as to invite suggestions for making them even better learning experiences.
References
1. Chambers R. The origins and practice of participatory rural appraisal. World Dev 1994;22:953-69.
2. Cook DA. Twelve tips for evaluating educational programs. Med Teach 2010;32:296-301.
3. Harris I. Qualitative methods. In: Norman GR, van der Vleuten CPM, Newble DI, editors. International Handbook of Research in Medical Education. Dordrecht: Kluwer Academic Publishers; 2002. p. 45-95.
4. Fraenkel JR, Wallen NE. How to Design and Evaluate Research in Education. 7th ed. Boston: McGraw Hill; 2008.
5. Glaser BG, Strauss AL. The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine; 1967.
6. Harris I. What does "The Discovery of Grounded Theory" have to say to medical education? Adv Health Sci Educ Theory Pract 2003;8:49-61.
7. Allan G. A critique of using grounded theory as a research method. Electron J Bus Res Methods 2003;2:1-10.