The use of online Focus Group Discussions (FGDs) has expanded in recent months, including in the Monitoring and Evaluation (M&E) field, since the outbreak of the COVID-19 pandemic. Although guidelines on how to conduct online focus groups successfully have been issued and published elsewhere, this blog focuses on my experiences and reflections from conducting online FGDs during the COVID-19 pandemic in South Africa. It offers explicit suggestions and critical perspectives on conducting online FGDs using Google Meet for those who may be new to using online platforms for FGDs.
2. Describing focus group discussions

A Focus Group Discussion (FGD) is a qualitative research method centred on group interaction, thoughts and views, concentrating on what people are thinking and what they are doing. It is a method for gathering data to explore patterns and opportunities.
A typical FGD consists of more than four participants who are invited to a dialogue led by a facilitator.
3. Ways and means

I recently conducted online FGDs that formed part of an evaluation study. The underlying aim of the evaluation is to provide key lessons for a future scholarship programme strategy. In the evaluation study, six online FGDs of about 90 minutes each were conducted with undergraduate scholarship recipients from different institutions. The FGDs focused on gathering the students' perceptions of the relevance of the scholarship, their experiences with it, and its impact on their studies. This blog, however, does not report on the findings of the evaluation study; the evaluation is still ongoing and the findings will be made available elsewhere, depending on the client's decision.
The online FGDs were conducted in real time over a web conferencing service, Google Meet. Participants were recruited via telephone and e-mail in a two-phase process. The first phase involved drawing up a list of students from those who had not participated in the survey; as part of the evaluation's ethical considerations, only participants aged 18 years and above were eligible. The second phase entailed confirming each student's willingness to participate in the FGDs. Eligible participants were asked for their consent to record the meeting after I read a consent disclosure statement to them. Prior to this process, it should be noted, letters were sent to the students' universities communicating the evaluation and encouraging students to participate. Email invitations were distributed to the students' official email addresses. In the invitations, participants were briefed on how to download and use Google Meet, encouraged to use their smart devices (particularly laptops), and asked to ensure that their devices met the minimum criteria for using the service. Participants who did not meet the minimum criteria (e.g. no access to smart devices or a stable internet connection) were excluded from the FGDs.
This introduced a selection bias into the evaluation, as I could only recruit those who met the minimum criteria to participate in the online FGDs. During the recruitment phase, participants were asked whether they would like to receive airtime incentives to purchase data as reimbursement. This is critical to ensure that participants do not incur any expenses (e.g. using their own airtime) to participate in the FGD. A few participants indicated that they had uncapped WiFi at home, and some mentioned that they were using university WiFi and therefore did not want the airtime incentive. The incentive was provided 30–60 minutes before the meeting to those who agreed to be reimbursed.
Participants were further advised to log into the meeting 30–45 minutes early to allow time to test their microphones and cameras. In addition, 60 minutes before the actual FGD, participants were contacted and reminded of the meeting. All six online FGDs were facilitated by the same evaluator, with the intent of keeping the facilitation style close to that of a conventional face-to-face FGD. This involved permitting multiple participants to speak at the same time during the online FGD, rather than configuring the platform so that only one participant could speak at a time. Participants were also encouraged to turn on their cameras where possible, simply to check whether they were engaged, following along or confused.
However, turning on cameras during the FGD was not mandatory. At the completion of the FGDs, I used a reflective exercise to evaluate the content and quality of the online FGDs. This included analysing the audio files and transcripts from the FGDs and evaluating what occurred, focusing on my conduct and engagement with respondents and on what I might do better next time. All the observations described below are based on my experience of how the online FGDs operated and what I considered to be the most notable things that emerged during the discussions. While I did not question the respondents specifically about their perceptions of using Google Meet for the FGDs, I was able to gain some feedback from the comments they made during and after the online FGDs.
4. Facilitator’s reflections
Attendees were genuinely active and responsive, although personal disruptions, such as interruptions from devices, children and other background noise, often disturbed the FGDs. Further, I noticed that communication was often noticeably weaker and slower; this mainly had to do with language barriers, as some of the participants did not always understand the facilitator's diction and accent. In addition, some were not comfortable using English as the language of communication. In response, I posted some of the questions in the chat box while asking them aloud at the same time; this enabled participants to see the questions in the chat, and some were also able to respond directly from the chat. I also allowed participants to communicate in a language they felt comfortable with, given that, as a facilitator, I am multilingual in almost all South African languages.
As a result of these measures, communication and the participation rate improved enormously. In addition, some respondents spent a considerable amount of time reacquainting themselves with Google Meet, as well as exploring its complexities and any technological challenges they encountered. However, my extensive experience with Google Meet enabled me to troubleshoot rapidly with participants when necessary. This involved interacting via the chat function with respondents whose microphones were not working or whom the facilitator could not hear. The fact that some participants contributed to the discussions through chat added a layer of complexity and meant that the facilitator had to allow extra time for respondents to type and for other attendees to read and respond to the messages posted in the chat. That said, this did not seem to influence the discussion in any significant way, although it is likely that individuals using the chat took shortcuts to speed up their typing.
Finally, I observed that several attendees tended to pull back during the online FGDs. This was due to technical malfunctions and connectivity issues, and some faced continuous interruptions at home. Sometimes I would pause the FGD briefly and try contacting them directly on their phones in an attempt to bring them back to the meeting. I also observed that participants were more likely to drop out beforehand or simply not show up than in a face-to-face FGD, where the greater investment in time and travel makes people more likely to attend once they have committed. For example, I had initially recruited 15 respondents for each of the six online FGDs, but three withdrew a few hours before the scheduled start of the first FGD. After following up with them, I found that the reasons given for dropping out included issues such as Eskom load shedding.
5. Lessons learnt and possible suggestions
In my view, web conferencing platforms such as Google Meet hold promise as a feasible and roughly equivalent alternative to face-to-face FGDs, notably for geographically dispersed communities such as those in rural areas. Significantly, they overcome the barriers inherent in face-to-face FGDs, which in turn strengthens the evaluation methodology. This also stresses the importance of continuing to study emerging technologies so that hard-to-reach populations can engage in all forms of research more efficiently. Moreover, it was not unexpected that some participants spent considerable time discussing the technical difficulties they were experiencing or familiarising themselves with Google Meet. To mitigate this, I sent participants details on how to download and use the Google Meet app and urged them to log in early, which at least some participants did. I also allocated time during the recruitment phase to provide a brief telephonic tutorial on how to use Google Meet; regardless, some participants still spent time familiarising themselves with the app at the expense of the actual FGD. This indicates that more should be done to tackle this problem, for instance by scheduling more time for an online FGD than for a comparable face-to-face FGD. I chose Google Meet as the web conferencing program for these FGDs because of my extensive knowledge of and experience with it.
Using this platform also meant that participants did not need to create new online accounts or download any applications, and it works on any device: one can join a meeting from a desktop or laptop, an Android device, or an iPhone/iPad. It furthermore provides a noise-cancelling audio filter, a function not offered by some other video conferencing platforms. It also provides a built-in recording system, which means an evaluator can eliminate the need to source or buy a stand-alone recording tool. However, for comparison purposes, it is advisable that evaluators assess other available web conferencing platforms when designing their evaluations. To anticipate the possibility of people dropping out before the meeting, it is suggested that evaluators over-recruit; it is also suggested that evaluators consider smaller group sizes (4–6) for the FGD, as these are much easier to manage in the online environment. Smaller groups also made it easier for me to ask follow-up questions and allow the participants to express themselves fully. It is further suggested that a co-facilitator assist the main facilitator by monitoring the chat while he or she is engaging with the participants.
A key limitation of this blog is that it focuses only on my experiences and reflections in conducting online FGDs. This implies that only my reflections and experience of the online FGDs were taken into account, and not those of the participants. Nevertheless, I believe these considerations are likely to be worthwhile for evaluators who want to use this platform, as to date there are few guiding principles for carrying out online FGDs during a health crisis. Finally, further formal assessment of this platform is needed; however, my reflections as presented in this blog should contribute to its improved development and implementation.