How do we as researchers know that the data we collect reflects the full unbiased experience of the student, especially when asking sensitive questions with no right or wrong answer? A team at RTI International designed a small pilot to unpack this very question.
This article was co-written by: Julianne Norman, Maitri Punjabi and Lauren Edwards
In recent years, donors and implementers in the international education sector have increasingly looked beyond the teaching of hard skills (e.g., reading and mathematics) and recognized the importance of social and emotional learning (SEL), school-related gender-based violence (SRGBV), and school climate in promoting academic achievement. SEL, SRGBV, and school climate have become common terms in the sector's nomenclature. Yet these school-based influences, SRGBV in particular, are difficult to measure because the survey items are sensitive and rely on self-reporting. As a result, SRGBV prevalence rates are hard to pinpoint, and the reliable data needed to inform policymakers, funders, and implementers simply does not exist.
Why are these experiences so difficult to measure? Victims of SRGBV often do not report their experiences because of fear of re-victimization, cultural norms within their school community, and social taboos. School climates conducive to violence, opaque and feeble reporting structures, and neglectful government policies limit the safe spaces in which children and youth can report their experiences under strict confidentiality and privacy. Yet it is their very voices that we researchers need to hear.
We applaud the mental, physical, and emotional safety precautions that the research community has upheld. Most institutional review boards require that SRGBV data collections include counselors on the research teams, who accompany assessors to the schools and remain on hand to respond immediately to children who re-experience trauma because of the nature of the survey questions. Despite these protections, however, researchers cannot know whether a respondent's answers are genuine. Unlike a diagnostic test, an SRGBV survey has no right or wrong answers, so there is no way to ascertain whether the respondent is divulging the true extent of his or her experiences. Ultimately, without genuine feedback from SRGBV survivors, researchers and practitioners cannot address violence and its root causes.
A team at RTI International sought to circumvent these obstacles and increase respondents' comfort by employing a data collection method that allowed for nonidentifiable information, safeguarding of data, and full privacy during the survey. Such a method, Audio Computer-Assisted Self-Interview (ACASI), was already in use within the HIV/AIDS research community and allows a survey to be administered without an assessor. We found extensive literature on the use of ACASI for surveys of a sensitive nature, such as those on sexual behavior and drug use. Moreover, the research highlighted an important asset of ACASI's privacy component: its potential to reduce social-desirability bias, the tendency to answer a question based on how you think your answers will be perceived by others rather than honestly, based on your true experience. With greater privacy, the likelihood of social-desirability bias diminishes, and respondents are more likely to respond accurately rather than in the way they perceive to be more desirable.
In contrast to the conventional face-to-face interview method, ACASI “removes” the data collector from the interview and presents survey questions in an audio format, by which the respondent hears the questions through earphones. He or she responds by selecting a choice presented on the computer device without interacting with a data collector.
However, the RTI team, working predominantly with primary school-aged children, noticed a gap in the ACASI literature: no studies had used ACASI to administer surveys to pre-adolescents or younger children, and none had used it to ask young children about their experiences of violence in school, an area in which the team was already working. Moreover, we suspected that face-to-face surveys asking young children to divulge their experiences of violence in school would be especially prone to social-desirability bias, so we designed a study to pilot the use of ACASI.
Would ACASI be a feasible method for administering surveys about experiences of violence to children as young as 11 years old? Does ACASI have the potential to reduce social-desirability bias in SRGBV surveys among the 11-14 age group? We anticipated that ACASI would be a more effective method of SRGBV data collection than the face-to-face method for this age group, and that it might even help correct the "misinformation" (inaccurate reporting) that skews perceptions of students' experience of violence and results in misleading data. We built the survey in RTI's Tangerine™ data collection system, using new features designed for this purpose, such as embedding audio and video in the survey question format.
For this pilot, the team used a section of the Survey of Student Experiences of SRGBV that the United States Agency for International Development (USAID)-funded Literacy Achievement and Retention Activity's Longitudinal Study administered in a selection of primary schools in Uganda. The Longitudinal Study had used this instrument in its baseline the prior year to gauge experiences of SRGBV in its schools of implementation. Because the Experiences of Violence survey was of a sensitive nature and therefore more susceptible to social-desirability bias, it was selected as the pilot instrument for ACASI. The pilot used a convenience sample of nine peri-urban schools outside Kampala. The team chose Primary 5 (P5) and Primary 7 (P7) students to participate, estimating that this age group (11-14 years old) would best be able to manipulate a tablet and navigate ACASI. Twelve students were randomly selected from each of the nine schools (six P5 pupils and six P7 pupils) and assigned to complete the Student Experiences of SRGBV Survey through either ACASI or the traditional face-to-face method, allowing us to compare the two.
Qualitative and Quantitative Data
Reception of ACASI was very positive among data collectors, counselors, and students. The data collectors particularly appreciated its efficiency compared to the face-to-face method: they only needed to introduce the survey and explain the tablet functions before moving on to the next student, so more students could be assessed at one time. For example, using all the tablets the research team brought to each school, four students could complete the ACASI interview simultaneously, while only one student at a time could be assessed face-to-face. This number could increase with the number of assessors and tablets available at each school.
Counselors were also enthusiastic about the platform's confidentiality and responsiveness to sensitive questions. First, a student had to disclose his or her personal situation to only one person instead of two (the counselor alone, rather than both an assessor and a counselor). Second, the platform was programmed to confidentially "flag" responses implying that a student was in immediate danger: if a student answered affirmatively to any of the nine items pertaining to sexual violence, the tablet would state, at the end of the instrument, that an item had been flagged. The assessor would see this notification when collecting the tablet once the student was finished. To safeguard the student's privacy, the tablet did not list which question was flagged, only that one or more had been. The assessor then knew to ask a counselor to speak further with the student.
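The flagging behavior described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not Tangerine's actual implementation: the item IDs, answer strings, and function name are all invented for the example.

```python
# Hypothetical sketch of the confidential flagging logic: the tablet reports
# only that a counselor is needed, never which item triggered the flag.
# Item IDs and answer strings are illustrative, not Tangerine's actual API.

SEXUAL_VIOLENCE_ITEMS = {f"q{n}" for n in range(25, 34)}  # nine illustrative item IDs
AFFIRMATIVE = {"a few times", "many times"}

def needs_counselor(responses):
    """Return True if any sexual-violence item was answered affirmatively.

    Deliberately collapses the nine items into a single yes/no flag, so the
    assessor learns a counselor is needed without seeing the details.
    """
    return any(responses.get(item, "never") in AFFIRMATIVE
               for item in SEXUAL_VIOLENCE_ITEMS)

# One affirmative answer among the nine items raises the flag
print(needs_counselor({"q25": "never", "q27": "a few times"}))  # True
```

Collapsing the nine items into a single boolean is the key design choice: it is what lets the assessor act on the flag while the specifics stay between the student and the counselor.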
Students were also largely in favor of the ACASI method. Through informal cognitive interviewing, multiple students independently cited ACASI's "repeat" function as their reason for favoring the tool; these students said they often felt intimidated asking a data collector to repeat a question, whereas in ACASI they could simply press the replay button to hear it again. Other students stated directly that they would not have felt comfortable speaking to a data collector about their experiences and preferred the privacy ACASI offered. Only a small proportion of students said they would have preferred talking to a data collector, citing the face-to-face relationship and rapport a data collector provides. All students reported that, after the data collector provided additional explanation, they felt comfortable using the tablets.
In addition to the qualitative data, we also analyzed the collected quantitative data. Descriptive findings highlight certain response trends between the two methods. Students using ACASI reported, on average, experiencing two more acts of SRGBV than students who participated in the face-to-face method. This remained true after controlling for demographic characteristics, including sex, age, and socioeconomic status. Furthermore, of the 35 total questions surveyed, five had nearly identical response distributions between ACASI and face-to-face, and 23 questions had greater variability in the ACASI distributions than the face-to-face distributions; that is, students using ACASI were either providing the same responses as the face-to-face students, or were providing a wider range of responses.
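The notion of "greater variability" used above can be made concrete with a simple spread measure: the share of responses falling outside the most common answer. The sketch below uses made-up counts, not the study's data, purely to illustrate the comparison.

```python
# Illustrative comparison of response-distribution spread between methods.
# The counts are invented for the example, not taken from the study.
from collections import Counter

def spread(counts):
    """Share of responses outside the modal category: 0 means every student
    gave the same answer; larger values mean a wider range of answers."""
    total = sum(counts.values())
    return 1 - max(counts.values()) / total

# Hypothetical distributions for one survey item
ftf = Counter({"never": 30, "a few times": 2, "many times": 0})
acasi = Counter({"never": 22, "a few times": 6, "many times": 4})

print(f"FTF spread:   {spread(ftf):.2f}")    # 0.06
print(f"ACASI spread: {spread(acasi):.2f}")  # 0.31
```

A pattern like this, where the ACASI distribution is less concentrated on "never", is the kind of difference the descriptive analysis observed on 23 of the 35 questions.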
For example, Figure 1 displays the response distribution for question 25, “During this past school year, how many times did anyone try to get you to do something sexual with them, other than kissing, but you didn’t do it?” Response distributions for the face-to-face (FTF) interviews exhibit less variability across the response choices than for the ACASI method: more students in the face-to-face method reported that they "never" experienced this act of sexual violence and no students in the face-to-face method reported that they experienced this act of violence "many times".
Additionally, Figure 2 displays the frequency of responses to question 6, “During this past school year, how many times did anyone physically hurt you on purpose by pushing you down, kicking you, or hitting you with a hand, clenched fist, object, or a weapon?” Although only 6% of the students using the face-to-face method reported this happening to them "many times", 23% of the students using ACASI reported this happening "many times". In response to the survey item that stated, “During this past school year, how many times did anyone offer to give you good marks if you did something sexual?”, 100% of students using the FTF methodology reported that this had never occurred to them. However, only 83% of students using the ACASI methodology reported this never occurred, with 4% reporting that this had occurred a few times during the past school year.
At the end of the survey’s sexual harassment and violence section, the students were asked follow-up questions about the nature of the offense, including if and how often it was a teacher who perpetrated any of the offenses. Of the face-to-face students, 97% said a teacher had "never" committed any of the offenses, while 3% said the teacher committed the offenses "a few times". Of the ACASI students, 81% reported it was "never" a teacher, 5% reported it was the teacher "a few times", and 14% responded “No response/I don’t know.” The differences in variation suggest that students may respond differently based on survey administration method. These findings require additional testing for confirmation.
This feasibility study provides strong evidence that the ACASI methodology can be used to collect data from students, even those as young as 11. After a brief introduction to the tablet and process from assessors, students were able to navigate the survey on their own. Furthermore, the qualitative findings suggest that, overall, the P5 and P7 students in this study preferred the ACASI methodology over the face-to-face method, and the descriptive analysis suggests that students respond differently when using ACASI versus face-to-face. This preliminary pilot was designed only to determine (1) whether ACASI is feasible to implement with the 11-14 age group (i.e., whether this age group can effectively use the ACASI interface) and (2) whether ACASI yields data different enough to warrant a deeper dive into the methodology. Given these results, more extensive research comparing the face-to-face and ACASI methods for assessing SRGBV and related areas is warranted to provide more conclusive data.
Interested in learning more? The research team expanded this study in July 2019 with a semi-randomized sample of 410 students across 40 schools. A brief of the study, Audio Computer-Assisted Self-Interview: Surveys of a sensitive nature require a sensitive method of data collection, is now available, and a published research paper is forthcoming. To learn more about Tangerine, contact the helpdesk.