I am Nicki Dell, a PhD student at the University of Washington in Seattle. I’d like to tell you about a project that I did recently during an internship at Microsoft Research India with the Technology for Emerging Markets lab.
Those of us working in ICTD frequently work with groups of people who differ significantly from ourselves. However, little attention has been paid to the effects these differences have on the evaluation of technological systems. In this project, we investigated participant response bias in field studies in developing countries. Through 450 interviews in Bangalore, India, we measured response bias due to interviewer demand characteristics and examined the role of social and demographic factors in influencing that bias.
We found that respondents were about 2.5x more likely to prefer a technological artifact they believed to be developed by the interviewer, even when the alternative was identical. When the interviewer was a foreign researcher who required a translator, the bias toward the interviewer's artifact increased to 5x. In fact, the interviewer's artifact was preferred even when it had been degraded to be obviously inferior to the alternative.
In light of these findings, we recommend that researchers and practitioners pay more attention to the types of response bias that can result from working with any participant population and actively take steps to minimize that bias. For example:
- Dissociate yourself as much as possible from any particular design or solution. If participants are aware of your personal stake in the outcome of the study, the results are more likely to be affected by demand characteristics.
- Avoid collecting and reporting subjective information from participants as a primary method of evaluation. We found that even when participant comments are detailed and convincing, they do not necessarily reflect the merit of the solutions at hand.
- As far as possible, focus participant interviews and feedback on obtaining factual, rather than subjective, information. Triangulating the data collected can further increase confidence in the results of the study.
- Minimize the differences between the interviewer and the participants; this can help reduce the response bias resulting from interviewer demand characteristics.
- Take time to understand the complications and errors that can arise when researchers work in communities vastly different from their own.
Finally, our work focused on the ways in which social and demographic factors may affect participant response bias. The study was conducted within a particular culture and city, with two specific participant populations. The effects are likely to vary across cultures and populations, and with factors such as age, gender, and ethnicity. Further research is required to understand how these individual factors might separately affect participant responses.