16 February 2024 | Nicola B. Campoamor¹ · Christi J. Guerrini² · Whitney Bash Brooks² · John F. P. Bridges¹ · Norah L. Crossnohere³
This paper provides a practical guide for researchers conducting pretesting of discrete-choice experiments (DCEs) to improve the quality and relevance of preference elicitation. Pretesting is a critical stage in DCE design that involves engaging with target population representatives to enhance the readability, presentation, and structure of the preference instrument. The goal of pretesting is to improve the validity, reliability, and relevance of the survey while reducing bias, burden, and error in preference elicitation, data collection, and interpretation. Despite its importance, pretesting lacks documented best practices or clear examples.
The paper defines pretesting as a flexible process in which representatives of the target population are engaged to improve the survey instrument. It presents a practical guide and a pretesting interview discussion template to help researchers conduct a rigorous pretest. The guide is organized into four domains: content, presentation, comprehension, and elicitation, which together help researchers systematically examine each aspect of the DCE during pretesting. The paper also provides an illustrative example of how these resources informed the design of a complex DCE eliciting trade-offs between personal privacy and societal benefit in the context of investigative genetic genealogy (IGG).
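As a rough illustration of how the four domains can be operationalized, the sketch below shows one way a pretesting interview guide might be organized in code. The domain names come from the paper, but the probe questions and the `Domain` structure are hypothetical examples for illustration, not the authors' actual discussion template.

```python
from dataclasses import dataclass, field

@dataclass
class Domain:
    """One of the four pretesting domains described in the guide."""
    name: str
    probes: list[str] = field(default_factory=list)

# Hypothetical probe questions for each domain; the domain names follow the
# paper, but the probes themselves are illustrative, not the published template.
pretest_guide = [
    Domain("content", [
        "Are any important attributes of the decision missing?",
        "Do any attributes or levels seem irrelevant or confusing?",
    ]),
    Domain("presentation", [
        "Is the layout of the choice task easy to follow?",
        "Do the labels, tables, or colors help or hinder you?",
    ]),
    Domain("comprehension", [
        "In your own words, what is this question asking you to do?",
        "Were any terms or instructions unclear?",
    ]),
    Domain("elicitation", [
        "How did you decide between the two options?",
        "Did you feel able to make a real trade-off, or did one option dominate?",
    ]),
]

# Print the guide as an interviewer-facing checklist.
for domain in pretest_guide:
    print(domain.name.upper())
    for probe in domain.probes:
        print("  -", probe)
```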
Pretesting engages members of the target population to review the draft instrument and provide feedback on it. It can be used to evaluate and improve the content, format, and structure of the survey, and it can also reduce survey burden, improve clarity, identify potential ethical issues, and mitigate sources of bias. The paper describes various methods used in pretesting, including cognitive interviewing, debriefing, and behavioral coding, and discusses the importance of pretesting in the context of DCEs, emphasizing the need for transparency and good practices in the design process.
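Behavioral coding, for example, typically involves tagging observable problems (hesitation, requests for clarification, misreadings) as respondents work through draft items and then tallying those codes by item. The sketch below is a minimal, hypothetical illustration of that tallying step; the code labels and item identifiers are invented for illustration and are not drawn from the paper.

```python
from collections import Counter, defaultdict

# Hypothetical behavioral codes logged while observing pretest participants
# work through draft survey items: (item_id, code) pairs.
observations = [
    ("attribute_table", "request_clarification"),
    ("attribute_table", "hesitation"),
    ("choice_task_1", "misread_instructions"),
    ("choice_task_1", "request_clarification"),
    ("choice_task_1", "request_clarification"),
    ("demographics", "none"),
]

# Tally problem codes per item to flag where the draft instrument needs revision.
tallies: dict[str, Counter] = defaultdict(Counter)
for item_id, code in observations:
    if code != "none":
        tallies[item_id][code] += 1

for item_id, counts in tallies.items():
    print(item_id, dict(counts))
```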
The paper applies the pretesting guide to a complex DCE on public preferences regarding the use of IGG. Pretesting involved recruiting participants from the target population, conducting interviews, and modifying the instrument based on their feedback. The results showed that pretesting identified areas for improvement, such as the clarity of attribute descriptions and the presentation of choice tasks, and helped ensure that the survey was easy to understand and that participants could make informed trade-offs.
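To make the trade-off concrete, a choice task in this kind of DCE presents alternatives that differ across attribute levels. The sketch below is a minimal, hypothetical representation of such a task; the attributes and levels shown (crimes investigated, whose data can be searched, chance the case is solved) are stand-ins for illustration and are not the attributes used in the authors' IGG survey.

```python
from dataclasses import dataclass

@dataclass
class Alternative:
    """One hypothetical IGG policy profile within a choice task."""
    label: str
    levels: dict[str, str]  # attribute name -> level shown to the respondent

# Illustrative attributes and levels only; not the study's actual design.
choice_task = [
    Alternative("Policy A", {
        "Crimes investigated": "Violent crimes only",
        "Whose data can be searched": "Users who opt in",
        "Chance the case is solved": "Moderate",
    }),
    Alternative("Policy B", {
        "Crimes investigated": "All crimes",
        "Whose data can be searched": "All database users",
        "Chance the case is solved": "High",
    }),
]

# A pretest interviewee would see these profiles side by side, choose the
# policy they prefer, and then be probed about how they weighed privacy
# against societal benefit.
for alt in choice_task:
    print(alt.label)
    for attribute, level in alt.levels.items():
        print(f"  {attribute}: {level}")
```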
The paper concludes that pretesting is an essential but often under-described stage in the DCE design process. It offers practical guidance for conducting comprehensive and relevant pretests of DCEs and operationalizes that guidance in a pretesting interview discussion template. These resources can support future work to develop good practices for pretesting, which may ultimately yield higher quality preference research with greater value to decision makers.