The role of survey training materials in stated-preference studies
Vass, C. M., Davison, N. J., Vander Stichele, G., & Payne, K. (2019). A picture is worth a thousand words: The role of survey training materials in stated-preference studies. The Patient. Advance online publication. https://doi.org/10.1007/s40271-019-00391-w
BACKGROUND: Online survey-based methods are increasingly used to elicit preferences for healthcare. This digitization creates an opportunity for interactive survey elements, potentially improving respondents' understanding and/or engagement.
OBJECTIVE: Our objective was to understand whether, and how, training materials in a survey influenced stated preferences.
METHODS: An online discrete-choice experiment (DCE) was designed to elicit public preferences for a new targeted approach to prescribing biologics ("biologic calculator") for rheumatoid arthritis (RA) compared with conventional prescribing. The DCE presented three alternatives, two biologic calculators and a conventional approach (opt-out), described by five attributes: delay to treatment, positive predictive value, negative predictive value, infection risk, and cost saving to the National Health Service. Respondents were randomized to receive training materials as either plain text or an animated storyline. The training materials contained information about RA and approaches to its treatment and described the biologic calculator. Background questions covered sociodemographics and self-reported measures of task difficulty and attribute non-attendance. DCE data were analyzed using conditional logit and heteroskedastic conditional logit (HCL) models.
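For readers unfamiliar with the scale term referred to in the results, a minimal sketch of the standard heteroskedastic conditional logit specification (not reproduced from the paper; the group index g and scale parameter \lambda_g are notational assumptions) is: the choice probability for respondent n choosing alternative j in choice set C_t is

P_{njt} = \frac{\exp(\lambda_g \, \beta' x_{njt})}{\sum_{k \in C_t} \exp(\lambda_g \, \beta' x_{nkt})},

where x_{njt} are the attribute levels, \beta are the preference coefficients (assumed common across groups), and \lambda_g is the scale parameter for training-material group g, typically normalized to 1 for one group. Because scale is inversely related to the error variance, a larger estimated \lambda_g indicates more consistent (less random) choices in that group.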
RESULTS: In total, 300 respondents completed the DCE, receiving either the plain text (n = 158) or the animated storyline (n = 142). In the HCL model, the estimated coefficients for all attributes aligned with a priori expectations and were statistically significant. The scale term was also statistically significant, indicating that respondents who received the plain-text materials made less consistent (more random) choices. Further tests suggested preferences were homogeneous across the two groups after accounting for differences in scale.
CONCLUSIONS: Animated training materials did not change respondents' preferences, but they did appear to improve choice consistency, potentially allowing researchers to include more complex designs with more attributes, levels, alternatives, or choice sets.