What They See Is What We Get
- 1 February 2004
- journal article
- Published by SAGE Publications in Social Science Computer Review
- Vol. 22 (1), 111-127
- https://doi.org/10.1177/0894439303256555
Abstract
Several alternative response formats are available to the web survey designer, but the choice of format is often made with little consideration of measurement error. The authors experimentally explore three common response formats used in web surveys: a series of radio buttons, a drop box with none of the options initially displayed until the respondent clicks on the box, and a scrollable drop box with some of the options initially visible, requiring the respondent to scroll to see the remainder of the options. The authors reversed the order of the response options for half the sample. The authors find evidence of response order effects but stronger evidence that visible response options are endorsed more frequently, suggesting that visibility may be a more powerful effect than primacy in web surveys. The results suggest that the response format used in web surveys does affect the choices made by respondents.