Document Type
Article
Publication Date
2025
DOI
10.1111/acer.70024
Publication Title
Alcohol, Clinical and Experimental Research
Volume
Article in Press
Pages
11 pp.
Abstract
Background: The accuracy of survey responses is a concern for research data quality, especially in college student samples. However, examining the impact of removing participants who respond inaccurately or carelessly from analyses is warranted given the potential loss of information or sample diversity. This study aimed to examine whether careless responding varies across demographic indices, substance use behaviors, and the timing of survey completion.
Method: College students (N = 5809; 70.7% female; 75.7% White, non-Hispanic) enrolled in psychology classes at six universities completed an online survey assessing a variety of demographic characteristics and substance use-related behaviors; the hour-long survey included four attention check questions dispersed throughout. Differences in careless responding were assessed across multiple demographic groups, and we examined the impact of careless responding on data quality via a confirmatory factor analysis of a validated substance use measure, the Drinking Motives Questionnaire-Revised Short Form.
Results: Careless responding varied significantly by participant race, sex, gender, sexual orientation, and socioeconomic status. Substance use was generally unassociated with careless responding, though careless responding was associated with experiencing more alcohol-related problems. Careless responding was more prevalent when the survey was completed near the end of the semester. Finally, the factor structure of the drinking motives measure was affected by the inclusion of those who failed two or more attention check questions.
Conclusions: Including attention checks in surveys is an effective method to detect and address careless responding. However, omitting from analyses all participants who evidence any careless responding may bias the sample demographics. We discuss recommendations for the use of attention check questions in undergraduate substance use cross-sectional surveys, including retaining participants who fail only one attention check, as this has a minimal impact on data quality while preserving sample diversity.
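As an illustration of the exclusion rule described in the conclusions (retaining participants who fail at most one of the four attention checks), the following is a minimal Python/pandas sketch. The column names, participant IDs, and pass/fail coding are illustrative assumptions, not the authors' actual code or data.

import pandas as pd

# Hypothetical survey export: one row per participant, one column per
# attention check, coded 1 = passed, 0 = failed.
df = pd.DataFrame({
    "participant_id": [101, 102, 103, 104],
    "attn_check_1": [1, 1, 0, 0],
    "attn_check_2": [1, 1, 1, 0],
    "attn_check_3": [1, 0, 1, 0],
    "attn_check_4": [1, 1, 1, 1],
})

attn_cols = ["attn_check_1", "attn_check_2", "attn_check_3", "attn_check_4"]

# Count failed attention checks per participant.
df["n_failed"] = (df[attn_cols] == 0).sum(axis=1)

# Retain participants who failed no more than one attention check;
# flag those who failed two or more for exclusion or sensitivity analyses.
retained = df[df["n_failed"] <= 1]
excluded = df[df["n_failed"] >= 2]

print(retained[["participant_id", "n_failed"]])
print(excluded[["participant_id", "n_failed"]])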
Rights
© 2025 The Authors.
This is an open access article under the terms of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.
Data Availability
Article states: "The data that support the findings of this study are available from the corresponding author upon reasonable request."
Original Publication Citation
Braitman, A. L., Petrey, A. M., Shipley, J. L., Ayala Guzman, R., Renzoni, E., Looby, A., & Bravo, A. J. (2025). Check your data before you wreck your model: The impact of careless responding on substance use data quality. Alcohol, Clinical and Experimental Research. Advance online publication. https://doi.org/10.1111/acer.70024
ORCID
0000-0003-2259-1094 (Braitman), 0000-0002-5203-4486 (Shipley), 0000-0001-6433-7313 (Ayala Guzman)
Repository Citation
Braitman, Abby L.; Petrey, Anna M.; Shipley, Jennifer L.; Ayala Guzman, Rachel; Renzoni, Emily; Looby, Alison; and Bravo, Adrian J., "Check Your Data Before You Wreck Your Model: The Impact of Careless Responding on Substance Use Data Quality" (2025). Psychology Faculty Publications. 219.
https://digitalcommons.odu.edu/psychology_fac_pubs/219
Included in
Data Science Commons, Design of Experiments and Sample Surveys Commons, Substance Abuse and Addiction Commons
Comments
This project was completed by the Stimulant Norms and Prevalence 2 (SNAP2) Study Team, which includes the following investigators (in alphabetical order): Adrian J. Bravo, William & Mary; Bradley T. Conner, Colorado State University; Laura J. Holt, Trinity College; Alison Looby, University of Wyoming (PI); Mark A. Prince, Colorado State University; Ty S. Schepis, Texas State University; Ellen W. Yeung, George Washington University.