Web-Based Versus Paper-Pencil Assessment of Behavioral Problems Using the Youth Self-Report
Abstract
This is the first study to compare a web-based version of the Youth Self-Report (YSR) with paper-pencil (PP) assessment. Students aged 10–18 years were recruited from Austrian schools and completed either a PP (N = 841) or a web-based (N = 2,769) version. Psychometric properties and indicators of data quality were analyzed, and cost estimates for both assessment formats were provided. The 8-syndrome model showed acceptable fit (RMSEA < .05) for both versions, and measurement invariance testing revealed strong invariance between the formats. While the number of missing items did not differ significantly, a slightly higher proportion of web-based datasets were analyzable (97.7% vs. 93.7%), and respondents provided more information in the open-ended questions of the web-based version (p < .001). Mean problem scores were equivalent, except for the Thought Problems syndrome scale, on which a slightly higher score was observed in the web-based version (p < .001). In our study, the web-based format reduced costs by 72% compared to the PP format. Our findings suggest that large web-based epidemiological surveys are cost-efficient, can be applied without disadvantages relative to PP assessment, and may be superior on some aspects of data quality.