Original Article

Web-Based Versus Paper-Pencil Assessment of Behavioral Problems Using the Youth Self-Report

Published Online: https://doi.org/10.1027/1015-5759/a000585

Abstract. This is the first study to investigate a web-based version of the Youth Self-Report (YSR) in comparison with paper-and-pencil (PP) assessment. Students aged 10–18 years were recruited from Austrian schools and completed either a PP (N = 841) or a web-based (N = 2,769) version. Psychometric properties and indicators of data quality were analyzed, and cost estimates for the web-based and PP assessments were provided. Acceptable model fits (RMSEA < .05) of the 8-syndrome model were observed for both versions, and measurement invariance testing revealed strong invariance between the two formats. While there was no significant difference in the number of missing items, a slightly higher proportion of web-based datasets were analyzable (97.7% vs. 93.7%). More information was provided in the open-ended questions in the web-based version (p < .001). Mean problem scores were equivalent, with the exception of the thought problems syndrome scale, on which a slightly higher score was observed in the web-based version (p < .001). In our study, the web-based format reduced costs by 72% compared with the PP format. Our findings suggest that large web-based epidemiological surveys are cost-efficient, can be applied without risk of disadvantages compared with PP assessments, and may be superior with regard to some aspects of data quality.
