Conditioning Effects in Panel Studies
A Systematic Review and Meta-Analysis Using the Example of Sensitive Questions
Abstract
Abstract. Panel data are indispensable for investigating causal relationships and answering longitudinal research questions. It is contested, however, what effect the repeated interviewing of panel members has on the quality of panel data. The learning effect expected from repeated participation is known as panel conditioning and can have both positive and negative consequences for the validity of panel data. For sensitive items in particular, effects on the social desirability of the answers given are expected. The available evidence on conditioning effects for sensitive questions suggests that effects differ by question type, and it has so far been synthesized only in narrative reviews. The present meta-analysis draws on the available experimental evidence (154 effect sizes from 19 reports) to examine conditioning effects as a function of question type as well as the number of, and intervals between, interviews (dosage effects). Standardized mean differences between repeat and first-time participants are analyzed using multilevel meta-regressions. Prior interviews show only small effects on response behavior in subsequent waves. On the current evidence, it can therefore be assumed that the quality of panel data is not affected to a relevant degree by conditioning effects. Limitations of the present meta-analysis and relevant research gaps are discussed. A rough English translation of this article is available as Electronic Supplement 1 (ESM 1).
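The analysis described above rests on two quantities: a standardized mean difference between repeat ("trained") and first-time ("fresh") respondents, and a random-effects pooling of those differences. The paper itself fits three-level meta-regressions (in R with metafor, per the reference list); as a simplified illustration of the underlying computation only, the following Python sketch computes Hedges' g and pools effect sizes with a basic DerSimonian-Laird random-effects model. All group means, standard deviations, sample sizes, and sampling variances are invented for illustration.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Hedges' g) between a repeatedly
    interviewed ("trained") group and a first-time ("fresh") group."""
    # pooled standard deviation
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    j = 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample correction factor
    return j * d

def dl_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate (two-level only;
    the paper uses three-level models to handle dependent effect sizes)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

# Hypothetical data: small differences between trained and fresh respondents
g1 = hedges_g(2.1, 2.0, 1.0, 1.0, 200, 200)
g2 = hedges_g(1.9, 2.0, 1.1, 1.0, 150, 150)
pooled, tau2 = dl_pool([g1, g2], [0.01, 0.013])  # variances assumed
```

A pooled estimate near zero, as in this toy example, corresponds to the paper's substantive conclusion that prior interviews have only small effects on later response behavior.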
References
Studies marked with * were included in the meta-analysis.
(2018). An introduction to the Understanding America Study internet panel. Social Security Bulletin, 78 (2), 13 – 26.
(2016). Fitting three-level meta-analytic models in R: A step-by-step tutorial. The Quantitative Methods for Psychology, 12, 154 – 174.
(*2015). Response of sensitive behaviors to frequent measurement. Social Science Research, 49, 1 – 15.
(*2016). Effects of intensive longitudinal data collection on pregnancy and contraceptive use. International Journal of Social Research Methodology, 19, 205 – 222.
(2007). How emotion shapes behavior: Feedback, anticipation, and reflection, rather than direct causation. Personality and Social Psychology Review, 8 (1), 1 – 20.
(2018). What was I thinking? A theoretical framework for analysing panel conditioning in attitudes and (response) behaviour. International Journal of Social Research Methodology, 21, 333 – 345.
(2013). Panel conditioning in difficult attitudinal questions. Public Opinion Quarterly, 77.
(2018). Establishing an open probability-based mixed-mode panel of the general population in Germany: The GESIS Panel. Social Science Computer Review, 36 (1), 103 – 115.
(*1977). Interviewing Changes Attitudes – Sometimes. Public Opinion Quarterly, 41 (1), 56 – 64.
(2014). Ein national gefördertes Onlinelabor als Infrastruktur für die psychologische Forschung. Psychologische Rundschau, 65, 75 – 85.
(2017). Testing the Representativeness of a Multimode Survey in South Korea: Results from KAMOS. Asian Journal for Public Opinion Research, 4 (2), 73 – 87.
(2001). Panel Bias from Attrition and Conditioning. A Case Study of the Knowledge Networks Panel. The American Association for Public Opinion Research (AAPOR), 56th Annual Conference.
(*1973). Problems of Contamination in Panel Surveys: A Brief Report on an Independent Sample, Taiwan, 1970. Studies in Family Planning, 4, 257.
(*2011). Nonparametric Tests of Panel Conditioning and Attrition Bias in Panel Surveys. Sociological Methods & Research, 40 (1), 32 – 56.
(2018). Open probability-based panel infrastructures. In D. L. Vannette & J. A. Krosnick (Eds.), The Palgrave Handbook of Survey Research (pp. 199 – 209). London, UK: Palgrave Macmillan.
(2009). An effect size primer: A guide for clinicians and researchers. Professional Psychology: Research and Practice, 40, 532 – 538.
(2019). Does Repeated Measurement Improve Income Data Quality? Oxford Bulletin of Economics and Statistics.
(*2007). License to Sin: The Liberating Role of Reporting Expectations. Journal of Consumer Research, 34 (1), 22 – 31.
(2008). Should we ask our children about sex, drugs and rock & roll? Potentially harmful effects of asking questions about risky behaviors. Journal of Consumer Psychology, 18 (2), 82 – 95.
(2006). Improving survey questions: Design and evaluation. Thousand Oaks, CA: Sage.
(*2014). Panel conditioning in a longitudinal study of illicit behaviors. Public Opinion Quarterly, 78.
(*2017). Panel Conditioning in the General Social Survey. Sociological Methods & Research, 46 (1), 103 – 124.
(*1974). How Being Interviewed Affects Voting: An Experiment. Public Opinion Quarterly, 37 (3), 398.
(2011). The Effects of Asking Filter Questions in Interleafed Versus Grouped Format. Sociological Methods & Research, 40 (1), 88 – 104.
(1991). Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied cognitive psychology, 5 (3), 213 – 236.
(2009). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Medicine, 6 (7), e1000097.
(*1988). The Hawthorne effect in the measurement of adolescent smoking. Journal of epidemiology and community health, 42, 304 – 306.
(2007). Online access panels and tracking research: the conditioning issue. International Journal of Market Research, 49, 573 – 594.
(2005). Response consistency in young adolescents’ drug use self-reports: a recanting rate analysis. Addiction, 100, 189 – 196.
(1972). Some Effects of „Social Desirability“ in Survey Studies. American Journal of Sociology, 77, 921 – 940.
(*2017). Does involvement in a cohort study improve health and affect health inequalities? A natural experiment. BMC Health Services Research, 17:79.
(2015). Straightlining in Web survey panels over time. Survey Research Methods, 9 (2), 125 – 137.
(2019). Does panel conditioning affect data quality in ego-centered social network questions? Social Networks, 56, 45 – 54.
(1959). Panel Mortality and Panel Bias. Journal of the American Statistical Association, 54 (285), 52 – 68.
(*2017). Rotation group bias in current smoking prevalence estimates using TUS-CPS. Survey Research Methods, 11, 383 – 404.
(*2016). Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments. Social Science Computer Review, 34 (1), 95 – 115.
(2009). Attitudes Over Time: The Psychology of Panel Conditioning. In P. Lynn (Ed.), Methodology of longitudinal surveys (pp. 113 – 126). Chichester, UK: Wiley.
(*2008). Effects of design in web surveys: Comparing trained and fresh respondents. Public Opinion Quarterly, 72, 985 – 1007.
(*2012). Panel conditioning in a longitudinal study of adolescents’ substance use: Evidence from an experiment. Social Forces, 90, 891 – 918.
(Eds.). (2000). The psychology of survey response. New York, NY: Cambridge University Press.
(2013). Three-level meta-analysis of dependent effect sizes. Behavior Research Methods, 45, 576 – 594.
(2010). Conducting Meta-Analyses in R with the metafor Package. Journal of Statistical Software, 36 (3), 1 – 48.
(*2012). Panel Conditioning in Longitudinal Social Science Surveys. Sociological Methods and Research, 41, 491 – 534.
(*1989). Evidence of conditioning effects in the British Social Attitudes Panel Survey. In D. Kasprzyk (Ed.), Panel surveys (pp. 319 – 339). New York, NY: Wiley.
(2020). Open probability-based panels. Wiley StatsRef: Statistics Reference Online.
(*2006). Simply asking questions about health behaviors increases both healthy and unhealthy behaviors. Social Influence, 1 (2), 117 – 127.