Addressing Publication Bias in Meta-Analysis
Empirical Findings From Community-Augmented Meta-Analyses of Infant Language Development
Abstract
Meta-analyses are an indispensable research synthesis tool for characterizing bodies of literature and advancing theories. One important open question concerns the inclusion of unpublished data in meta-analyses. Finding such studies can be effortful, but excluding them potentially leads to consequential biases, such as overestimation of a literature’s mean effect. We address two questions about unpublished data using MetaLab, a collection of community-augmented meta-analyses focused on developmental psychology. First, we assess to what extent MetaLab datasets include gray literature and by what search strategies it is unearthed. We find that an average of 11% of datapoints come from unpublished literature; standard search strategies like database searches, complemented with individualized approaches like including authors’ own data, contribute the majority of this literature. Second, we analyze the effect of including versus excluding unpublished literature on estimates of effect size and publication bias, and find that this decision does not affect outcomes. We discuss lessons learned and implications.
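The comparison described above, estimating a literature's mean effect with and without unpublished datapoints, can be sketched with a standard random-effects model. The snippet below is a minimal illustration, not MetaLab's actual pipeline (which uses R's metafor package); the effect sizes, variances, and publication flags are invented toy data, and the pooled estimate uses the DerSimonian-Laird method.

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooled effect via the DerSimonian-Laird estimator."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)             # fixed-effect mean
    q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se

# Hypothetical data: effect sizes (e.g. Hedges' g), sampling variances,
# and a flag marking which datapoints come from published studies.
g         = np.array([0.35, 0.42, 0.28, 0.50, 0.10, 0.15])
var_g     = np.array([0.02, 0.03, 0.02, 0.04, 0.05, 0.06])
published = np.array([True, True, True, True, False, False])

mu_all, se_all = dersimonian_laird(g, var_g)
mu_pub, se_pub = dersimonian_laird(g[published], var_g[published])
print(f"all studies:    g = {mu_all:.3f} (SE {se_all:.3f})")
print(f"published only: g = {mu_pub:.3f} (SE {se_pub:.3f})")
```

In this toy dataset the unpublished effects are smaller, so the published-only estimate is inflated; the paper's empirical finding is that, across the MetaLab datasets, the two estimates do not meaningfully differ.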