Review Article

A Meta-Analytic Re-Appraisal of the Framing Effect

Published Online: https://doi.org/10.1027/2151-2604/a000321

Abstract. We reevaluated and reanalyzed the data of Kühberger’s (1998) meta-analysis on framing effects in risky decision making using p-curve. This method corrects the effect size using the distribution of significant p-values only, thus taking publication bias into account. We found a corrected overall effect size of d = 0.52, which is considerably higher than the effect reported by Kühberger (d = 0.31). As in the original analysis, most moderators proved effective, indicating that there is no single risky-choice framing effect; rather, the effect size varies with different manipulations of the framing task. Taken together, the p-curve analysis shows that there are reliable risky-choice framing effects and no evidence of intense p-hacking. Comparing the corrected estimate to the effect size reported in the Many Labs Replication Project (MLRP) on gain-loss framing (d = 0.60) shows that the two estimates are surprisingly similar in size. Finally, we conducted a new meta-analysis of risk-framing experiments published in 2016 and again found a similar effect size (d = 0.56). Thus, although the adequate explanation of framing effects remains under discussion, there is no doubt about their existence: risky-choice framing effects are highly reliable and robust. No replicability crisis there.
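The logic of the p-curve correction can be made concrete in a few lines of code. The sketch below is a minimal, illustrative Python implementation, not the exact procedure of the p-curve app (Simonsohn, Nelson, & Simmons, 2016): it assumes every study is an equal-n, two-group t-test significant at α = .05 in the predicted direction, and the (t, df) inputs are hypothetical. The key idea is that, under the true effect size, the p-values of significant results conditional on significance are uniformly distributed, so the estimate is the d that brings the observed distribution closest to uniform.

```python
# Minimal sketch of p-curve effect-size estimation, assuming equal-n,
# two-group t-tests significant at alpha = .05 in the predicted direction.
# The study data are hypothetical; this is not the p-curve app's own code.
import numpy as np
from scipy import stats, optimize

def pp_values(d, studies, alpha=0.05):
    """Probability of each observed t-value conditional on significance,
    under a candidate true effect size d (Cohen's d)."""
    pps = []
    for t_obs, df in studies:
        n_per_group = (df + 2) / 2               # df = 2n - 2 for equal-n groups
        ncp = d * np.sqrt(n_per_group / 2)       # noncentrality parameter
        t_crit = stats.t.ppf(1 - alpha / 2, df)  # two-tailed critical value
        power = stats.nct.sf(t_crit, df, ncp)    # P(significant | d)
        pps.append(stats.nct.sf(t_obs, df, ncp) / power)
    return np.array(pps)

def estimate_d(studies):
    """Return the d whose conditional p-value distribution is closest to
    uniform, using the Kolmogorov-Smirnov distance as the loss."""
    loss = lambda d: stats.kstest(pp_values(d, studies), "uniform").statistic
    return optimize.minimize_scalar(loss, bounds=(0.0, 2.0), method="bounded").x

# Hypothetical significant results as (t, df) pairs.
studies = [(2.31, 58), (2.79, 94), (2.10, 38), (3.05, 120), (2.45, 76)]
print(f"p-curve estimate of d: {estimate_d(studies):.2f}")
```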

References

  • Asendorpf, J. B., Conner, M., De Fruyt, F., De Houwer, J., Denissen, J. J. A., Fiedler, K., … Wicherts, J. M. (2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27, 108–119. https://doi.org/10.1002/per.1919

  • Bishop, D. V. M. & Thompson, P. A. (2016). Problems in using p-curve analysis and text-mining to detect rate of p-hacking and evidential value. PeerJ, 4, e1715. https://doi.org/10.7717/peerj.1715

  • Borah, P. (2011). Conceptual issues in framing theory: A systematic examination of a decade’s literature. Journal of Communication, 61, 246–263. https://doi.org/10.1111/j.1460-2466.2011.01539.x

  • Borenstein, M., Hedges, L. V., Higgins, J. P. & Rothstein, H. (2009). Introduction to meta-analysis. Chichester, UK: Wiley.

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). London, UK: Erlbaum.

  • Dickersin, K. (2005). Publication bias: Recognizing the problem, understanding its origins and scope, and preventing harm. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 11–34). Chichester, UK: Wiley.

  • Dickersin, K., Chan, S., Chalmers, T. C., Sacks, H. S. & Smith, H. (1987). Publication bias and clinical trials. Controlled Clinical Trials, 8, 343–353. https://doi.org/10.1016/0197-2456(87)90155-3

  • Druckman, J. N. (2004). Political preference formation: Competition, deliberation, and the (ir)relevance of framing effects. American Political Science Review, 98, 671–686. https://doi.org/10.1017/S0003055404041413

  • Duval, S. & Tweedie, R. (2000). Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56, 455–463. https://doi.org/10.1111/j.0006-341X.2000.00455.x

  • Fanelli, D. (2010). “Positive” results increase down the hierarchy of the sciences. PLoS One, 5(4), e10068. https://doi.org/10.1371/journal.pone.0010068

  • Gallagher, K. M. & Updegraff, J. A. (2012). Health message framing effects on attitudes, intentions, and behavior: A meta-analytic review. Annals of Behavioral Medicine, 43, 101–116. https://doi.org/10.1007/s12160-011-9308-7

  • Gelman, A. & Loken, E. (2014). The statistical crisis in science [online]. American Scientist, 102. Retrieved from http://www.americanscientist.org/issues/feature/2014/6/the-statistical-crisis-in-science

  • Hartgerink, C. H., van Aert, R. C., Nuijten, M. B., Wicherts, J. M. & van Assen, M. A. (2016). Distributions of p-values smaller than .05 in psychology: What is going on? PeerJ, 4, e1935. https://doi.org/10.7717/peerj.1935

  • Head, M. L., Holman, L., Lanfear, R., Kahn, A. T. & Jennions, M. D. (2015). The extent and consequences of p-hacking in science. PLoS Biology, 13, e1002106. https://doi.org/10.1371/journal.pbio.1002106

  • Inzlicht, M., Gervais, W. & Berkman, E. (2015). Bias-correction techniques alone cannot determine whether ego depletion is different from zero: Commentary on Carter, Kofler, Forster, & McCullough, 2015. Social Science Research Network. https://doi.org/10.2139/ssrn.2659409

  • Kahneman, D. & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263–291.

  • Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B. Jr., Bahník, Š., Bernstein, M. J., … Cemalcilar, Z. (2014). Investigating variation in replicability: A “many labs” replication project. Social Psychology, 45, 142–152. https://doi.org/10.1027/1864-9335/a000178

  • Kühberger, A. (1998). The influence of framing on risky decisions: A meta-analysis. Organizational Behavior and Human Decision Processes, 75, 23–55. https://doi.org/10.1006/obhd.1998.2781

  • Kühberger, A. (2017). Framing. In R. Pohl (Ed.), Cognitive illusions (2nd ed., pp. 79–98). New York, NY: Psychology Press.

  • Kühberger, A., Fritz, A. & Scherndl, T. (2014). Publication bias in psychology: A diagnosis based on the correlation between effect size and sample size. PLoS One, 9, e105825. https://doi.org/10.1371/journal.pone.0105825

  • Kühberger, A., Schulte-Mecklenbeck, M. & Perner, J. (1999). The effects of framing, reflection, probability, and payoff on risk preference in choice tasks. Organizational Behavior and Human Decision Processes, 78, 204–231. https://doi.org/10.1006/obhd.1999.2830

  • Levin, I. P., Schneider, S. & Gaeth, G. J. (1998). All frames are not created equal: A typology and critical analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149–188. https://doi.org/10.1006/obhd.1998.2804

  • O’Keefe, D. J. & Jensen, J. D. (2008). Do loss-framed persuasive messages engender greater message processing than do gain-framed messages? A meta-analytic review. Communication Studies, 59, 51–67. https://doi.org/10.1080/10510970701849388

  • O’Keefe, D. J. & Jensen, J. D. (2009). The relative persuasiveness of gain-framed and loss-framed messages for encouraging disease detection behaviors: A meta-analytic review. Journal of Communication, 59, 296–316. https://doi.org/10.1111/j.1460-2466.2009.01417.x

  • Piñon, A. & Gambara, H. (2005). A meta-analytic review of framing effect: Risky, attribute and goal framing. Psicothema, 17, 325–331.

  • Rosenthal, R. & Rubin, D. B. (1982). A simple, general purpose display of magnitude of experimental effect. Journal of Educational Psychology, 74, 166–169.

  • Rothstein, H. R., Sutton, A. J. & Borenstein, M. (2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. Chichester, UK: Wiley. https://doi.org/10.1002/0470870168

  • Schooler, J. (2011). Unpublished results hide the decline effect. Nature, 470, 437. https://doi.org/10.1038/470437a

  • Sedlmeier, P. & Gigerenzer, G. (1989). Do studies of statistical power have an effect on the power of studies? Psychological Bulletin, 105, 309–316. https://doi.org/10.1037/0033-2909.105.2.309

  • Simmons, J. P., Nelson, L. D. & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. https://doi.org/10.1177/0956797611417632

  • Simonsohn, U., Nelson, L. D. & Simmons, J. P. (2014a). p-Curve and effect size: Correcting for publication bias using only significant results. Perspectives on Psychological Science, 9, 666–681. https://doi.org/10.1177/1745691614553988

  • Simonsohn, U., Nelson, L. D. & Simmons, J. P. (2014b). p-Curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143, 534–547. https://doi.org/10.1037/a0033242

  • Simonsohn, U., Nelson, L. D. & Simmons, J. P. (2016). p-Curve app (Version 4.05) [Web application]. Retrieved from http://www.p-curve.com/app4/

  • Simonsohn, U., Simmons, J. P. & Nelson, L. D. (2015). Better p-curves: Making p-curve analysis more robust to errors, fraud, and ambitious p-hacking, a reply to Ulrich and Miller (2015). Journal of Experimental Psychology: General, 144, 1146–1152. https://doi.org/10.1037/xge0000104

  • Stanley, T. D. (2008). Meta-regression methods for detecting and estimating empirical effects in the presence of publication selection. Oxford Bulletin of Economics and Statistics, 70, 103–127. https://doi.org/10.1111/j.1468-0084.2007.00487.x

  • Stanley, T. D. & Doucouliagos, H. (2014). Meta-regression approximations to reduce publication selection bias. Research Synthesis Methods, 5, 60–78. https://doi.org/10.1002/jrsm.1095

  • Tversky, A. & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453–458. https://doi.org/10.1126/science.7455683

  • van Aert, R. C., Wicherts, J. M. & van Assen, M. A. (2016). Conducting meta-analyses based on p values: Reservations and recommendations for applying p-uniform and p-curve. Perspectives on Psychological Science, 11, 713–729. https://doi.org/10.1177/1745691616650874

  • van Assen, M. A., van Aert, R. & Wicherts, J. M. (2015). Meta-analysis using effect size distributions of only statistically significant studies. Psychological Methods, 20, 293–309. https://doi.org/10.1037/met0000025

  • World Health Organization. (2015). WHO statement on public disclosure of clinical trial results. Retrieved from http://www.who.int/ictrp/results/reporting/en/