A Meta-Analytic Re-Appraisal of the Framing Effect
Abstract
We reevaluated and reanalyzed the data of Kühberger’s (1998) meta-analysis on framing effects in risky decision making using p-curve. This method uses the distribution of statistically significant p-values only to correct the effect size, thereby taking publication bias into account. We found a corrected overall effect size of d = 0.52, considerably higher than the effect reported by Kühberger (d = 0.31). As in the original analysis, most moderators proved effective, indicating that there is no single risky-choice framing effect; rather, the effect size varies with different manipulations of the framing task. Taken together, the p-curve analysis shows that there are reliable risky-choice framing effects and no evidence of intense p-hacking. Comparing the corrected estimate with the effect size reported in the Many Labs Replication Project (MLRP) on gain–loss framing (d = 0.60) shows that the two estimates are surprisingly similar in size. Finally, we conducted a new meta-analysis of risk-framing experiments published in 2016 and again found a similar effect size (d = 0.56). Thus, although the adequate explanation for framing effects is still debated, there is no doubt about their existence: risky-choice framing effects are highly reliable and robust. No replicability crisis there.
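The core idea of p-curve effect-size estimation (Simonsohn et al., 2014a) can be illustrated with a minimal sketch: for a candidate effect size d, each significant test result is converted into a "pp-value" (its p-value conditional on having reached significance); under the true d these pp-values are uniformly distributed, so the estimate is the d that makes the observed pp-values look most uniform. The sketch below assumes equal-n two-sample t-tests and one-directional significant results; the function name and the Kolmogorov–Smirnov loss are our simplifying choices, not the authors' exact implementation.

```python
import numpy as np
from scipy import optimize, stats

def pcurve_estimate(t_vals, n_per_group, alpha=0.05):
    """Estimate the true effect size d from significant t-values only.

    Assumes two-sample t-tests with n_per_group per cell, all significant
    at the two-tailed alpha level and in the predicted direction.
    """
    t_vals = np.asarray(t_vals, dtype=float)
    df = 2 * n_per_group - 2
    t_crit = stats.t.ppf(1 - alpha / 2, df)  # two-tailed significance cutoff

    def ks_stat(d):
        # Noncentrality parameter implied by effect size d
        ncp = d * np.sqrt(n_per_group / 2)
        tail = stats.nct.sf(t_vals, df, ncp)       # P(T > t_obs | d)
        tail_crit = stats.nct.sf(t_crit, df, ncp)  # P(T > t_crit | d)
        pp = tail / tail_crit                      # p-value given significance
        # Distance of pp-values from the uniform distribution
        return stats.kstest(pp, "uniform").statistic

    # The d whose conditional p-value distribution is closest to uniform
    res = optimize.minimize_scalar(ks_stat, bounds=(0.0, 3.0), method="bounded")
    return res.x
```

Because only the significant results enter the estimate, the procedure is unaffected by how many nonsignificant studies remain in the file drawer, which is what makes it a publication-bias correction.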
References
2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27, 108–119. https://doi.org/10.1002/per.1919
(2016). Problems in using p-curve analysis and text-mining to detect rate of p-hacking and evidential value. PeerJ, 4, e1715. https://doi.org/10.7717/peerj.1715
(2011). Conceptual issues in framing theory: A systematic examination of a decade’s literature. Journal of Communication, 61, 246–263. https://doi.org/10.1111/j.1460-2466.2011.01539.x
(2009). Introduction to meta-analysis. Chichester, UK: Wiley.
(1988). Statistical power analysis for the behavioral sciences (2nd ed.). London, UK: Erlbaum.
(2005). Publication bias: Recognizing the problem, understanding its origins and scope, and preventing harm. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 11–34). Chichester, UK: Wiley.
(1987). Publication bias and clinical trials. Controlled Clinical Trials, 8, 343–353. https://doi.org/10.1016/0197-2456(87)90155-3
(2004). Political preference formation: Competition, deliberation, and the (ir)relevance of framing effects. American Political Science Review, 98, 671–686. https://doi.org/10.1017/S0003055404041413
(2000). Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56, 455–463. https://doi.org/10.1111/j.0006-341X.2000.00455.x
(2010). ‘‘Positive’’ results increase down the hierarchy of the sciences. PLoS One, 5(4), e10068. https://doi.org/10.1371/journal.pone.0010068
(2012). Health message framing effects on attitudes, intentions, and behavior: A meta-analytic review. Annals of Behavioral Medicine, 43, 101–116. https://doi.org/10.1007/s12160-011-9308-7
(2014). The statistical crisis in science [online]. American Scientist, 102. Retrieved from http://www.americanscientist.org/issues/feature/2014/6/the-statistical-crisis-in-science
(2016). Distributions of p-values smaller than .05 in psychology: What is going on? PeerJ, 4, e1935. https://doi.org/10.7717/peerj.1935
(2015). The extent and consequences of p-hacking in science. PLoS Biology, 13, e1002106. https://doi.org/10.1371/journal.pbio.1002106
(2015). Bias-correction techniques alone cannot determine whether ego depletion is different from zero: Commentary on Carter, Kofler, Forster, & McCullough, 2015. Social Sciences Research Network. https://doi.org/10.2139/ssrn.2659409
(1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263–291.
(2014). Investigating variation in replicability. Social Psychology, 45, 142–152. https://doi.org/10.1027/1864-9335/a000178
(1998). The influence of framing on risky decisions: A meta-analysis. Organizational Behavior and Human Decision Processes, 75, 23–55. https://doi.org/10.1006/obhd.1998.2781
(2017). Framing. In R. Pohl (Ed.), Cognitive illusions (2nd ed., pp. 79–98). New York, NY: Psychology Press.
(2014). Publication bias in psychology: A diagnosis based on the correlation between effect size and sample size. PLoS One, 9, e105825. https://doi.org/10.1371/journal.pone.0105825
(1999). The effects of framing, reflection, probability, and payoff on risk preference in choice tasks. Organizational Behavior and Human Decision Processes, 78, 204–231. https://doi.org/10.1006/obhd.1999.2830
(1998). All frames are not created equal: A typology and critical analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149–188. https://doi.org/10.1006/obhd.1998.2804
(2008). Do loss-framed persuasive messages engender greater message processing than do gain-framed messages? A meta-analytic review. Communication Studies, 59, 51–67. https://doi.org/10.1080/10510970701849388
(2009). The relative persuasiveness of gain-framed and loss-framed messages for encouraging disease detection behaviors: A meta-analytic review. Journal of Communication, 59, 296–316. https://doi.org/10.1111/j.1460-2466.2009.01417.x
(2005). A meta-analytic review of framing effect: Risky, attribute and goal framing. Psicothema, 17, 325–331.
(1982). A simple, general purpose display of magnitude of experimental effect. Journal of Educational Psychology, 74, 166–169.
(2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. Hoboken, NJ: Wiley. https://doi.org/10.1002/0470870168
(2011). Unpublished results hide the decline effect. Nature, 470, 437. https://doi.org/10.1038/470437a
(1989). Do studies of statistical power have an effect on the power of studies? Psychological Bulletin, 105, 309–316. https://doi.org/10.1037/0033-2909.105.2.309
(2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. https://doi.org/10.1177/0956797611417632
(2014a). p-Curve and effect size: Correcting for publication bias using only significant results. Perspectives on Psychological Science, 9, 666–681. https://doi.org/10.1177/1745691614553988
(2014b). p-Curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143, 534–547. https://doi.org/10.1037/a0033242
(2016). p-Curve app (Version 4.05) [Web application]. Retrieved from http://www.p-curve.com/app4/
(2015). Better p-curves: Making p-curve analysis more robust to errors, fraud, and ambitious p-hacking, a Reply to Ulrich and Miller (2015). Journal of Experimental Psychology: General, 144, 1146–1152. https://doi.org/10.1037/xge0000104
(2008). Meta-regression methods for detecting and estimating empirical effects in the presence of publication selection. Oxford Bulletin of Economics and Statistics, 70, 103–127. https://doi.org/10.1111/j.1468-0084.2007.00487.x
(2014). Meta-regression approximations to reduce publication selection bias. Research Synthesis Methods, 5, 60–78. https://doi.org/10.1002/jrsm.1095
(1981). The framing of decisions and the psychology of choice. Science, 211, 453–458. https://doi.org/10.1126/science.7455683
(2016). Conducting meta-analyses based on p values: Reservations and recommendations for applying p-uniform and p-curve. Perspectives on Psychological Science, 11, 713–729. https://doi.org/10.1177/1745691616650874
(2015). Meta-analysis using effect size distributions of only statistically significant studies. Psychological Methods, 20, 293–309. https://doi.org/10.1037/met0000025
(2015). WHO statement on public disclosure of clinical trial results. Retrieved from http://www.who.int/ictrp/results/reporting/en/