Research Spotlight

Can Defense Attorneys Detect Forensic Confirmation Bias?

Effects on Evidentiary Judgments and Trial Strategies

Published Online: https://doi.org/10.1027/2151-2604/a000414

Abstract. Knowledge of task-irrelevant information undermines the probative value of forensic evidence (i.e., forensic confirmation bias). Cross-examination may sensitize jurors to bias – but do attorneys recognize when bias has tainted evidence against their client and adjust their cross-examination strategy accordingly? To address this question, 130 defense attorneys imagined representing a man charged with manslaughter and reviewed a case file that included, among other things, an autopsy report from a medical examiner who was either aware or unaware of their client’s recanted confession before ruling the death a homicide. When the examiner knew of the confession, attorneys rated the autopsy as no less probative or reliable, they were no less confident in their client’s guilt, and only 46% raised the possibility of confirmation bias on cross-examination. Our findings suggest that defense attorneys underappreciate the impact of forensic confirmation bias, such that biased forensic testimony would be better avoided via procedural reform.
