Original Article

What Can We Learn From Factorial Surveys About Human Behavior?

A Validation Study Comparing Field and Survey Experiments on Discrimination

Published Online: https://doi.org/10.1027/1614-2241/a000161

Abstract. Factorial survey experiments are increasingly used in the social sciences to investigate behavioral intentions. Measuring self-reported behavioral intentions with factorial survey experiments typically rests on the assumption that the determinants of intended behavior affect actual behavior in a similar way. We critically investigate this fundamental assumption using the misdirected email technique. Student participants in a survey were randomly assigned to either a field experiment or a survey experiment. The misdirected email informed its intended recipient of a scholarship award, with the stakes (full-time vs. book) and the recipient's name (German vs. Arabic) varied experimentally. In the survey experiment, respondents saw an image of the same email. This validation design ensured a high level of correspondence between units, settings, and treatments across both studies. Results reveal that although the frequencies of self-reported intentions and actual behavior diverge, the treatments show similar relative effects. Hence, although further research on this topic is needed, this study suggests that determinants of behavior might be inferred from behavioral intentions measured with survey experiments.
