Comparing the Performance of Agree/Disagree and Item-Specific Questions Across PCs and Smartphones
Abstract
The use of agree/disagree (A/D) questions is a common technique for measuring attitudes; this question format is employed frequently in the Eurobarometer and the International Social Survey Programme (ISSP), for example. Theoretical considerations, however, suggest that A/D questions require complex processing. Many survey researchers have therefore recommended item-specific (IS) questions, which appear to be less burdensome. Parallel to this methodological discussion runs the debate about the use of mobile devices for responding to surveys. Until now, however, evidence has been lacking on whether responding on mobile devices affects the performance of established question formats. In this study, implemented in the Netquest panel in Spain (N = 1,476), we investigated the cognitive effort and response quality associated with A/D and IS questions across PCs and smartphones. For this purpose, we applied a split-ballot design defined by device type and question format. Our analyses revealed longer response times for IS questions than for A/D questions, irrespective of device type and scale length. The IS questions also produced better response quality than their A/D counterparts. All in all, the findings indicate more conscientious responding to IS questions than to A/D questions.
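The split-ballot design described above crosses two factors, device type (PC vs. smartphone) and question format (A/D vs. IS), yielding four experimental cells. As a minimal illustrative sketch (not the authors' implementation; all names and the balanced-randomization scheme are assumptions), respondents could be assigned to the four conditions like this:

```python
import random

# Hypothetical 2x2 split-ballot design: device type x question format.
DEVICES = ["PC", "smartphone"]
FORMATS = ["agree/disagree", "item-specific"]
CONDITIONS = [(d, f) for d in DEVICES for f in FORMATS]  # four cells

def assign_conditions(respondent_ids, seed=42):
    """Randomly assign respondents to the four cells, keeping cell sizes balanced."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    # Repeat the four conditions to cover all respondents, then shuffle.
    cells = (CONDITIONS * (len(ids) // len(CONDITIONS) + 1))[: len(ids)]
    rng.shuffle(cells)
    return dict(zip(ids, cells))

# N = 1,476 as in the study; 1476 / 4 = 369 respondents per cell.
assignment = assign_conditions(range(1476))
```

Balanced (rather than fully independent) randomization keeps the four cells equally sized, which simplifies the comparison of response times and response quality across conditions.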