The effect of interviewers’ motivation and attitudes on respondents’ consent to contact secondary respondents in a multi-actor design

Jette Schröder, GESIS – Leibniz Institute for the Social Sciences, Germany
Claudia Schmiedeberg, University of Munich (LMU), Germany
Laura Castiglioni, University of Munich (LMU), Germany

In surveys using a multi-actor design, data are collected not only from sampled ‘primary’ respondents but also from related persons such as partners, colleagues, or friends. For this purpose, primary respondents are asked for their consent to survey such ‘secondary’ respondents. The existence of interviewer effects on unit nonresponse of sampled respondents is well documented, and research increasingly focuses on interviewer attributes in the nonresponse process. However, research regarding interviewer effects on unit nonresponse of secondary respondents, and more specifically on primary respondents’ consent to include secondary respondents in the survey, is sparse. We use the German Family Panel (pairfam) …
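Interviewer effects of the kind studied here are commonly quantified by how strongly outcomes (such as consent) cluster within interviewers, summarized by the intraclass correlation (ICC). As a minimal, self-contained sketch (all data simulated for illustration, not drawn from pairfam; interviewer counts and propensities are assumptions), the ICC can be estimated from a one-way ANOVA variance decomposition:

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical setup: 50 interviewers, 30 consent requests each.
n_interviewers, n_per = 50, 30

# Each interviewer has an individual consent propensity; the spread across
# interviewers is the "interviewer effect" we want to detect.
propensities = [min(max(random.gauss(0.6, 0.15), 0.05), 0.95)
                for _ in range(n_interviewers)]
# Binary consent outcomes (1 = consent given) per interviewer.
data = [[1 if random.random() < p else 0 for _ in range(n_per)]
        for p in propensities]

grand = mean(x for row in data for x in row)
# One-way ANOVA mean squares (equal group sizes).
ms_between = n_per * sum((mean(row) - grand) ** 2 for row in data) / (n_interviewers - 1)
ms_within = sum((x - mean(row)) ** 2 for row in data for x in row) / (n_interviewers * (n_per - 1))

# ANOVA estimator of the intraclass correlation: the share of outcome
# variance attributable to differences between interviewers.
icc = (ms_between - ms_within) / (ms_between + (n_per - 1) * ms_within)
print(f"estimated interviewer ICC: {icc:.3f}")
```

In practice such analyses are run as multilevel (random-intercept) models on real interview data; the ANOVA estimator above is just the simplest way to see the between-interviewer variance share.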

A Case Study of Error in Survey Reports of Move Month Using the U.S. Postal Service Change of Address Records

Mary H. Mulry, U.S. Census Bureau, Washington, DC
Elizabeth M. Nichols, U.S. Census Bureau, Washington, DC
Jennifer Hunter Childs, U.S. Census Bureau, Washington, DC

Correctly recalling where someone lived as of a particular date is critical to the accuracy of the once-a-decade U.S. decennial census. Data collection for the 2010 Census spanned several months, from February to August, with some evaluation operations extending up to 7 months after that. The assumption was that respondents could accurately remember moves and move dates on and around April 1st up to 11 months afterwards. We show how statistical analyses can be used to investigate the validity of this assumption by comparing self-reports and proxy-reports of the month of a move in …

Measuring the survey climate: the Flemish case

Sara Barbier, Centre for Sociological Research, University of Leuven, Belgium
Geert Loosveldt, Centre for Sociological Research, University of Leuven, Belgium
Ann Carton, Research Centre of the Flemish Government, Belgium

Researchers in several countries have regularly reported decreasing response rates for surveys and the need for increased efforts in order to attain an acceptable response rate: two things that can be seen as signs of a worsening survey climate. At the same time, differences between countries and surveys with regard to the actual level and evolution of response rates have also been noted. Some of these differences are probably linked to differences in the survey content or design. This may hinder the study of the evolving survey climate over time, based on different surveys in different countries, because more readily …

Web survey experiments on fully balanced, minimally balanced and unbalanced rating scales

Mingnan Liu, SurveyMonkey, Palo Alto, California, U.S.A.
Sarah Cho, SurveyMonkey, Palo Alto, California, U.S.A.

When asking attitudinal questions with dichotomous and mutually exclusive response options, the question can be presented in one of three ways: as a fully balanced question, a minimally balanced question, or an unbalanced question. Although previous research has compared fully and minimally balanced rating scales, to our knowledge these three types of rating scales have not been tested together in a strict experimental setting. In this study, we report two web survey experiments testing these three types of rating scales across 16 different questions. Unlike most previous studies, this study used visual display only, without any auditory component. …

Comparing smartphones to tablets for face-to-face interviewing in Kenya

Sarah M. Hughes, Mathematica Policy Research, U.S.
Samuel Haddaway, Yale School of Management, U.S.
Hanzhi Zhou, Mathematica Policy Research, U.S.

Research conducted over the past 30 years has demonstrated a reduction in errors and an improvement in data quality when face-to-face social surveys are carried out using computers instead of paper and pencil. However, research examining the quality of data collected by interviewers using mobile devices is in its infancy and has so far been conducted mainly in developed countries. In a small pilot study conducted during the World Bank’s Kenya State of the Cities Baseline Survey, a face-to-face survey on living conditions, infrastructure and service delivery, the authors compared the quality of data collected using smartphones to data collected using tablets. The study of …

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.