Introduction
The decline of survey response rates is widely recognized (Meyer, Mok, and Sullivan 2015; National Research Council 2013), especially among college-aged (Dey 1997) and younger populations (Bosch, Revilla, and Paura 2019). Surveys are not only a primary data collection tool in higher education research (Pike 2007), but researchers in other fields frequently target undergraduate students as a convenient sample frame for a host of other topics. Some research indicates that undergraduate student respondents might be qualitatively different from nonrespondents on dimensions that may be of interest to researchers. For example, females, students with high GPAs, traditional students, and students with fewer semesters of enrollment are more likely to respond to surveys (Standish and Umbach 2019). As a result, the experiences of some groups, such as males or the unengaged, may be hidden from view when they differ from likely survey respondents, resulting in a biased understanding of the survey topic.
A key priority for survey methodology is to develop cost-effective methods to decrease nonresponse and mitigate possible nonresponse bias (National Research Council 2013). Moreover, future research is needed to examine how to enhance the salience of surveys for groups that traditionally pose challenges for survey response, such as males. Leverage-saliency theory offers a framework for understanding how to tailor a survey in order to increase the salience of features with high leverage for sample members. One low-cost method of enhancing survey salience is to develop effective communications. A growing body of work examines survey email invitations as a means to increase web survey response rates (Fan and Yan 2010), including for college-aged populations (e.g., Heerwegh 2005; Kaplowitz et al. 2012; Muñoz-Leiva et al. 2010).
In this study, we examine communication tone as a design feature of the email invitation that may affect a survey’s salience for respondents. Using a split-ballot experiment with a population of undergraduate students at a large midwestern public university, we manipulated the language in the salutation and body of the email message. We also draw on data from focus groups with targeted subgroups to motivate the design of our experiment and to situate it within the framework of the leverage-saliency theory of nonresponse.
Theory
Leverage-saliency theory is a key conceptual model for understanding compliance with the request to participate in a survey (Groves, Singer, and Corning 2000). The theory uses the metaphor of a scale to describe an individual’s decision to participate: attributes of a survey, such as its topic, its sponsor, or, as in our study, the communication tone of the email invitation, are likened to weights on the scale, and the farther from the fulcrum a given attribute sits, the more “leverage” it has to tip the individual toward agreeing to the survey request, assuming the individual views that attribute positively. Some attributes carry more leverage, or importance, than others for a given individual, so their distance from the center may differ from person to person. And some attributes may grow (or diminish) in size through the efforts of an interviewer or through survey design choices that attempt to increase the “salience” of an attribute for the individual and thereby encourage participation.
Survey sponsorship has been demonstrated to affect response rates, with sponsors perceived as legitimate, such as government or academic sponsors, typically garnering higher response rates (Groves and Peytcheva 2008). Sponsorship also situates the survey within a social context for respondents, which shapes responses as well as response rates (Fan and Yan 2010). Features of the email invitation can increase sponsor salience and response rates when the sponsor has leverage with sample members (e.g., Kaplowitz et al. 2012). In this study, we examine whether communication tone affects response rates in a survey with a legitimate and relevant sponsor. We chose communication tone as the design feature to manipulate for this population, taking a cue from an abundant literature that describes the communication style of millennial and postmillennial students as “informal” (Hanna 2003; Hartman and McCambridge 2011) and offers tips on effectively engaging these young people. On the one hand, if the language of the email survey request is tailored in a way that is more familiar and agreeable to the target population, it may increase the chance that the other relevant survey attributes will be heard. On the other hand, if the tone of an email survey request affects the perceived legitimacy of the survey and its sponsor, and an informal tone violates respondents’ expectations for both, then that tone could make respondents less likely to participate.
Methods
The target population for our study is, broadly, undergraduate students at a large midwestern public university. Most are aged 18 to 24 and thus fall generationally into the so-called millennials and postmillennials (Pew Research Center 2014). The motivation for this experiment came from an attempt to increase response rates for a web survey on arts participation habits (Wave 2) that followed up on the results of a similar survey previously administered on campus (Wave 1).
Wave 1 of the web survey consisted of a simple random sample of 50% (16,430) of undergraduate students enrolled in classes during fall semester 2015.[1] The field period for Wave 1 of the survey was approximately one month, beginning on Monday, September 21, 2015, and closing on Monday, October 19, 2015. In addition to the email survey invitation, we sent three reminder messages. All messages mentioned that respondents would be entered into a drawing for a series of prizes. The undergraduate survey response rate at the close of the survey was calculated as 13.3% using American Association for Public Opinion Research (AAPOR) RR2 (AAPOR 2011).[2]
We began by analyzing demographic and response characteristics of Wave 1 respondents and comparing these data to representative data on the sample frame. Our findings led us to identify three target populations, not mutually exclusive, that were underrepresented in Wave 1 of the survey: males; upperclassmen (i.e., juniors and seniors); and non-arts majors. Based on these findings, we recruited undergraduate students in these three groups to participate in a series of focus groups in summer 2017. In the focus groups, we gauged student opinions about survey recruitment techniques, including the type and amount of cash or noncash incentives, habits regarding the devices used to take surveys, email reading and response habits, and recruitment messaging.
The feedback from the focus groups formed the basis of our hypothesis that communication tone in an email survey request can serve as a moderator of sponsor salience and that this effect might differ by gender. Specifically, in a focus group composed exclusively of undergraduate males, the following conversation occurred:
Moderator: [REFERRING TO EXAMPLE RECRUITMENT MESSAGE] “Do you have any sense in this email whether it’s too formal? Informal? Would you prefer something more conversational?”
Male Participant 1: “No. That doesn’t bother me.” [General agreement]
Male Participant 2: “If it’s formal, but not stuffy. But if it’s not formal, then it’s like, how professional is this?”
Male Participant 1: “I’m not friends with the survey. I’m not your pal.”
Male Participant 3: “I agree with that. I especially don’t like emails that start with, ‘Hey friends’ from someone I’ve never met. No. Don’t call me your friend. I don’t even know you.”
To test the hypothesis that communication tone in survey recruitment messages can moderate sponsor salience, and thereby depress response rates, we implemented a split-ballot experiment in Wave 2 of the survey comparing response between two versions of the email survey invitation that differed in communication style.[3] While the content remained consistent between the versions, the treatment was written in a conversational tone, marked by less strict adherence to grammar, conversational diction, and exclamatory sentences.
Wave 2 of the survey consisted of a simple random sample of 20,700 undergraduate students enrolled in classes at a large midwestern public university during fall semester 2018. The field period for Wave 2 of the survey was approximately one month, beginning on Monday, September 17, 2018, and closing on Monday, October 15, 2018. The sample included 50% of underclassmen (freshmen, sophomores, and certificate/associate degree-seeking students) and 75% of upperclassmen (juniors and seniors).[4] We excluded 49 cases from receiving the survey invitation because the sample list lacked an email address for them. Base weights were calculated to account for the unequal probabilities of selection that result from this design. The weight is calculated as the inverse of the probability of selection into the sample list:
w = 1/π
where w is the base weight and π is the probability that an individual would have been selected into the sample (sample count divided by population count). The base weights were calculated as described in Table 1. Each respondent in the sample had an equal probability of assignment into the control or treatment group for the survey invitation experiment.
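As an illustration, the base-weight construction and the equal-probability treatment assignment can be sketched in a few lines of Python. The stratum labels, population counts, and sampling rates below are placeholders of our own, not the figures from Table 1:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Placeholder strata mirroring the design: underclassmen sampled at 50%,
# upperclassmen at 75%. Population counts are invented for illustration.
pop_counts = {"underclassmen": 14000, "upperclassmen": 13600}
rates = {"underclassmen": 0.50, "upperclassmen": 0.75}

rows = []
for stratum, N in pop_counts.items():
    n = int(N * rates[stratum])  # sample count for the stratum
    pi = n / N                   # probability of selection
    rows.extend({"stratum": stratum, "pi": pi, "base_weight": 1 / pi}
                for _ in range(n))
sample = pd.DataFrame(rows)

# Equal-probability assignment to control (0) or treatment (1).
sample["treatment"] = rng.integers(0, 2, size=len(sample))
print(sample.groupby("stratum")[["pi", "base_weight"]].first())
```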
The survey invitations were sent to a total of 20,651 students. All messages mentioned that, upon completion of the survey, respondents would be entered into a drawing for a $50 Amazon gift card, with odds of winning of 1 in 150. The estimated odds were based on response to the Wave 1 survey. The overall survey response rate at the close of the survey was calculated as 19.2% using AAPOR RR2 (AAPOR 2016).
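For reference, AAPOR RR2 counts complete and partial interviews in the numerator and all eligible plus unknown-eligibility cases in the denominator. A minimal sketch follows; the disposition counts are invented, chosen only so that they sum to the 20,651 invitations and reproduce the reported 19.2%:

```python
def aapor_rr2(complete, partial, refused, noncontact, other, unknown):
    """AAPOR RR2: (I + P) / ((I + P) + (R + NC + O) + U)."""
    respondents = complete + partial
    return respondents / (respondents + refused + noncontact + other + unknown)

# Invented case dispositions summing to 20,651 (illustration only).
rate = aapor_rr2(complete=3700, partial=265, refused=250,
                 noncontact=16036, other=150, unknown=250)
print(f"{rate:.1%}")  # -> 19.2%
```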
Results
We report the sample size and response counts by treatment status and gender in Table 2. Overall, fewer males responded than females despite similar sample sizes, which aligns with findings in the literature about response rates for this subpopulation. Males and females appear to respond differently to the treatment, with fewer males responding to the informal email used in the treatment and more females responding to this email.
Multivariate analyses find a statistically significant interaction between the tone of the email invitation and gender. Using the base weights, we estimate the following model with logistic regression:
Pr(y = 1 | x) = F(β0 + β1 treatment + β2 gender + β3 treatment × gender)
where the probability of response, y, is a function of treatment status, gender, and the interaction between treatment and gender. Table 3 reports the results of the logistic regression, which shows that males are less likely to respond to the survey, with treatment status interacting with gender to produce a negative effect for males. To determine how the effect of the treatment varied by gender, we begin by calculating the predicted probability for each treatment-gender combination. We then examine how the marginal effect of treatment, T, changes as gender, G, changes from female to male:
Marginal Effect of Treatment = Pr(Y = 1 | G, T = 1) − Pr(Y = 1 | G, T = 0)
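A sketch of how such a weighted logistic regression and the marginal effects might be computed in Python with statsmodels is shown below; the data frame and variable names are our assumptions, not the authors’ code, and treating base weights as frequency weights is an approximation that does not yield design-correct standard errors:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed person-level file with columns: response (0/1),
# treatment (0 = formal, 1 = informal), male (0/1), base_weight.
df = pd.read_csv("wave2_sample.csv")  # hypothetical file name

# Logistic regression with a treatment-by-gender interaction,
# using base weights as frequency weights (an approximation).
model = smf.glm(
    "response ~ treatment * male",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=df["base_weight"],
).fit()
print(model.summary())

# Predicted probability for each treatment-gender combination.
grid = pd.DataFrame({"treatment": [0, 1, 0, 1], "male": [0, 0, 1, 1]})
grid["p_hat"] = model.predict(grid)

# Marginal effect of treatment within each gender:
# Pr(Y=1 | G, T=1) - Pr(Y=1 | G, T=0).
wide = grid.pivot(index="male", columns="treatment", values="p_hat")
print(wide[1] - wide[0])
```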
Table 4 reports the marginal effects. The change in the predicted probability of response for males is statistically significant, such that the likelihood of response for the treatment group is 0.02 less than for the control group.
Discussion
On the surface, the results might appear to be just a function of gender differences in survey response that have been noted in other studies: on average, males are less likely than females to respond to surveys (e.g., Sax, Gilmartin, and Bryant 2003). However, that alone seems insufficient to explain the difference in the experiment given the random assignment into control and treatment groups. As such, it seems plausible that the gendered difference in response for our experiment is genuinely a function of how the recruitment messages were framed.
However, it should be noted that while the results of our experiment with communication tone are statistically significant, the effect size is small. Other design choices, such as a different survey mode or higher monetary incentives, may produce a greater increase in response. But because attending to the tone of the survey invitation and its anticipated reception, against a backdrop of respondent expectations about the survey sponsor, costs little or nothing and is easy to implement, it may nevertheless be useful for survey practitioners developing recruitment materials to maximize response.[5] Particularly in self-administered surveys, where no interviewer is available to address concerns or to tailor the recruitment effort so that a particular survey feature becomes more positively salient to an individual respondent, the tone of an email invitation can be an efficient tool to promote response. Even small response gains among subgroups with typically higher nonresponse may be worthwhile when achieved by such no-cost methods.
For surveys of undergraduate student populations, our experiment and focus group data suggest that males may be less receptive to an informal tone in a survey request when it is perceived as being “from,” or at least endorsed by, their institution. A formal tone in the survey invitation may also contribute to a perception of greater importance or legitimacy of the survey in general. Millennial and postmillennial students may themselves communicate in more informal ways, as prior education research suggests, but students’ expectations about communications from professors or other nonstudent members of their institution may lean toward more formal language in a request to participate in a survey. Given that the survey sponsor is typically highly salient for many respondents, response rates may suffer if the tone of the survey request language is inconsistent with respondents’ implicit norms regarding the sponsor. In other words, how you say it matters. For undergraduate students, it may matter more with men.
Appendix
Survey Invitation for Control Condition with Formal Communication Style
FROM ADDRESS: csr@indiana.edu
FROM NAME: Indiana University Center for Survey Research
REPLY-TO EMAIL: csr@indiana.edu
SUBJECT: Tell us about your experience at IU
Dear [FIRSTNAME] [LASTNAME],
From First Thursdays to IU sporting events, there are a lot of activities on campus. Take about 5-7 minutes to tell us how you spend your free time on campus, and you’ll be entered into a drawing for $50 where your chance of winning is 1 in 150. Even if you don’t have much free time or don’t participate in campus activities, we still want to hear from you. Just click the link below.
You have been selected to participate in this survey. Your responses will be used in a research study at IU on participation in activities on campus and how universities—including IU—can make improvements to serve their students better.
<<Complete the survey. SURVEY LINK>>:
• If the link doesn’t work for you, just copy/paste into your browser: [SURVEY URL]
Thank you!
Center for Survey Research
Indiana University
Follow <<this link OPT_OUT LINK>> to stop receiving emails about this project.
For questions about the study email csr@indiana.edu and reference survey ID: [SurveyID].
More information about the Center for Survey Research at Indiana University: https://csr.indiana.edu.
Survey Invitation for Treatment Condition with Informal Communication Style
FROM ADDRESS: csr@indiana.edu
FROM NAME: Indiana University Center for Survey Research
REPLY-TO EMAIL: csr@indiana.edu
SUBJECT: Tell us about your experience at IU
Hey [FIRSTNAME],
First Thursdays, IU sporting events, movies at the IMU… there’s tons of stuff to do on campus. Take about 5-7 minutes to tell us how you spend your free time on campus and we’ll enter you into a drawing for $50 with a 1 in 150 chance of winning. Yes, $50!! Just click the link below.
You’ve been selected to participate in this survey. Why are we doing this survey? It’s for a research study at IU about participation on campus and what universities can do to serve their students better. And even if you’re thinking, “what free time?” or you don’t really care for “campus activities”, we still need your input!
<<Complete the survey SURVEY LINK>>:
• If the link doesn’t work for you, just copy/paste into your browser: [SURVEY URL]
Thank you!
Center for Survey Research
Indiana University
Follow <<this link OPT_OUT LINK>> to stop receiving emails about this project.
For questions about the study email csr@indiana.edu and reference survey ID: [SurveyID].
More information about the Center for Survey Research at Indiana University: https://csr.indiana.edu.
[1] Wave 1 of the survey was administered to faculty, staff, and graduate and undergraduate students, while Wave 2 of the survey was administered only to undergraduate students. The overall response rate for Wave 1 was 17.81% using AAPOR RR2 (AAPOR 2011).
[2] Response rates for faculty, staff, and graduate students were 29.06%, 33.41%, and 20.01%, respectively.
[3] See the Appendix for copies of the Wave 2 survey invitations for the treatment and control groups.
[4] Wave 2 of the survey oversampled upperclassmen in order to maximize potential overlap between Wave 1 and Wave 2. In total, 13% of the Wave 2 sample were undergraduate respondents sampled in Wave 1.
[5] We recognize that the effect of gender and tone in surveys may not apply only to college-aged populations; the effect might be prevalent in surveys aimed at other populations as well. We did not find any studies, however, that tested this effect in different settings.