Background
It is common for surveys conducted by or on behalf of government agencies to include required legal or regulatory language. While these requirements vary from country to country and from agency to agency, they often include mention of the authority under which the survey is collected and statements on confidentiality and security protections. Often this is written in legal fine print and presented as a footer on a cover letter (for interviewer-administered surveys) or on the questionnaire itself (in mail surveys).
An open question is whether such required language has an effect—positive or negative—on potential survey respondents, either on their willingness to participate in the survey or on their willingness to answer survey questions honestly if they decide to participate. Because the statements are required, it is difficult to test the effects of the presence or absence of such language in self-administered surveys. However, we report here on a rare opportunity to experiment with the length and content of such a statement.
Findings in the informed consent literature (see Das and Couper 2014; Perrault and Nazione 2016; Tait et al. 2013) suggest that shorter consent statements may increase both consent rates and knowledge of the consent process. However, earlier work on the design of census cover letters found no effect of a more elaborate confidentiality pledge on census return rates (see Dillman et al. 1996). Our expectation, in line with the informed consent literature, was that a longer statement might deter people from participating in the survey.
We describe the specifics of the experiment below before describing the survey in which it was conducted. We then present the experimental results.
Privacy Act Statement Experiment
Federal surveys in the United States generally must include both Paperwork Reduction Act (PRA) and Privacy Act statements (or an equivalent for surveys falling under different authorization, such as the Public Health Services Act or the Confidential Information Protection and Statistical Efficiency Act [CIPSEA]). The PRA statement includes an estimated time burden to complete the survey and an Office of Management and Budget (OMB) control number. The Privacy Act statement notes the authority collecting the data, the voluntary nature of the request (in most cases), and specifies the conditions under which the data collected in the survey are used and (potentially) shared with others. Essentially, the statement explains the possible exceptions to an absolute pledge of confidentiality. A careful read of this statement may raise—rather than lower—potential confidentiality concerns, as it identifies conditions under which data may be shared. Similarly, the PRA statement that “no persons are required to respond to a collection of information unless it displays a valid OMB control number” may imply that if such a number is displayed, a response is required.
The PRA and Privacy Act statements are required, but there appears to be no standard for where they are printed. For example, the American Community Survey questionnaire contains the PRA information on the bottom of the last page of the form (see www2.census.gov/programs-surveys/acs/methodology/questionnaires/2015/quest15.pdf), the IRS Individual Taxpayer Burden Survey (see www.irs.gov/pub/irs-soi/15inburdensurvey.pdf) has the Privacy Act and PRA information on the inside of the back page, and the authorization and confidentiality statements for the Survey of Doctorate Recipients are on the front cover of the questionnaire (see www.nsf.gov/statistics/srvydoctoratework/surveys/srvydoctoratework_nat2013.pdf).
As noted above, testing the presence or absence of either or both of these statements is not feasible because they are typically required. One way to learn more about the effects of these statements on participant behavior would be to test the placement of the statements (e.g., on the cover letter, on the front page of the questionnaire, on the back page). Another option would be to vary the length of the statements. We know of no study that has done either of these. However, with OMB’s approval, we were able to conduct an embedded experiment varying the length (and, indirectly, the content) of the Privacy Act statement in a one-time mail survey conducted by the Consumer Financial Protection Bureau (CFPB), the “Survey of Consumer Views on Debt”. Both the CFPB’s Privacy Office and Office of Research, which was leading the survey, were interested in the possible effects of varying the length and content of the Privacy Act statement on survey response.
Figure 1 shows the longer version of the statement, while Figure 2 shows the shorter version. It is worth noting that the shorter Privacy Act notice is longer than the notices on, for example, the IRS Taxpayer Burden Survey and Survey of Doctorate Recipients mentioned above. Note too that the PRA statement was unchanged. The statements were presented on the inside front cover of the questionnaire, following a brief description of the survey and answers to some questions about the survey (e.g., “How was I selected?” “How long will it take?”). The outside front cover of the questionnaire contained the survey title and logo. The questions themselves started on page 3.[1] In other words, the Privacy Act and PRA statements would only be seen by those who opened the questionnaire. The survey was also accompanied by a cover letter, which made no mention of the authority conducting the survey and offered no pledge of protection against disclosure.
The longer version of the statement included a number of conditions under which data may be disclosed and parties to whom the data could be disclosed. Some of these, such as providing the data to a member of Congress or the Department of Justice without detail on the circumstances that would prompt the disclosure, might be concerning to some respondents. The longer version also mentioned the possibility that survey responses could be combined with administrative data and provided additional details on the applicable regulations.
Three competing hypotheses could be offered regarding the statement (along with the PRA statement, which was not modified):
- Providing more details on what may be done with the data and exceptions to the confidentiality pledge may undermine the willingness of sample persons to participate in the study and (if they do participate) increase item missing data for sensitive questions;
- Sample persons may glance at privacy statements but not read the details of them, and thus, longer statements may convey greater authority and increase survey participation; or
- Sample persons may take no notice of privacy statements, so varying the length and content will have no effect on participation.
The first hypothesis assumes that recipients read the statement, while the other two assume that recipients give it only cursory (second hypothesis) or no attention (third hypothesis). The magnitude of any differences in response rates across groups—and hence the likelihood of statistically significant findings in line with hypotheses one or two—is diminished by the fact that we cannot identify recipients who did not open the envelope or questionnaire and hence were not exposed to the experimental manipulation. That is, we would expect to find larger effects if we could condition our analysis on those who actually opened the questionnaire and were exposed to the statement. Nonetheless, we conducted an experiment on a sample sufficiently large to detect meaningful effects for typical mail surveys of this type, and we examine the effect of the experimental manipulation on both unit nonresponse and item nonresponse.
Survey Design and Data Collection
The sample for the Survey of Consumer Views on Debt was drawn from the CFPB’s Consumer Credit Panel (CCP). The CCP is a de-identified 1-in-48 random sample of credit records from one of the three nationwide credit reporting agencies. The CCP contains more than 5 million credit records, representing the universe of approximately 240 million credit records in the United States. The CCP was used as the frame for the sample of nearly 11,000 credit records selected for this survey.
Sample Design
The sample for the survey was a stratified sample from the CCP. Twelve strata were identified to ensure representation of key types of consumer debts and collections. Oversampling was employed to ensure sufficient numbers of sample consumers with debts in collection (the focus of the survey). Given that the Privacy Act statement experiment was randomized within strata, and our focus is on the effect of the manipulation on overall response rates, we do not account for differential selection probabilities here.
The competing hypotheses suggest the effect of the length of the Privacy Act statement on response is ambiguous, but because of initial concerns about the potential negative effects of the longer version on survey response rates, an unbalanced experimental design was used. Cases were randomly assigned to version, with about 20% of cases assigned to the longer version and the balance (80%) assigned to the shorter version.
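A minimal sketch of how this kind of unbalanced, within-stratum random assignment could be implemented is shown below. The data structure, stratum labels, and the `assign_versions` helper are hypothetical illustrations; only the roughly 20/80 split and the randomization-within-strata logic come from the design described above.

```python
import random

# Hypothetical sketch: assign each sampled case to the long (~20%) or short (~80%)
# Privacy Act statement version, randomizing separately within each stratum.
def assign_versions(cases, p_long=0.20, seed=12345):
    """cases: iterable of (case_id, stratum) pairs; returns {case_id: version}."""
    rng = random.Random(seed)
    by_stratum = {}
    for case_id, stratum in cases:
        by_stratum.setdefault(stratum, []).append(case_id)

    assignments = {}
    for stratum, ids in by_stratum.items():
        rng.shuffle(ids)
        n_long = round(p_long * len(ids))  # long-version cases in this stratum
        for i, case_id in enumerate(ids):
            assignments[case_id] = "long" if i < n_long else "short"
    return assignments

# Example with made-up case IDs spread across two illustrative strata.
sample = [(i, "stratum_A" if i % 2 else "stratum_B") for i in range(1, 101)]
versions = assign_versions(sample)
print(sum(v == "long" for v in versions.values()), "of", len(versions), "cases assigned to the long version")
```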
Survey and Data Collection Process
The Survey of Consumer Views on Debt was developed by the CFPB and deployed by the credit reporting agency, with assistance from a data collection subcontractor. The survey was designed as a mail survey, with a Web (online) option. The printed survey was available in English only, but the online survey was available in both English and Spanish, and invitation letters and reminders were printed in both English and Spanish. The printed survey consisted of 8 sheets of paper printed on both sides, i.e., 16 pages, of which 14 contained the survey questions.
A small pilot study was conducted in December 2014, and the main study was launched in early 2015. After review of the pilot study data, we decided not to make any changes to the survey instruments and thus combined both the pilot and main data for analysis. The Privacy Act experiment was conducted in both the pilot and the main study.
Main data collection involved the following four mailings:
- Week 1: Initial mailing: cover letter, survey, and $5 incentive
- Week 2: Reminder letter
- Week 5: Replacement survey, reminder letter, and $5 incentive to nonrespondents
- Week 7: Reminder letter to nonrespondents
The pilot used an abbreviated mailing schedule consisting of only the first two steps. Invitations were mailed to a total of 997 consumers for the pilot and 9,879 consumers for the main study, for a total sample size of 10,876. Respondents were also given the option to complete the survey online. Survey case identifiers were used to ensure that respondents received the same version of the Privacy Act statement on the questionnaires mailed in week 1 and week 5 as well as in the online version.
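To give a rough sense of what a sample of this size can detect under the 20/80 allocation, the sketch below computes an approximate minimum detectable difference in response rates for a two-sided two-proportion test at 80% power. The baseline response rate and the normal-approximation formula are our illustrative assumptions, not quantities reported in the study.

```python
from math import sqrt
from statistics import NormalDist

# Illustrative minimum detectable effect (MDE) for comparing response rates
# between the long (~20%) and short (~80%) statement groups.
def min_detectable_diff(n_total, share_long=0.20, p_base=0.20, alpha=0.05, power=0.80):
    n_long = n_total * share_long
    n_short = n_total * (1 - share_long)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_power = NormalDist().inv_cdf(power)          # quantile needed for target power
    # Normal-approximation standard error, assuming both groups respond near p_base.
    se = sqrt(p_base * (1 - p_base) * (1 / n_long + 1 / n_short))
    return (z_alpha + z_power) * se

print(round(min_detectable_diff(10876), 3))  # about 0.027, i.e., roughly 2.7 percentage points
```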
A substantial number of questionnaires (9.0% of all invitations) were returned by the U.S. Postal Service as undeliverable. Given that this is a sample of named individuals, these cases are all assumed to be eligible for the survey and thus included in the denominator of the response rate. But these recipients would not have been exposed to the experimental manipulation. We thus examine response rates both including and excluding these cases.
Experimental Results
The overall response rate (using American Association for Public Opinion Research’s [AAPOR] Response Rate 2 [RR2][2]) was 19.6% (10.1% for the pilot and 20.6% for the main survey). Excluding the postal nondelivery cases, the response rate was slightly higher (21.5%), using AAPOR’s RR6. In addition, 10.6% of respondents took the survey online.
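As a concrete illustration of the two response rates, the sketch below reconstructs approximate case counts from the percentages reported above (the exact counts are not given here, so these are back-calculated assumptions) and applies the RR2 and RR6 definitions described in note 2.

```python
# Approximate counts back-calculated from the reported percentages (illustrative only).
invitations = 10876                       # total pilot + main sample
respondents = round(0.196 * invitations)  # completes plus partials, roughly 2,132
undelivered = round(0.090 * invitations)  # postal nondeliveries, roughly 979

rr2 = respondents / invitations                  # all sampled cases in the denominator
rr6 = respondents / (invitations - undelivered)  # postal nondeliveries removed

print(f"RR2 = {rr2:.1%}, RR6 = {rr6:.1%}")  # approximately 19.6% and 21.5%
```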
In general, we find that the length of the Privacy Act statement had no effect on individuals’ likelihood of responding to the survey. Specifically, 18.9% of those sent the long version of the Privacy Act statement responded to the survey, compared with 19.8% of those sent the short version. This difference is not statistically significant (chi-squared=0.93, df=1, p=0.34). When we exclude the postmaster returns (undeliverable surveys), the response rates are 20.8% for the long version and 21.7% for the short version, and the difference is again not statistically significant (chi-squared=0.69, df=1, p=0.41). Finally, the share of respondents who took the survey online (10.6% overall) did not differ significantly by Privacy Act version (11.7% for the long and 10.3% for the short version; chi-squared=0.62, df=1, p=0.43). Very few (6) online respondents broke off after seeing the Privacy Act statement, and the share of breakoffs among online respondents did not differ by condition.
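The comparisons above are standard chi-squared tests of independence on the 2 × 2 table of statement version by response status. The sketch below reproduces the first of them approximately, using cell counts back-calculated from the reported allocation and response rates (the exact counts are our assumption) with `scipy.stats.chi2_contingency`.

```python
from scipy.stats import chi2_contingency

# Approximate 2x2 table (version x responded), back-calculated from the ~20/80
# allocation of 10,876 cases and the reported 18.9% / 19.8% response rates.
n_long, n_short = 2175, 8701
resp_long = round(0.189 * n_long)    # roughly 411 long-version respondents
resp_short = round(0.198 * n_short)  # roughly 1,723 short-version respondents

table = [
    [resp_long, n_long - resp_long],
    [resp_short, n_short - resp_short],
]

chi2, p, df, _ = chi2_contingency(table, correction=False)
print(f"chi-squared = {chi2:.2f}, df = {df}, p = {p:.2f}")  # close to the values reported above
```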
On the whole, we find no evidence of an effect on response rates between the longer and shorter versions of the Privacy Act statement. There are, however, two interesting departures from this general conclusion. First, response rates to the pilot survey were 4.1 percentage points greater for the short version (11.0%) than for the long version (6.9%). This difference is marginally significant (chi-squared=3.0, df=1, p=0.08); however, the pilot and main survey were similar in all respects except for when they were fielded and the number of reminders, so we view this result as an anomaly.
Second, response rates differ significantly by Privacy Act version for consumers in the strata that were oversampled because they were more likely to have had a debt in collection. Among these consumers, the response rate was 15.1% for the short version and 12.9% for the long version (chi-squared=4.7, df=1, p=0.03). In contrast, for consumers in the stratum in which debt collections were less likely, the response rate for the long version (31.1%) was greater than for the short version (29.1%), although the difference is not statistically significant (chi-squared=1.0, df=1, p=0.31).
Our next research question is whether the statements had any effect on item missing data. The survey was targeted at a specific group of consumers (those with outstanding debts), and the topic could be considered sensitive (their experiences with debt collection), so the different versions of the statement could have affected willingness to answer individual questions in the survey. We examined missing data rates for key demographic and substantive measures in the survey (many of the questions were asked only of a subset of respondents and so were excluded) and found no evidence of systematic differences in missing data. Selected examples are presented in Table 1.
None of the differences shown reach statistical significance (p>.10). While some variables examined in Table 1 show a slightly higher missing data rate for the longer version of the Privacy Act statement, others show the opposite trend. We thus see no pattern of missing data by experimental version.
Our conclusion is that the length (and, by extension, content) of the legal and regulatory statements included on the mail questionnaire had no meaningful effect on the likelihood of responding or on the likelihood of answering individual questions, given response.
There are a few limitations to our experiment. First, we were unable to test a version with no statement. Second, as noted earlier, both versions are long relative to other surveys, and the difference in length is not large. A starkly shorter version may have had measurable effects on response compared with the longer version. In addition, we cannot tell what share of recipients was not exposed to the experiment because they did not open the envelope or questionnaire. Finally, the statements we tested were on the inside front cover of the questionnaire. Putting them on the cover of the questionnaire or including them in the accompanying letter may have produced a stronger effect. Nonetheless, we found no effects (either positive or negative) of the two alternatives we tested. This null finding may be good news for those required to include such statements in their survey materials and suggests that these statements may not affect individuals’ survey response behavior.
Acknowledgments
The views expressed are those of the authors and do not necessarily reflect those of the Consumer Financial Protection Bureau or the United States.
The full questionnaire is included in the appendix of Consumer Financial Protection Bureau (2017), available at http://files.consumerfinance.gov/f/documents/201701_cfpb_Debt-Collection-Survey-Report.pdf
American Association for Public Opinion Research’s (AAPOR) Response Rate 2 includes both completed and partially completed surveys in the numerator and all eligible sample cases in the denominator. RR6 excludes ineligible cases from the denominator. See AAPOR (2015).