Are Response Rates to Organizational Climate Surveys Influenced by Informing Respondents About the Sampling Strategy?

Taylor Lewis U.S. Office of Personnel Management

Lorraine Latimore U.S. Office of Personnel Management

Abstract

First administered in 2002, the Federal Employee Viewpoint Survey (FEVS) is an annual, Web-based organizational climate survey of U.S. federal government employees. In recent years, numerous external stakeholders have urged the FEVS administration team at the U.S. Office of Personnel Management to consider changing its sampling policy to one of a perennial census of the FEVS-eligible workforce. Among the most frequently cited reasons is a belief that offering all employees the opportunity to participate could boost response rates. A corollary of this belief is that an agency currently sampling its FEVS-eligible workforce is somehow achieving a lower response rate than it otherwise could be. In this paper, we report results from an email contact wording experiment systematically manipulating whether employees were made aware of the underlying sampling strategy. We find emphasizing a census has no effect, but emphasizing the employee was part of a random sample produces a slight increase in response rates.

Background

The Federal Employee Viewpoint Survey (FEVS) is an organizational climate survey administered by the U.S. Office of Personnel Management (OPM). Launched in 2002 as the Federal Human Capital Survey (FHCS), the survey was administered biennially until 2010, when it was renamed FEVS and administered annually. The Web-based survey is sent to employees from 85 agencies via a personalized link in an email. Weekly reminders are sent to nonrespondents over the course of a six-week field period. The survey instrument consists predominantly of attitudinal items posed on a five-point Likert-type response scale, for example ranging from “Completely Disagree” to “Completely Agree,” and taps into a diverse range of constructs, such as job satisfaction, engagement, and perceptions of senior leadership within the agency.

The FEVS sampling frame is derived from the Statistical Data Mart of the Enterprise Human Resources Integration (EHRI-SDM), an expansive personnel database maintained by OPM. The sample size has increased markedly over the survey’s existence, from approximately 200,000 in 2002 to approximately 900,000 in 2016. The observed increase is attributable to participating agencies’ desire for progressively deeper reporting and analyses within the organization (Berry 2012). As detailed in OPM (2015), each agency provides hierarchically structured work-unit identifiers for its roster of FEVS-eligible1 employees, which are used to create sampling strata and also for reporting out survey results. If the sample size necessitated by the agency’s stratification scheme reaches 75% or more of the agency’s population, a census is conducted instead. In FEVS 2016, a census was conducted in 71 out of the 85 participating agencies.
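The 75% decision rule can be sketched as follows. This is an illustrative reconstruction, not OPM’s production logic; the function name and inputs are our own.

```python
def sampling_strategy(required_n: int, population_n: int,
                      census_threshold: float = 0.75) -> str:
    """Return 'census' or 'sample' per the 75% rule: if the sample size
    required by the agency's stratification scheme reaches 75% or more of
    the agency's population, a census is conducted instead."""
    if required_n >= census_threshold * population_n:
        return "census"
    return "sample"

print(sampling_strategy(required_n=9_000, population_n=10_000))  # census
print(sampling_strategy(required_n=4_000, population_n=10_000))  # sample
```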

Parker (2011) reviews some of the inherent and perceived advantages and disadvantages of conducting a census versus sampling in organizational climate surveys. While acknowledging sampling is generally more cost effective and, with proper planning and execution, allows for sufficiently representative results, she cautions that some employees not selected may feel excluded in certain circumstances. In addition, she notes the tendency for lay consumers of survey results to ascribe more credibility to data derived from a census. We can attest to this in the FEVS. For example, Stier (2016) asserts that transitioning to an annual FEVS census “would enhance the usefulness of the survey as an oversight and accountability tool for Congress and offer more value to agency leaders and managers who are using the survey to improve satisfaction and commitment within their organizations.”

Numerous other stakeholders have lobbied for the FEVS to be administered as an annual census as well. Among the range of motivating factors, which includes senior leaders’ desire for the optics of a survey giving each and every employee an equal say, is the widespread speculation that offering every individual the opportunity to participate will boost response rates. As noted in Lewis and Hess (2015), response rates to the FEVS have been gradually declining, a trend the FEVS administration team is keen on reversing. So, if the assumed effect can be proven to exist, budget constraints permitting, the FEVS administration team would pursue a change in its sampling strategy so that all agencies could conduct a census perennially.

To the best of our knowledge, there has been no published research investigating the relationship between response rates in organizational climate surveys and whether or not the workforce was censused. Within the purview of the FEVS, the first formal investigation into the matter was Lewis et al. (2016), who exploited the natural experiment (Wooldridge 2012) resulting from a sequence of sample design changes between 2011 and 2013. With few exceptions, every agency conducted a census in FEVS 2012, but not in the two adjacent administrations, FEVS 2011 and FEVS 2013. A by-product was a cohort of agencies that transitioned into and out of a census administration, which could be compared against a cohort of agencies that conducted a census continuously from FEVS 2011 to FEVS 2013. Using what Wooldridge (2012) terms a first-differenced estimator, Lewis et al. (2016) found marginally significant evidence suggesting a census is linked with higher response rates. That study was limited, however, in that it was retrospective and observational. In this paper, we present results from a follow-up study employing a prospective research design free of that limitation.
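In essence, the first-differenced estimator used by Lewis et al. (2016) compares the change in response rate for the cohort of agencies that switched into and out of a census against the change for the continuously censused cohort. A minimal sketch, with response rates invented purely for illustration:

```python
def first_difference(treat_before: float, treat_after: float,
                     control_before: float, control_after: float) -> float:
    """Difference of changes in response rates (percentage points):
    (change for the switching cohort) minus (change for the
    continuously censused cohort)."""
    return (treat_after - treat_before) - (control_after - control_before)

# e.g., switchers rose 48% -> 51% while continuous-census agencies
# rose 50% -> 51%, leaving a 2.0 point estimated census effect
print(first_difference(48.0, 51.0, 50.0, 51.0))  # 2.0
```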

This paper is organized as follows. After a brief review of the pertinent literature, we posit two hypotheses about the relative impacts of informing individuals whether the agency’s workforce was sampled or censused. We then report results from an FEVS 2016 email wording experiment carried out to test these hypotheses. The paper concludes with a summary and discussion of limitations.

Two Hypotheses Regarding the Relative Impacts of a Census Versus a Sample

In their review of the theory of diffusion of responsibility, Barron and Yechiam (2002) cite two studies relevant to the question of whether a census might be associated with a response rate increase. The first is Darley and Latané (1969), who found one’s motivation to help is tempered when others are perceived as able to help. The second is Diekmann (1985), who discovered in a similar vein that individuals in a game setting were less likely to volunteer to help the greater good of the group if they knew someone else already had volunteered. Viewing the solicitation to participate in an organizational climate survey such as the FEVS as a request for help (i.e., to provide opinions and perspectives used to drive organizational improvements), our first hypothesis is that employees will be less inclined to participate in the survey if made aware that all other employees in the organization have also been asked to participate.

A corollary of the argument calling for an FEVS census is that surveying only a sample of an organization’s employees is somehow associated with lower response rates. Here, too, the literature seems to suggest otherwise. Groves et al. (1992) outline a set of psychological constructs that factor into a given individual’s decision to participate in a survey, maintaining that individuals may be more likely to comply if the requestor highlights the scarcity of the opportunity, such as “not all employees have been given the chance to participate” or “only a select number of employees like you were randomly selected.” Indeed, Porter and Whitcomb (2003) found this strategy to be effective in a survey of high school students. As such, our second hypothesis stipulates that informing the employee he or she is part of a random sample will result in a response rate increase.

Data and Methods

To test the two hypotheses laid out in the previous section, during the FEVS 2016 administration that ran from April 26 to June 16, we systematically manipulated the messaging embedded in reminder emails sent to 242,717 employees from four agencies: two conducting a census and two conducting a sample. The two censused agencies were the Department of the Interior (DOI) and the Environmental Protection Agency (EPA); the two sampled agencies were the Department of Homeland Security (DHS) and the Department of Veterans Affairs (VA). All employees in these agencies received the same initial survey invitation. Thereafter, one-half of the employees were randomly assigned to a condition where the merits of a census or a sample, respectively, were heavily emphasized in subsequent reminders. We refer to this as pro-census or pro-sample wording. The other half was assigned to a condition not mentioning the sampling strategy, which we refer to as neutral wording. Three examples are provided in the Appendix.

Our key outcome measure of interest is the response rate, which we report in accordance with the RR1 definition of the American Association for Public Opinion Research (AAPOR 2016). To determine whether a case counted as a complete, we used the same rule described in OPM (2015), whereby the respondent must have answered at least 21 (i.e., 25%) of the 84 nondemographic survey items. Employees determined to have left their position (e.g., retired, took a job in the private sector) during the six-month lag between the time the sampling frame was produced and the start of the survey field period were considered ineligible and removed from the denominator of the response rate calculation.
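The completeness rule and the eligibility-adjusted response rate amount to simple arithmetic, sketched below. This is an illustrative reconstruction rather than the FEVS production logic, and the invited/ineligible split in the example is invented (though the resulting quotient matches the EPA pro-census cell of Table 1).

```python
def is_complete(items_answered: int, total_items: int = 84) -> bool:
    """A case counts as a complete if at least 25% of the
    nondemographic items (21 of 84) were answered."""
    return items_answered >= 0.25 * total_items

def response_rate(n_complete: int, n_invited: int, n_ineligible: int) -> float:
    """RR1-style rate: completes over invitees, after removing employees
    found ineligible (e.g., left the agency) from the denominator."""
    return n_complete / (n_invited - n_ineligible)

print(is_complete(21), is_complete(20))  # True False
# e.g., 5,049 completes among 7,100 invited with 74 found ineligible
print(round(100 * response_rate(5_049, 7_100, 74), 1))  # 71.9
```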

Results

Table 1 summarizes results from the FEVS 2016 email wording experiment. Emphasizing a census produced mixed results, so our first hypothesis was not upheld. A slightly positive effect (0.4 percentage points) was observed for DOI, but a negative effect was observed for EPA (−0.8 percentage points). Because DOI is much larger than EPA, the net result was a marginal response rate increase of 0.2 percentage points, a difference that was not statistically significant (t = 0.33; p = 0.3704). On the other hand, our second hypothesis was upheld. We found informing the employee that he or she was one of a select number of individuals sampled to participate had a positive effect on response rates for both DHS (0.3 percentage points) and VA (1.4 percentage points). Combined, the overall response rate increase was 0.9 percentage points. Although modest in magnitude, the increase was statistically significant (t = 2.42; p = 0.0078).

Table 1 FEVS 2016 wording experiment response rates by agency and condition.

Agency Adjusted sample size* Respondents Response rate
Censused agencies
 1. Environmental Protection Agency
   Pro-census wording 7,026 5,049 71.9%
   Neutral wording 7,028 5,107 72.7%
 2. Department of the Interior
   Pro-census wording 22,985 11,567 50.3%
   Neutral wording 23,121 11,531 49.9%
Censused agencies totals
 Pro-census wording 30,011 16,616 55.4%
 Neutral wording 30,149 16,638 55.2%
Sampled agencies
 1. Department of Homeland Security
   Pro-sample wording 46,874 23,595 50.3%
   Neutral wording 46,835 23,396 50.0%
 2. Department of Veterans Affairs
   Pro-sample wording 44,401 15,448 34.8%
   Neutral wording 44,447 14,865 33.4%
Sampled agencies totals
 Pro-sample wording 91,275 39,043 42.8%
 Neutral wording 91,282 38,261 41.9%

*Adjusted sample size excludes employees determined to be ineligible due to leaving their position with the agency for any reason between the time of sample selection and survey administration.
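As a quick arithmetic check, the pooled rates and differences reported in Table 1 follow from simple unweighted division of respondents by adjusted sample size:

```python
def rate(respondents: int, adjusted_n: int) -> float:
    """Response rate in percent: respondents over adjusted sample size."""
    return 100 * respondents / adjusted_n

pro_sample = rate(39_043, 91_275)   # sampled agencies, pro-sample wording
neutral_s  = rate(38_261, 91_282)   # sampled agencies, neutral wording
pro_census = rate(16_616, 30_011)   # censused agencies, pro-census wording
neutral_c  = rate(16_638, 30_149)   # censused agencies, neutral wording

print(f"{pro_sample:.1f}% vs {neutral_s:.1f}% (diff {pro_sample - neutral_s:+.1f} pp)")
# 42.8% vs 41.9% (diff +0.9 pp)
print(f"{pro_census:.1f}% vs {neutral_c:.1f}% (diff {pro_census - neutral_c:+.1f} pp)")
# 55.4% vs 55.2% (diff +0.2 pp)
```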

Discussion

The purpose of this paper was to present results from an experiment carried out during FEVS 2016 to investigate the veracity of speculation within the FEVS community that response rates could be boosted if all employees within an agency were given the opportunity to participate. Relatedly, we sought to assess whether the sampling strategy currently implemented for 14 of the 85 participating agencies was negatively impacting response rates. The experimental design called for randomly apportioning employees from four agencies – two conducting a census and two conducting a sample – into two groups. The first group received email reminders with wording emphasizing that the agency was reaching out to all employees in the form of a census or, where applicable, that only a sample of employees had been granted the opportunity to participate. For both conditions, a control group within the agency received email reminders not mentioning the sampling strategy.

Our findings do not support the prevailing notion that a higher level of response could be achieved if the FEVS were to census the workforces of all participating agencies. There was no substantive increase in response rates observed when highlighting to respondents that a census was being undertaken. On the other hand, we did observe a modest, albeit statistically significant, response rate increase when emphasizing to the employee that he or she was part of a randomly selected sample.

Despite the large sample size of approximately 250,000 employees, our study was limited in scope in that it focused on only four of the 85 participating agencies. However, these four agencies do represent roughly 25% of the overall FEVS 2016 sample and were purposefully selected for our experiment due to their expressed desire to conduct a census in administrations following FEVS 2012. We acknowledge it would have been preferable to first randomly assign a larger number of agencies to be either sampled or censused prior to randomly assigning employees therein to the two messaging conditions. We opted against this, however, because we were concerned that breaking away from the established sample-versus-census determination rules detailed in OPM (2015) could have influenced results by inadvertently advertising the experiment.

The primary practical implication of our research is that, as was asserted in Groves et al. (1992) and previously demonstrated empirically by Porter and Whitcomb (2003), emphasizing the scarcity of the survey opportunity can help improve response rates. The improvement may be slight and likely varies depending on the target population, survey topic, and the medium of emphasis. For example, whereas Porter and Whitcomb (2003) found an effect of about 6 percentage points, we observed an effect of about 1 percentage point. While there could be other factors at play, one possible explanation of the more muted effect we observed is that it is harder to convey the scarcity of the opportunity to participate in a large-scale, highly publicized annual survey like the FEVS. With its current 50% marginal sampling rate, 1 out of every 2 individuals in the target population is sampled. Hence, individuals are just as likely not to be given the opportunity to participate as they are to be afforded the opportunity.

Disclaimer

The opinions, findings, and conclusions expressed in this article are those of the authors and do not necessarily reflect those of the U.S. Office of Personnel Management.

References

American Association for Public Opinion Research (AAPOR). 2016. Standard definitions: final dispositions of case codes and outcome rates for surveys (9th ed.). AAPOR, Ann Arbor, MI.
Barron, G. and E. Yechiam. 2002. Private e-mail requests and the diffusion of responsibility. Computers in Human Behavior 18(5): 507–520.
Berry, J. 2012. Guide for interpreting and acting on federal employee viewpoint survey results. Memorandum to the Chief Human Capital Officers (CHCO) Council. Available at https://www.chcoc.gov/content/guide-interpreting-and-acting-federal-employee-viewpoint-survey-results.
Darley, J. and B. Latané. 1969. Bystander “apathy”. American Scientist 57(2): 244–268.
Diekmann, A. 1985. Volunteer’s dilemma. Journal of Conflict Resolution 29(4): 605–610.
Groves, R., R. Cialdini and M. Couper. 1992. Understanding the decision to participate in a survey. Public Opinion Quarterly 56(4): 475–495.
Lewis, T. and K. Hess. 2015. An experiment testing alternative email contact timing strategies in a web-based survey of federal personnel. Proceedings of the Federal Committee on Statistical Methodology (FCSM) Research Conference. Available at https://s3.amazonaws.com/sitesusa/wp-content/uploads/sites/242/2016/03/G2_Lewis_2015FCSM.pdf.
Lewis, T., L. Latimore and N. Graf. 2016. Are there substantive differences between sampling and censusing employees in organizational climate surveys? Proceedings of the FedCASIC Workshops. Available at https://www.census.gov/fedcasic/fc2016/ppt/1_5_Lewis.pdf.
Parker, S. 2011. Sampling versus census: a comparative analysis. TNS Employee Insights Report. Available at http://www.hr.com/en/app/media/resource/_hcoegz03.deliver?&layout=og.pdf&mode=download.
Porter, S. and M. Whitcomb. 2003. The impact of contact type on web survey response rates. Public Opinion Quarterly 67(4): 579–588.
Stier, M. 2016. The best and worst places to work in the federal government. Written statement prepared for a hearing of the House Committee on Oversight and Government Reform, Subcommittee on Government Operations. Available at https://oversight.house.gov/wp-content/uploads/2016/04/2016-04-27-Max-Stier-PPS-Testimony.pdf.
United States Office of Personnel Management (OPM). 2015. Federal employee viewpoint survey results: technical report. Available at http://www.fedview.opm.gov/2015/published/.
Wooldridge, J. 2012. Introductory econometrics: a modern approach (5th ed.). Cengage Learning, Mason, OH.

Appendix: Examples of Email Messaging Conditions

  1. Example of a pro-census email message

    Inspire change through your participation in the Federal Employee Viewpoint Survey!

    What matters most to you as a Federal employee? If you had the opportunity to speak directly with your agency’s senior leaders, what would you say?

    To get the most comprehensive view possible about what’s working well in <AGENCY NAME> and what areas need improvement, we are reaching out to each and every employee. All voices are important!

    If you have not yet completed the 2016 FEVS, take this opportunity to fill out the survey. This is your chance to voice your opinions and let your leadership know which issues are most critical to you.

    <URL HERE>

    If the link does not take you directly to the survey, copy and paste the following into a browser window: <URL HERE>

    Please DO NOT forward this e-mail since it contains your personalized link to the survey.

    Please reply to this message if you have any questions or difficulties accessing the survey, or call our Survey Support Center toll free at: 1-855-OPM-FEVS (1-855-676-3387).

  2. Example of a pro-sample email message

    Inspire change through your participation in the Federal Employee Viewpoint Survey!

    What matters most to you as a Federal employee? If you had the opportunity to speak directly with your agency’s senior leaders, what would you say?

    We know you are busy, but your opinions are very important. Only a select number of <AGENCY NAME> employees have been asked to participate. Your answers to the survey will represent both you and your colleagues who were not selected to participate.

    If you have not yet completed the 2016 FEVS, take this opportunity to fill out the survey. This is your chance to voice your opinions and let your leadership know which issues are most critical to you.

    <URL HERE>

    If the link does not take you directly to the survey, copy and paste the following into a browser window: <URL HERE>

    Please DO NOT forward this e-mail since it contains your personalized link to the survey.

    Please reply to this message if you have any questions or difficulties accessing the survey, or call our Survey Support Center toll free at: 1-855-OPM-FEVS (1-855-676-3387).

  3. Example of a neutral message

    Inspire change through your participation in the Federal Employee Viewpoint Survey!

    What matters most to you as a Federal employee? If you had the opportunity to speak directly with your agency’s senior leaders, what would you say?

    If you have not yet completed the 2016 FEVS, take this opportunity to fill out the survey. This is your chance to voice your opinions and let your leadership know which issues are most critical to you.

    <URL HERE>

    If the link does not take you directly to the survey, copy and paste the following into a browser window: <URL HERE>

    Please DO NOT forward this e-mail since it contains your personalized link to the survey.

    Please reply to this message if you have any questions or difficulties accessing the survey, or call our Survey Support Center toll free at: 1-855-OPM-FEVS (1-855-676-3387).

Footnote
1 The target population includes permanently employed, full- or part-time, nonpolitical, nonseasonal, civilian personnel on board the agency at least six months prior to the start of the data collection field period.

