WERA1010: Improving Data Quality from Sample Surveys to foster Agricultural and Community Development in Rural America

(Multistate Research Coordinating Committee and Information Exchange Group)

Status: Inactive/Terminating

Duration: 10/01/2018 to 09/30/2023

Administrative Advisor(s):


NIFA Reps:


Non-Technical Summary

Statement of Issues and Justification

A major challenge persists in obtaining data that enable us to understand the human capital characteristics of rural regions of the United States and the development challenges they face. As noted in the 2013 WERA 1010 proposal, the Decennial Census Long Form, used throughout the 20th century to collect essential data on population characteristics from 1 in every 6 households throughout the United States, was discontinued after the 2000 Census. As a result, there is no longer a regular source of reliable data on the characteristics of people who live in each county and community of the United States.


Its replacement, the American Community Survey, collects data from about two million U.S. households each year, thus making it possible to produce acceptable city and urban county estimates (for population characteristics including age, education, income, occupation, commute time to work, and other essential indicators of human capital and well-being). Data from this survey can also be accumulated across rural counties over multiple years so that acceptable estimates for rural regions containing a number of counties may also be obtained. However, if one's interest is in a specific rural county or even small clusters of rural counties, no reliable data now exist. This is, in particular, a problem for sparsely populated regions of the western United States. The absence of such data becomes a major concern when professionals and businesses interested in economic development activities are trying to assess and use key information on the economic development potential of such areas.


The void produced from loss of the long form Census data is not filled by other national surveys. The sample sizes for most nationwide surveys are too small to reliably measure attributes of specific rural areas because the variability is so large, and, in many cases, sample sizes are too small for use at the state level. If data are going to be available for rural places and people on their human capital characteristics and other relevant information including farm and agricultural group activities and interests, it is essential that data on these populations be collected in other ways.


The further we get from our last county-level benchmark data obtained in the 2000 Census, the more challenging it is to know with reasonable certainty what is happening demographically, as well as what economic development issues are facing specific rural areas of the United States. It is important to continue developing methods that will allow geographically specific projects to obtain household data suitable for guiding economic development decisions.


Understanding attitudes, behaviors, and demographic characteristics of the general public is only part of the data problem that now prevails among agricultural and rural populations. Sample surveys, many of which have been conducted by professionals in Agricultural Research and Extension programs, have long been used to provide specific data of interest to local municipalities as a needed supplement to the questions formerly asked in the Decennial Census. In addition, sample surveys have been used to regularly and efficiently obtain information from agricultural production groups, rural interest groups, and others to describe problems and identify solutions for which no official statistics are available.


The capability of sample surveys that sets them apart from other methods of collecting data is that only a few hundred or a few thousand questionnaires, collected from a carefully designed random sample of a specific population (e.g., a rural community), allow one to estimate characteristics (from attitudes to behaviors) within a few percentage points of the actual population parameters at a high level of statistical confidence for the population being studied (Dillman, Smyth, & Christian, 2014). No other social science data collection method has this capability. However, this capability is in danger of not being realized because of rapid technological and other changes that make traditional data collection methods less effective. Finding ways to improve sample survey methods so they can be used to meet rural and agricultural data needs is the proposed purpose for this coordinating committee.
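To make this precision claim concrete, the short calculation below is a minimal sketch, not taken from the WERA materials: it computes the 95% margin of error for an estimated proportion from a simple random sample, using an assumed worst-case proportion of 0.5 and an illustrative community of 5,000 households to show the finite population correction.

import math

def margin_of_error(n, p=0.5, z=1.96, population=None):
    """95% margin of error for a proportion estimated from a simple random sample."""
    se = math.sqrt(p * (1 - p) / n)
    if population is not None and population > n:
        # Finite population correction: sampling a large share of a small
        # community (e.g., a rural town) shrinks the standard error further.
        se *= math.sqrt((population - n) / (population - 1))
    return z * se

for n in (400, 1000):
    print(f"n={n}: +/-{margin_of_error(n):.1%} (large population), "
          f"+/-{margin_of_error(n, population=5000):.1%} (community of 5,000)")

With only 400 completed questionnaires the margin of error is already under five percentage points, which is the basis for the statement above.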


Background


In the last quarter of the 20th century, telephone surveys became the main method of conducting general public surveys needed for estimating a wide range of behaviors, from employment rates and the effects of rural development initiatives on entrepreneurial efforts to the consumption of food products. During this time, the telephone also became the dominant mode for surveying opinions, e.g., community and employment satisfaction, desire for new products and services, and satisfaction with rural and urban development activities. Face-to-face interviews continue to be used for the nation's most critical national surveys (e.g., the monthly Current Population Survey's estimate of employment rates and the USDA's Agricultural Resource Management Annual Survey of farm operators), and mail surveys are used for surveys of list populations (e.g., participants in Extension programs whose names and addresses are available).


However, telephone survey methods are no longer working well for producing valid statistical estimates because of low response rates (often less than 10%) and reduced household coverage. Fewer than 50% of households now have landlines, the traditional sample frame for drawing household samples via random-digit dialing. Cell phones can now be included in random sample frames, but because area codes are no longer a valid indicator of where one lives, sub-national (state, county, and community) surveys by telephone cannot be counted on to accurately represent the populations they purport to describe.


In response to this trend, some surveyors have turned to web-only surveys as an alternative data collection strategy. The difficulty with this proposed solution is that only about 84% of U.S. households have access to the web, and a smaller percentage of households use that access more than once every two weeks. Some who do have web access also seem unwilling to respond to web surveys. In addition, there is no usable sample frame for contacting the general public by email, compounding the problem of gaining household responses over the web. It is also evident that those without web access have significantly less education, lower incomes, are less likely to be married, have less stable employment, and differ in other significant ways (Brenner & Rainie, 2012; Zickuhr & Smith, 2012). Web-only surveys are not yet able to produce valid results in household surveys.


The Work of WERA 1010 and Its Predecessors


WERA 1010, now in its fifth and final year, has a long history of studying ways to improve survey methodologies for rural and agricultural surveys, which provide data that cannot be obtained for much of Rural America by any other means. Research has been aimed at improving mail, web, and telephone methods, including their use in mixed-mode surveys, whereby some people are surveyed via one mode (e.g., mail) and others by a second mode (e.g., web or telephone).


Research and its application to survey practice has been undertaken cooperatively for more than 25 years under Regional Project W-183, followed by Western Region Coordinating Committees (WERA 1001 and WERA 1010), with the aim of improving survey methods for use in rural and agricultural surveys. During the last five years, research under WERA 1010 has been conducted with increased urgency, hoping to find solutions to current sample survey design problems.


It is important to put the work of WERA 1010 into the context of work conducted during the previous ten years, as well as work under WERA 1001 (2002-2007). Historically, that committee attempted to focus its interactions and work around the most pressing issues facing survey methodology. Thus, the focus of WERA 1001 was predominantly on measurement issues that affected the use of mixed-mode surveys whereby data were collected by aural (interview) and visual (self-administered) questionnaires, considered then to be the major problem being experienced in survey methodology. A significant proportion of the publications produced by that committee reported measurement experiments aimed at learning the extent to which, and the reasons why, visual questionnaires often produced different answers to the same question (Christian & Dillman, 2004).


By 2007, it became apparent that the prevalence of landline telephones was destined to continue its decline. Because of the reduced coverage as well as dismal response rates, a new approach to surveying was needed. It was also becoming apparent that the best sample frame for reaching rural households was the U.S. Postal Service’s residential Delivery Sequence File which became available for survey sampling. WERA participants began researching the possibilities this list provides as a replacement for telephone methods because of pioneering work by Todd Rockwood, a professor at the University of Minnesota who began attending WERA meetings at that time, prior to becoming a member of WERA 1010.


When WERA 1010 was formed there was a significant shift in research conducted and disseminated by its members. At that time the predominant thinking among survey researchers was that contacting people by mail and asking them to respond over the web, if they were able, would result in very low response rates. WERA members began researching several aspects of this problem. A head-to-head comparison of telephone-only with mail-only and mail with a push to the web was begun by Lesser et al. (2011a). That work clearly showed that response rates to a mail-only approach were superior to the response rates that could be obtained by telephone only.


Work by Dillman and a research team at Washington State University provided strong evidence that it was possible to push a significant proportion (43%) of households to the web. However, this work also showed that following up the early web response with a mail follow-up obtained responses from people (older, less educated, with lower incomes) not able or willing to respond over the web. The resulting publication, Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century (Smyth, Dillman, Christian & O'Neill, 2010), provided initial evidence that pursuing a web+mail methodology could provide an effective alternative to the telephone-only survey approach that had dominated survey methodology from the 1970s through the 1990s. It also showed that the method could be especially effective in rural communities and regions, where American Community Survey demographic estimates are least likely to be available.


The methods developed by this team of researchers are now being used worldwide to conduct household surveys. The Japanese Census (2015), Australian Census (2016), and Canadian Census (2016) have adopted web-push methods. In the United States, the American Community Survey began using web-push methods in 2013, and the Census Bureau plans to use them in the 2020 Decennial Census. Each of these agencies has done additional research to make those methods, along with in-person follow-up, work for their situation. However, the fundamental research that encouraged these adaptations was conducted under WERA 1001 and WERA 1010, as described by Dillman (2017).


This work on general public surveys was greatly expanded during the life of WERA 1010. Statewide general public survey experiments in Oregon (Lesser et al., 2007a; Lesser et al., 2008a; Lesser et al., 2008b; Lesser & Yang, 2009; Lesser et al., 2010; Lesser et al., 2011a; Lesser et al., 2011b; Lesser et al., 2016) and Washington (Messer & Dillman, 2011; Messer, 2012) confirmed that the web+mail methodology could be effectively used to push many respondents to the web. It also showed that the mail follow-up was essential for achieving better representation of the population, as measured by American Community Survey results (which Smyth et al., 2010, could not evaluate because such data were not adequate for estimating true population characteristics of the rural region in Washington and Idaho that was studied). Studies in these states have used different methodologies, survey topics, and populations, but all find that there is much potential for combining mail and web data collection. Finally, Israel (2009a; 2009b; 2010b) showed this was true for list populations too.


WERA 1010 pursued many different dimensions of this challenge. Work was also begun on how one might leverage the survey situation of having both postal addresses and email addresses for particular clientele groups. Several studies by Israel and his colleagues tested multiple ways of combining mail contact and email follow-ups in order to bolster responses and improve data quality for evaluation surveys of Extension clients (Israel, 2011; 2012a; Newberry & Israel, 2017). This work found that email augmentation (sending quick emails to make responding easier for individuals who had just been contacted by mail), an approach also tested on students by Millar and Dillman (2011), was effective for client surveys.


Other work by WERA 1010 members was aimed at advancing research on visual design and layout effects on measurement studied by WERA 1001. This work included open-ended question effects (Israel, 2010a; Israel & Galindo-Gonzalez, 2010; Kumar Chaudhary & Israel, 2016a; Lesser & Newton, 2007b; Smyth & Dillman, 2007; Swinford, 2007), response option format (Smyth et al., 2006a), and general visual layout effects (Christian & Dillman, 2004; Dillman et al., 2005; Israel, 2006; Mahon-Haft & Dillman, 2010; Rookey & Dillman, 2008; Smyth et al., 2006b; Toepoel & Dillman, 2011). Work also was conducted regarding effects of personalization and incentives on response behavior (Beebe et al., 2005; Dillman et al., 2007; Wilcox et al., 2010). In addition, considerable research involving Fred Lorenz (Conger et al., 2011; Lorenz et al., 2010) was focused on applying data collected in WERA research experiments to the design and analysis of longitudinal surveys.


One of the challenging issues faced by WERA 1010 was a finding that permeated the work of all members: as a general rule, a mail-only data collection strategy consistently produced higher response rates than the web+mail strategy developed by committee members. For example, 10 comparisons of mail-only vs. web+mail designs conducted from 2007 to 2012 have shown that in every case the mail-only response rates are higher than the web+mail treatments, with a difference of 2-15 percentage points (mean = 10 percentage points) (Smyth et al., 2010; Messer & Dillman, 2011; Messer, 2012; Messer et al., 2012).


Mail has traditionally been considered by many surveyors to be old-fashioned and to suffer from many problems, e.g., higher item nonresponse than web, the difficulty of using extensive numbers of branching questions, higher costs for printing, postage, and labor, and more time to collect the data. Yet, it became apparent to members of WERA 1010 that mail-only surveys had become our highest response rate method for many surveys and, for the general public, had superior coverage to other modes. Address-based samples that rely on residential mailing addresses obtained from the U.S. Postal Service are now recognized as providing the best coverage of households in the United States, covering approximately 98% of all residences. Thus, part of our work shifted to identifying and jointly addressing problem issues associated with mail and how to maximize its effectiveness in surveying the general public (Harter et al., 2016; Battaglia et al., 2016). A related question for which we sought, and are still seeking, answers is the optimal mix of web+mail for achieving cost and data quality objectives.


For example, it has been argued that higher item nonresponse rates to mail questionnaires diminish the effectiveness of obtaining higher unit response rates. WERA members organized a special session at the 2011 meetings of the American Association for Public Opinion Research to report findings from our research. The papers presented there were published in 2012 by Survey Practice, an electronic journal created by AAPOR to bring important research findings to practitioners in a format that they could use. The consistent finding of these articles (Messer, Edwards & Dillman, 2012; Lesser, Newton, & Yang, 2012; Israel & Lamm, 2012; Millar & Dillman, 2012) is that although item nonresponse is higher for mail in most studies, the differences are only slightly greater, suggesting this problem is not an important barrier to the use of a web+mail methodology. One possible reason for the small difference between mail and web item nonresponse is that WERA members constructed both web and mail questionnaires using the visual design techniques developed and tested by WERA 1001 and WERA 1010 members.


One of the other means by which results from WERA's work have been disseminated to practitioners is the book, Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th edition (Dillman, Smyth, & Christian, 2014). That book on data collection methods included most of the major findings and recommendations developed through work under WERA 1001 and WERA 1010. This book, in its combined editions, appears to be the most cited reference for designing sample survey data collection procedures now in print.


Renewal of WERA 1010 will allow current members, as well as new members, to continue and initiate new coordinated scientific work across states on the design and conduct of sample surveys. It is apparent as this proposal is written that the work of this committee will focus in part on refining the web+mail methodology invented by members of this committee, for example, testing the limits of holding back the mail follow-up until later in the implementation process, testing the ability to collect data for lesser-known sponsors (e.g., universities in one state trying to collect data in another), and testing the judicious use of incentives more than once during the data collection process. We will work to identify the most cost-efficient design using both web and mail that will optimize response rates for the lowest cost per completed questionnaire. We will also investigate how new technologies and media can be used in our survey methods, including recruitment to surveys, distribution and collection of surveys, and delivery of digital content (e.g., audio and video) in surveys. In addition, we will explore the use of smartphones, tablets, and mobile communication devices to better determine whether surveys can be successfully completed using these evolving technologies. The advent of integrative technologies, including QR codes, might also provide opportunities to advance survey methodologies.
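As a simple illustration of the cost-per-completed-questionnaire comparison described above, the sketch below uses entirely hypothetical response rates and unit costs (they are assumptions, not WERA findings) to show how a mail-only design might be weighed against a web+mail push design.

def cost_per_complete(n_sampled, response_rate, fixed_cost, cost_per_contact,
                      contacts, cost_per_return):
    """Total survey cost divided by the number of completed questionnaires."""
    completes = n_sampled * response_rate
    total_cost = (fixed_cost
                  + n_sampled * contacts * cost_per_contact   # mailings/invitations
                  + completes * cost_per_return)              # processing each return
    return total_cost / completes

# Assumed design parameters for a hypothetical sample of 2,000 addresses.
mail_only = cost_per_complete(2000, 0.45, fixed_cost=1500, cost_per_contact=1.20,
                              contacts=4, cost_per_return=2.50)
web_push = cost_per_complete(2000, 0.38, fixed_cost=2500, cost_per_contact=0.90,
                             contacts=4, cost_per_return=0.75)
print(f"mail-only: ${mail_only:.2f} per complete; web+mail: ${web_push:.2f} per complete")

In practice, the response rates and costs would come from the committee's experiments, and the preferred design is the one that meets data quality objectives at the lowest cost per complete.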


With the emergence of sophisticated survey software platforms, an increasing number of web surveys are being conducted with nonprobability samples. Recognizing the need for consumer input, a number of companies have emerged over the past few years advertising their ability to solicit opinions from a sample of online users. These companies recruit individuals using pop-up screens that may appear when visiting certain websites. Individuals recruited in this way, along with others recruited through business partners, form a panel based on nonprobability sampling. A client can purchase the use of this panel from the company to assess members' opinions on a product. However, these panels are not based on probability sampling, which relies on random selection, and thus may not be representative of the population that a client desires. WERA members have begun exploring the utility of nonprobability samples for item development and examining the comparability of survey data from nonprobability and probability samples (Israel, 2017; Israel et al., 2017; Lesser et al., 2017a and 2017b). The initial studies suggest that nonprobability samples produce significantly different estimates from those of probability sample surveys; even after accounting for demographic differences between the sample and the population, it is unclear what causes these differences and whether further adjustments can be made in order to incorporate these data with data collected from probability sampling. Given that nonprobability samples for web surveys can be fast and inexpensive, further research is needed to identify procedures, including weighting schemes and merging with probability sample data, that might provide reasonably accurate, unbiased estimates for decision makers in rural America.
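One common weighting scheme of the kind mentioned above is raking (iterative proportional fitting), which adjusts a nonprobability sample so that its weighted margins match known population distributions, for example from the American Community Survey. The sketch below is a minimal illustration with made-up respondents and assumed target shares; it does not imply that raking alone removes the differences the initial studies observed.

from collections import defaultdict

# Five made-up panel respondents and assumed population margins.
respondents = [
    {"age": "18-44", "educ": "no_degree"},
    {"age": "18-44", "educ": "degree"},
    {"age": "18-44", "educ": "degree"},
    {"age": "45+", "educ": "degree"},
    {"age": "45+", "educ": "no_degree"},
]
targets = {
    "age": {"18-44": 0.45, "45+": 0.55},
    "educ": {"no_degree": 0.65, "degree": 0.35},
}

weights = [1.0] * len(respondents)
for _ in range(50):  # alternate adjustments until the margins converge
    for variable, shares in targets.items():
        totals = defaultdict(float)
        for weight, resp in zip(weights, respondents):
            totals[resp[variable]] += weight
        grand_total = sum(totals.values())
        for i, resp in enumerate(respondents):
            weights[i] *= shares[resp[variable]] * grand_total / totals[resp[variable]]

print([round(w, 2) for w in weights])  # raked weights; they still sum to the sample size

Whether estimates weighted this way, alone or blended with probability sample data, are accurate enough for decision makers is precisely the open question the committee plans to pursue.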


Another area of investigation to be developed is the effectiveness of alternative communication strategies for improving response to surveys. Preliminary work suggests that the approaches being used for communicating with potential respondents are not working well in federal surveys, and may be causing considerable distrust among respondents. A test of social exchange theory vs. a marketing theory, called persuasion theory (Cialdini, 2016), suggests the latter approach does not improve response and may diminish it (Dillman & Greenberg, 2016).


Another issue that needs deeper investigation in rural areas is the use of drop-off pick-up (DOPU) techniques for conducting surveys. This method has been studied by some members of WERA 1010 and is growing in importance as a cost-effective way of producing quality data from rural areas. The WERA 1010 group continues to discuss this method and is likely to dedicate effort to streamlining best practices and guidelines for this approach, particularly in rural places where it has been shown to be highly effective. Although often associated with research conducted in the state of Utah due to the long-standing use of this approach by researchers at Utah State University, the DOPU method has been used around the country (including California, Nevada, Wyoming, Colorado, Arizona, and Texas; McKim et al., 2015) and internationally. This survey mode relies on human labor to hand-deliver survey questionnaires to sampled units and to retrieve the completed questionnaires on return visits. While response rates associated with other modes continue to decline, DOPU surveys consistently achieve response rates of 70% or higher, particularly in smaller urban or rural communities (Jackson-Smith et al., 2016; Trentelman et al., 2016; McKim et al., 2015; Allred & Ross-Davis, 2011). The success of this approach is based in its application of social exchange theory, suggesting that face-to-face interaction increases the commitment to reciprocity and follow-through by participants (Jackson-Smith et al., 2016).


Also, developing and testing multimedia methods of collecting survey data, including asynchronous digital video diaries (Farias et al., 2015; McKim et al., 2015) and digital media integration into intercept surveys (Hill et al., 2015, 2016), continues to be an area of focus for some WERA 1010 members. Effectively integrating evolving technologies and new media into surveys, including digital audio and video media on laptop-, tablet-, and smartphone-based survey platforms, continues to require exploratory studies that will lead to refined methods for reaching difficult-to-reach and underserved populations, including low socioeconomic status communities, non-native English speakers, populations with low literacy rates, and emerging sub-cultures among generations. Additionally, WERA 1010 members will continue to draw on theories, including social exchange theory and Bandura's (1986; 2001) social cognitive theory, in tandem with biometric technologies to investigate how audiences think, feel, and behave as they interact with analog and digital survey mediums. The expansion of biometric-based investigation of survey mediums and designs may lead to more efficient models for collecting survey data and reduced measurement error in surveys.

Objectives

  1. Continue the interaction and collaboration of researchers and extension faculty using sample survey methods in order to arrive at a better understanding of how to assure the conduct of quality sample surveys during this period of rapid change in survey systems and technologies and the loss of national survey data describing sparsely populated counties and small rural communities.
  2. Continue the exploration of the Postal Delivery Sequence File as a sample source and as a means of pushing respondents to the web in order to achieve cost efficiencies, improved response rates, and reduced nonresponse bias by exploring web/mail and other mixed-mode survey approaches.
  3. Conduct research on emerging issues, including examining data collected from less expensive non-probability approaches to determine if these approaches can be combined with probability methods to obtain unbiased estimates with improved levels of precision, developing and testing methods for reaching traditionally difficult-to-reach and underserved populations, and expanding the use of biometric technologies to investigate analog and digital survey mediums for collecting data.
  4. Encourage and facilitate the joint writing and publication of research results by members of the Coordinating Committee.
  5. Disseminate research findings through teaching, seminars, applied publications and Extension in-service training by members of the Coordinating Committee to reach survey developers and consumers in the land grant system.

Procedures and Activities

WERA 1010 meets annually for researchers to share their activities and plans for the coming year. All participants are encouraged to present tentative plans for their future studies in order to obtain advice and comment from other participants. One of the typical outcomes of these discussions is to encourage other participants to develop a new test and/or repeat a test conducted by other colleagues on the committee in their own survey work. Previous work under WERA and its W-183 predecessor has shown that members are particularly effective in developing new tests because of their roles in experiment station, extension, and other work locations in helping design surveys. WERA members are often consulted by other Agricultural Experiment Station and Extension employees for improving their survey designs. Opportunities come up each year to do experiments by convincing these professionals that inclusion of an experiment will help them design a better survey. The joint work now underway on address-based sampling, the use of mail with web data collection methods, and combining probability and nonprobability sample data are examples of how committee members actively influence others to conduct new experiments not previously planned.


Because survey methodology is changing so rapidly and there is strong pressure to be cost-effective through use of the web, it is difficult to anticipate the exact experiments that members of the committee will conduct during the life of the committee. The typical time lag between development of a new idea and getting it tested by WERA members in this area of research is several months to a year. Committee members report at the annual meeting, typically held in February of each year. When the results of a new idea tested during the year appear promising, another committee member will find a way to provide a further test (and sometimes an exact replication) the same year. Thus, the committee is quite dynamic in its operation. We expect this philosophy of operation to continue under renewal of the coordinating committee.

Expected Outcomes and Impacts

  • Introduce members to innovative ideas for improving survey quality being tested by individual members.
  • Critique proposed survey designs and instruments at least annually, and through follow-up among individuals, in order to improve one another's experiments.
  • Coordinate proposed experimental designs and the dissemination of results across states and agencies.
  • Facilitate, when appropriate, the joint write-up and publication of research results.
  • Update best practices for conducting sample surveys of the general public (especially those in rural areas) which use appropriate technologies.
  • Increase capacity of units in the land grant system for conducting surveys that yield scientifically accurate data for use in assessing needs of client groups, evaluating the effectiveness of extension programs, and understanding issues facing agriculture and Rural America.
  • Infuse research findings on best practices into graduate-level curriculum and non-formal professional development programs.

Projected Participation

View Appendix E: Participation

Educational Plan

Educational outreach to professionals and graduate students involved in agricultural and rural research. WERA committee members have conducted presentations, workshops, and short courses at conferences attended by agricultural and rural researchers, including those of the Rural Sociological Society, the Southern Association of Agricultural Scientists, the American Association for Public Opinion Research, and the American Statistical Association, to inform participants about the latest methodological findings of the committee and their application. We plan to continue our outreach efforts with these and other relevant groups.


Outreach to Extension professionals working in the field. County agents and state specialists frequently conduct surveys to assess needs and evaluate programs while dealing with the constraints of limited resources and access. Committee members will conduct in-service training workshops, which incorporate WERA research findings into practical steps for participants to conduct cost-effective, credible surveys. In addition, WERA members will develop brief fact sheets on selected topics to provide user-friendly advice about survey procedures to county agents and specialists. To date, a WERA member at the University of Florida has developed 20 fact sheets for the Savvy Survey Series, and more than 28,000 fact sheets have been downloaded since the series began in 2013. Finally, WERA committee members will assist, consult, and collaborate with others to conduct surveys of extension audiences (e.g., Singletary & Smith, 2006). In these instances, methods developed by the committee are embedded into the survey design and implementation.


Outreach to survey methodologists, evaluators, and other relevant professional groups. WERA members regularly conduct presentations and workshops for survey methodologists at professional conferences (e.g., the American Association for Public Opinion Research). In addition, members periodically present findings to other relevant groups, such as participants at the American Evaluation Association conference (e.g., Israel, 2012b; Kumar Chaudhary & Israel, 2016b). This facilitates the diffusion of research findings to a broad array of practitioners involved in surveys, including those working in the agricultural and rural development fields.


Publication of research findings in peer-reviewed journals. WERA committee members will publish study results in leading journals in the survey methodology field as well as applied journals used by the committee's primary stakeholders. The committee has established a long-standing record of productivity in well-respected journals, including Public Opinion Quarterly, Rural Sociology, Journal of Official Statistics, and numerous others, and this work is frequently cited by other survey scholars. The committee plans to continue publishing in these venues.


Infusion into undergraduate and graduate curriculum. WERA committee members have continued to integrate members’ work into undergraduate and graduate coursework. To date, more than 300 students at Texas A&M University (TAMU) have completed at least one undergraduate research methods course that integrates results from the annual WERA meetings and Dillman et al. (2014). These students also participated in data collection activities that contribute to one or more goals of WERA 1010. Since 2012, WERA committee members have mentored more than 19 undergraduate students who completed the Undergraduate Research Scholars (URS) program at TAMU and have used survey data collected from WERA 1010 research efforts. Of the former undergraduate students who have completed the URS program, 12 have enrolled in or have completed a graduate program. Further, several former URS participants have secured research jobs including one at the Congressional Research Service (Library of Congress) and another in Google’s media research division. Each undergraduate student has directly benefited from the research of WERA 1010. These formal undergraduate education efforts will be continued and potentially expanded to other WERA members’ campuses.

Organization/Governance

A chair and secretary will be elected annually. The chair will be responsible for developing an agenda for the annual meeting, and facilitating communication among participants throughout the year. The secretary will be responsible for taking minutes and mailing them to the Administrative Advisor and members.

Literature Cited

(References include those cited plus recent additional work by participants that provides selective background for the proposed coordinating committee activities completed under WERA-1001 and WERA-1010).


Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, N.J.: Prentice Hall.


Bandura, A. (2001). Social cognitive theory of mass communication. Media Psychology, 3(3), 265-299. doi: 10.1207/S1532785XMEP0303_03.


Battaglia, Michael, Dillman, Don A., Frankel, Martin R., Harter, Rachel, Buskirk, Trent D., McPhee, Cameron Brook, DeMatteis, Jill Montaquila, and Yancey, Tracey. 2016. Sampling, data collection, and weighting procedures for address-based sample surveys. Journal of Survey Statistics and Methodology 4(4): 476-500.


Beebe, T.J., M.E. Davern, D.D. McAlpine, K.T. Call, and T.H. Rockwood. 2005. Increasing Response Rates in a Survey of Medicaid Enrollees: The Effect of a Prepaid Monetary Incentive and Mixed Modes (Mail and Telephone). Medical Care, 43(4):411-414.


Brenner, Joanna and Lee Rainie. 2012. Pew Internet: Broadband. Pew Internet & American Life Project, Pew Research Center. Accessed June 3, 2012 at: http://pewinternet.org/Commentary/2012/May/Pew-Internet-Broadband.aspx


Christian, L. and D.A. Dillman. 2004. The Influence of Symbolic and Graphical Language Manipulations on Answers to Paper Self-Administered Questionnaires. Public Opinion Quarterly. 68(1):57-80.


Conger, R. D., Stockdale, G. D., Song, H., Robins, R. W., & Widaman, K. F. 2011. Predicting change in substance use and substance use cognitions of Mexican origin youth during the transition from childhood to early adolescence. In Y. F. Thomas, L. N. Price, & A. V. Lybrand (Eds.), Drug use trajectories among African American and Hispanic youth. New York: Springer.


Dillman, D.A., V. Lesser, R. Mason, J. Carlson, F. Willits, R. Robertson, and B. Burke. 2007. Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies. Rural Sociology, 72(4), 632-646.


Dillman, D.A. 2006. Why Choice of Survey Mode Makes a Difference. Public Health Reports, 121(1):11-13.


Dillman, Don A. 2015.  Future Surveys. Bureau of Labor Statistics Monthly Labor Review. November.   http://www.bls.gov/opub/mlr/2015/article/future-surveys.htm.


Dillman, Don A. 2016. Moving Survey Methodology Forward in our Rapidly Changing World: A Commentary, Journal of Rural Social Sciences, 31(3): 160-174.


Dillman, Don A. 2017. The promise and challenge of pushing respondents to the Web in mixed-mode surveys. Survey Methodology, Statistics Canada, Catalogue No. 12-001-X, Vol. 43, No. 1. Paper available as PDF (English): http://www.statcan.gc.ca/pub/12-001-x/2017001/article/14836-eng.pdf


Dillman, Don A. and Michelle L. Edwards. 2016. Chapter 17: Designing a Mixed-Mode Survey. In Wolf, Christof, Joye, Dominique, Smith, Tom W., and Fu, Yang-chih (eds.), Sage Handbook of Survey Methodology (pp. 255-268). Thousand Oaks, CA: Sage Publications.


Dillman, Don A., Feng Hao, and Morgan M. Millar. 2016. Chapter 13: Improving the Effectiveness of Online Data Collection by Mixing Survey Modes. In Fielding, Nigel, Raymond M. Lee, and Grant Blank (eds.), The Sage Handbook of Online Research Methods, 2nd edition (pp. 220-237). London: Sage Publications.


Dillman, D.A., A. Gertseva, and T. Mahon-Haft. 2005. Achieving Usability in Establishment Surveys Through the Application of Visual Design Principles. Journal of Official Statistics, 21(2):183-214.


Dillman, D.A. and L.M. Christian. 2005. Survey Mode as a Source of Instability Across Surveys. Field Methods, 17(1):30-52.


Dillman, D. A., Smyth, J. D., & Christian, L. M. 2014. Internet, phone, mail, and mixed-mode surveys: The tailored design method. (4th ed.). Hoboken, NJ: John Wiley and Sons.


Edwards, Michelle L., Don A. Dillman and Jolene D. Smyth. 2014. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly, 78 (3): 734-750.


Farias, K., McKim, B. R., Yopp, A. M., & Hernandez, F. (2015, August). Using video diaries as an alternative to mail diaries: Engaging Millennials in hard-to-reach populations. Proceedings of the 2015 Annual Meeting of the Rural Sociological Society. Madison, WI.


Harter, Rachel, Battaglia, Michael P., Buskirk, Trent D., Dillman, Don A., English, Ned, Fahimi, Mansour, Frankel, Martin R., Kennel, Timothy, McMichael, Joseph, McPhee, Cameron Brook, Montaquila, Jill, Yancey, Tracie, and Zukerberg, Andrew L. 2016. Address-based Sampling. American Association for Public Opinion Research Task Force Report, 140 pages. http://www.aapor.org/getattachment/Education-Resources/Reports/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx


Hill, J., Mobly, M., & McKim, B. R. (2015, February) Reaching Millennials: Implications for advertisers of competitive sporting events that use animals. Proceedings of the 2015 Agricultural Communications section of the Annual Meeting of the Southern Association of Agricultural Scientists. Atlanta, GA.


Hill, J. S., Mobly, M., & McKim, B. R. (2016). Reaching Millennials: Implications for Advertisers of Competitive Sporting Events that Use Animals. Journal of Applied Communications.


Israel, G. D. 2006. Visual Cues and Response Format Effects in Mail Surveys. Paper presented at Southern Rural Sociological Association, Orlando, FL, February.


Israel, G. D. 2009a. Obtaining Responses by Mail or Web: Response Rates and Data Consequences. Survey Practice, June. Available at: http://surveypractice.org/2009/06/29/mail-vs-web/.


Israel, G. D. 2009b. Obtaining Responses by Mail or Web: Response Rates and Data Consequences. JSM Proceedings, Survey Research Methods Section. 5940-5954. Available at: http://www.amstat.org/Sections/Srms/Proceedings/.


Israel, G. D. 2010a. Effects of Answer Space Size on Responses to Open-ended Questions in Mail Surveys. Journal of Official Statistics, 26(2), 271-285.


Israel, G. D. 2010b. Using Web Surveys to Obtain Responses from Extension Clients: A Cautionary Tale. Journal of Extension, 48(4), available at: http://www.joe.org/joe/2010august/a8.php.


Israel, G. D. 2011. Strategies for Obtaining Survey Responses from Extension Clients: Exploring the Role of E-mail Requests. Journal of Extension, 49(3), available at: http://www.joe.org/joe/2011june/a7.php.


Israel, G. D. 2012a. Combining Mail and E-mail Contacts to Facilitate Participation in Mixed-Mode Surveys. Social Science Computer Review. Published online November 28, 2012 at http://ssc.sagepub.com/content/early/2012/11/26/0894439312464942. doi: 10.1177/0894439312464942.


Israel, G. D. 2012b. Mixed-Mode Methods for Follow-up Surveys of Program Participants. Demonstration presented at the Annual Conference of the American Evaluation Association, Minneapolis, MN. October.


Israel, G. D. Lessons Learned While Planning and Conducting a Survey of Florida Residents about Climate Change Opinions. Presented at the annual meeting of the Rural Sociological Society, Columbus, OH, July, 2017.


Israel, G. D., & Galindo-Gonzalez, S. 2010. Getting Optimal Answers to Open-ended Questions: An Experiment with Verbal Prompts and Visual Cues. Paper presented at the annual meeting of the Rural Sociological Society, Atlanta, GA, August.


Israel, G. D., & Lamm, A. J. 2012. Item Non-response in a Client Survey of the General Public. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/item-nonresponse-in-a-client-survey-of-the-general-public/#more-6070.


Israel, G. D., Newberry, III, M. G., & Lamm, A. J. Climate Change Knowledge and Perceptions of Florida Residents: Challenges and Opportunities for Florida Master Naturalists. Paper presented at the International Symposium for Society and Resource Management, Umeå, Sweden, June, 2017.


Kumar Chaudhary, A., & Israel, G. D. 2016a. Influence of Importance Statements and Box Size on Response Rate and Response Quality of Open-ended Questions in Web/Mail Mixed-Mode Surveys. Journal of Rural Social Sciences, 31(3), 140-159.


Kumar Chaudhary, A., & Israel, G. D. 2016b. A Demonstration on Optimizing Mixed-Mode Surveys to Address Device Variability in Program Evaluation. Demonstration presented at the American Evaluation Association, Atlanta, GA, October, 2016.


Lesser, V.M., & L. Newton. 2007a. Comparison of Delivery Methods in a Survey Distributed by Internet, Mail, and Telephone. Proceedings of the International Statistics Institute Meetings. Lisbon, Portugal, August.


Lesser, V.M., & L. Newton. 2007b. Effects of Mail Questionnaire Formats on Answers to Open-Ended Questions. Unpublished paper presented at annual meetings of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.


Lesser, V.M., K. Hunter-Zaworski, L. Newton and D. Yang. Using Multiple Survey Modes in a Study of Individuals with Disabilities. Presented at the American Association for Public Opinion Research, New Orleans, Louisiana, May, 2008a.


Lesser, V.M., L. Newton, and D. Yang. Evaluating Frames and Modes of Contact in a Study of Individuals with Disabilities. Presented at the American Statistical Association Meetings, Denver, Colorado, August, 2008b.


Lesser, V.M. and D. Yang. Alternatives to Phone Surveys: a study comparing Random Digit Dialing with Mail and Web using the Postal Delivery Sequence File. Presented at the American Association for Public Opinion Research, Hollywood, Florida, May, 2009.


Lesser, V.M., L. Newton, and D. Yang. Does Providing a Choice of Survey Modes Influence Response? Presented at the American Association for Public Opinion Research, Chicago, Illinois, May, 2010.


Lesser, V.M., L. Newton, and D. Yang. Evaluating Methodologies to Increase Internet Responses in Mixed-Mode Surveys. Proceedings of the International Statistics Institute Meetings, Dublin, Ireland, August, 2011a.


Lesser, V.M., A. Olstad, D. Yang, L. Newton. Comparing Item Nonresponse and Responses Across Modes in General Population Surveys. Presented at the American Association for Public Opinion Research, Phoenix, Arizona, May, 2011b.


Lesser, V. M., Newton, L. A., & Yang, D. 2012. Comparing item nonresponse across different delivery modes in general population surveys. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/comparing-item-nonresponse-across-different-delivery-modes-in-general-population-surveys-2/#more-6026.


Lesser, V. M., Newton, L. D., Yang, D. K., & Sifneos, J. C. 2016. Mixed-Mode Surveys Compared with Single Mode Surveys: Trends in Responses and Methods to Improve Completion. Journal of Rural Social Sciences, 31(3), 7-34.


Lesser, Virginia M., Nawrocki, Kerri, & Newton, Lydia. 2017a. Improving Response in Multimode and Single Mode Probability Based Surveys Compared to a Non-probability Survey. Presented at the annual meeting of the European Survey Research Association, Lisbon, Portugal, July, 2017.


Lesser, Virginia M., Nawrocki, Kerri, & Newton, Lydia. 2017b. Promises and Pitfalls: Experiences and Lessons Learned from Using Commercial Survey Services. Presented at the annual meeting of the Rural Sociological Society, Columbus, OH, July, 2017.


Lorenz, F., L. Hildreth, V.M. Lesser, & U. Genshel. 2010. General-specific questions in survey research: a confirmatory factor analysis approach. Presented at the Annual Meeting of the Rural Sociological Society, August.


Mahon-Haft, T. A., & Dillman, D. A. 2010. Does Visual Appeal Matter? Effects of Web Survey Aesthetics on Survey Quality. Survey Research Methods, 4(1), 43-59.


McKim, B. R., Specht, A. R., & Stewart, A. Y. (2015, June). Video ethnography: An approach to collecting, archiving, and sharing data and results. Proceedings of the 2015 NACTA Conference. Athens, GA.


McKim, B. R., Stewart, A. Y., & Bishop, D. M. (2015, August). An experiment testing variations of the home delivery survey method. Proceedings of the 2015 Annual Meeting of the Rural Sociological Society. Madison, WI.


McMaster, Hope Seib, Cynthia A. LeardMann, Steven Speigle, and Don A. Dillman. 2017. An Experimental Comparison of Web-Push vs. Paper-Only Survey Procedures for Conducting an In-Depth Health Survey of Military Spouses. BMC Medical Research Methodology, April 26, 9 pages. https://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-017-0337-1


Messer, B. L., & Dillman, D. A. 2011. Using address-based sampling to survey the general public by mail vs. Web plus mail. Public Opinion Quarterly, 75(3), 429-457. doi: 10.1093/poq/nfr021.


Messer, B. L., Edwards, M. L., & Dillman, D. A. 2012. Determinants of item nonresponse to web and mail respondents in three address-based mixed-mode surveys of the general public. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/determinants-of-item-nonresponse-to-web-and-mail-respondents-in-three-address-based-mixed-mode-surveys-of-the-general-public/#more-5998.


Messer, Benjamin L. 2012. Pushing households to the web: Experiments of a web+mail methodology for conducting general public surveys. Dissertation. Pullman, WA: Washington State University.


Millar, M. M., & Dillman, D. A. 2011. Improving response to Web and mixed-mode surveys. Public Opinion Quarterly, 75(2), 249-269. doi: 10.1093/poq/nfr003.


Millar, M. M., & Dillman, D. A. 2012. Do Mail and Internet Surveys Produce Different Item Nonresponse Rates? An Experiment Using Random Mode Assignment. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/do-mail-and-internet-surveys-produce-different-item-nonresponse-rates-an-experiment-using-random-mode-assignment/


Newberry, III, M. G., & Israel, G. D. 2017. Comparing Two Web/Mail Mixed-Mode Contact Protocols to a Unimode Mail Survey. Field Methods, 29(3), 281-298.


Redline, C.D., D.A. Dillman, A. Dajani, and M.A. Scaggs. 2003. Improving Navigational Performance in U.S. Census 2000 By Altering the Visual Languages of Branching Instructions. Journal of Official Statistics. 19(4):403-420.


Rookey, Brian, & Dillman, Don A. 2008. Do Web and Mail Respondents Give Different Answers in Panel Surveys. Unpublished paper prepared for Annual Conference of the American Association for Public Opinion Research. New Orleans, LA.


Singletary, L. and M. Smith. 2006. Nevada Agriculture Producer Research and Education Needs: Results of 2006 Statewide Needs Assessment. University of Nevada Cooperative Extension, EB-06-02. 118 pp.


Smyth, J.D., D.A. Dillman, L.M. Christian, & M.J. Stern. 2006a. Comparing Check-All and Forced-Choice Question Formats in Web Surveys. Public Opinion Quarterly. 70(1):66-77.


Smyth, J.D., D.A. Dillman, L.M. Christian, & M.J. Stern. 2006b. Effects of Using Visual Design Principles to Group Response Options in Web Surveys. International Journal of Internet Science. 1(1):5-15.


Smyth, J.D., & Dillman, D.A. 2007. Open-ended Questions in Mail, Web and Telephone Surveys. Unpublished paper presented at annual meeting of the American Statistical Association, Salt Lake City, Utah. August.


Smyth, J. D., Dillman, D. A., Christian, L. M., & O'Neill, A. C. 2010. Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. American Behavioral Scientist, 53(9):325-37.


Stern, Michael J., Ipek Bilgen and Don A. Dillman.  2014. The State of Survey Methodology: Challenges, Dilemmas and new Frontiers in the Era of the Tailored Design. Field Methods, (August) 26: 284-301.


Swinford, Stephen. 2007. How Answer Spaces Affect Answers to Open-Ended Questions in Mail Surveys; Results from Multiple Experiments. Unpublished paper presented at Annual Meeting of the American Statistical Association, Salt Lake City, Utah. August.


Toepoel, V., & Dillman, D. A. 2011. Words, Numbers, and Visual Heuristics in Web Surveys: Is There a Hierarchy of Importance? Social Science Computer Review, 29(2), 193-207.


Wilcox, A. S., Giuliano, W. M., & Israel, G. D. 2010. Response Rate, Nonresponse Error, and Item Nonresponse Effects When Using Financial Incentives in Wildlife Questionnaire Surveys. Human Dimensions of Wildlife, 15(4), 288-295.


Zickuhr, Kathryn, & Smith, Aaron. 2012. Digital differences. Pew Internet & American Life Project, Pew Research Center. Accessed June 4, 2012 at: http://pewinternet.org/~/media//Files/Reports/2012/PIP_Digital_differences_041312.pdf

Attachments

Land Grant Participating States/Institutions

FL, GA, IA, IL, MO, MT, NY, OR, PA, TX, UT, WA

Non Land Grant Participating States/Institutions

South Dakota State University, University of Idaho