WERA_OLD1010: Reduction of Error in Rural and Agricultural Surveys

(Multistate Research Coordinating Committee and Information Exchange Group)

Status: Inactive/Terminating

Duration: 10/01/2008 to 09/30/2013

Administrative Advisor(s):


NIFA Reps:


Non-Technical Summary

Statement of Issues and Justification

A quiet crisis has developed in the ability of agencies, sciences, and businesses to do high-quality surveys in the U.S., including those aimed at providing important information about rural and agricultural populations. In the last quarter of the 20th century, telephone surveys became the dominant way of doing the general public surveys needed for estimating behaviors such as employment rates, effects of rural development initiatives on entrepreneurial efforts, and consumption of food products. During this time, the telephone also became the dominant mode for surveying opinions, e.g., community and employment satisfaction, desire for new products and services, and satisfaction with rural and urban development activities. Face-to-face interviews continue to be used for the nation's most critical national surveys (e.g., the monthly Current Population Survey used to estimate employment rates and the USDA's annual Agricultural Resource Management Survey of farm operators), and mail surveys are used for surveys of list populations (e.g., participants in Extension programs whose names and addresses are available).

The telephone is now losing favor as a data collection method because of low response rates and the greater use of cell phones, which has resulted in fewer households having landlines. For a variety of reasons, cell phones are not generally accessible for the conduct of general public surveys. In response to this trend, many surveyors have turned to the web as an alternative data collection strategy. The difficulty with this response is that only about 70% of U.S. households have access to the web, and many who do have access seem unwilling to respond to web surveys. Those without web access have significantly less education and lower incomes, are less likely to be married, have less stable employment, and differ in other significant ways (Rookey and Dillman, 2008).

Many private sector firms have increasingly turned to web-only panels for data collection, whereby volunteers are recruited and surveyed repeatedly in order to lower the costs of doing surveys. Because panel members are self-selected volunteers rather than a probability sample, it is statistically inappropriate to generalize their results to any larger population, as can be done through scientific means for random sample surveys. The nature of this problem is summarized elsewhere (ESOMAR, 2006). Government agencies, as well as private sector organizations, are searching for viable alternatives to the telephone in order to maintain the accuracy of data sets while taking advantage of the lower data collection costs offered by web surveys.
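To make the selection problem concrete, the following minimal Python sketch (not from the proposal; all rates and sizes are hypothetical) simulates a population in which web access is correlated with the attitude being measured, then compares an estimate from a probability sample with one from a volunteer web panel:

```python
# Illustrative sketch: why a volunteer web panel cannot be generalized
# the way a random-sample survey can. All numbers are hypothetical.
import random

random.seed(42)

# Hypothetical population: 100,000 adults; 70% have web access, and web
# access is correlated with the attitude being measured.
population = []
for _ in range(100_000):
    has_web = random.random() < 0.70
    # Assume the "agree" rate is 60% among web users, 30% among non-users.
    agrees = random.random() < (0.60 if has_web else 0.30)
    population.append((has_web, agrees))

true_rate = sum(a for _, a in population) / len(population)

# A probability sample: every member has a known chance of selection.
srs = random.sample(population, 1_000)
srs_rate = sum(a for _, a in srs) / len(srs)

# A volunteer web panel: only web users can opt in, so the panel
# systematically excludes the 30% of the population without access.
web_users = [p for p in population if p[0]]
panel = random.sample(web_users, 1_000)
panel_rate = sum(a for _, a in panel) / len(panel)

print(f"True population rate:     {true_rate:.3f}")   # ~0.51
print(f"Random-sample estimate:   {srs_rate:.3f}")    # close to the truth
print(f"Volunteer-panel estimate: {panel_rate:.3f}")  # biased toward 0.60
```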

In addition to the above problems, rural America now faces a unique data crisis because of the loss of the long form from the decennial census, which was the primary source of information for rural communities and counties. In 2000, a minimum of 1 in 6 households in every location in the U.S. was required to complete the long census form (as opposed to the short form, which asks only for name, sex, race, ethnicity, marital status, and housing ownership) in order to estimate with precision the percent of households with different levels of income, education, commuting status, and many other characteristics important to the formation of rural policy. After 2000, the long form was replaced by the American Community Survey. This survey obtains "long-form" information from a few hundred thousand households each year. However, from a statistical standpoint, it is necessary to accumulate data over several years (up to five) in order to make reliable estimates for these demographics in smaller rural communities. This means that detailed demographic information for a particular year is no longer available, past 2000, for most rural counties, especially in the more sparsely populated West; the margin-of-error arithmetic sketched below shows why. This change in how demographic data are collected puts significant pressure on surveyors to develop methods that make it possible to gather information describing situations for a particular time or season (e.g., the poverty rate in a particular county for a given year) that will no longer be available from the U.S. Census.
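The sketch below illustrates the statistical point with the standard margin-of-error formula for an estimated proportion. The annual interview count for a small county is an assumption chosen for illustration, not an ACS figure:

```python
# A minimal sketch of why small counties need several years of accumulated
# interviews: the margin of error for an estimated proportion shrinks only
# with the square root of the sample size. Sample sizes are hypothetical.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

poverty_rate = 0.15  # assumed true proportion being estimated

# Suppose the survey yields roughly 120 completed interviews per year in a
# sparsely populated county (an assumption for illustration).
for years in (1, 3, 5):
    n = 120 * years
    moe = margin_of_error(poverty_rate, n)
    print(f"{years} year(s), n={n}: 15% +/- {moe:.1%}")
# 1 year:  +/- ~6.4 points -- too wide to track annual change
# 5 years: +/- ~2.9 points -- usable, but only as a multi-year average
```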

Since 2002, WERA 1001 and its predecessor, Regional Project W-183, have been studying ways of improving survey methodologies for rural and agricultural surveys, including a replacement for the telephone. Members of WERA, who are now requesting continuation of this coordinating committee, have for many years researched ways of improving mail, web, and telephone methods, including their use in mixed-mode surveys, whereby some people are surveyed via one mode (e.g., mail) and others by a second mode (e.g., web or telephone). For ten years preceding 2002, this work was carried on under Western Regional Project W-183.

In 2005, WERA 1001 members commenced work on what may be the most promising replacement for survey methods that are losing their effectiveness. The U.S. Postal Service now makes available a listing of postal delivery addresses, known as the Delivery Sequence File or DSF. WERA participants have been quite active in researching the possibilities this list provides as a replacement for telephone methods; a minimal sampling sketch follows this paragraph. Todd Rockwood, who joined WERA a year ago from Minnesota, and his colleagues have done fundamental investigation of whether use of the postal list can provide demographically representative samples. Virginia Lesser from Oregon, a long-time member of WERA and the W-183 project that preceded it, has investigated blending web and mail methods together to obtain improved response, in addition to comparing the quality of responses between the two modes (Lesser and Newton, 2007a). Building upon this work, Don Dillman, also a long-time member of WERA and W-183, is now researching new procedures for using the list to contact a representative general public population and encourage its members to respond by web as a means of reducing survey costs, while developing a procedure to replace the telephone methods advocated in his books (Mail and Telephone Surveys, 1978; Mail and Internet Surveys, 2000), which are now inadequate as a basis for resolving problems stemming from the telephone revolution of the 1990s.
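As a rough illustration of how the DSF can serve as a sampling frame, the sketch below draws a simple random sample from a hypothetical address file. The file name and column names are assumptions for illustration; the proposal does not specify a file format:

```python
# A hedged sketch of address-based sampling from a DSF-style list. The
# point is that the frame is a list of delivery addresses rather than
# randomly dialed telephone digits. File name and fields are hypothetical.
import csv
import random

def draw_address_sample(frame_path: str, n: int, seed: int = 1) -> list[dict]:
    """Draw a simple random sample of n addresses from the frame."""
    with open(frame_path, newline="") as f:
        frame = [row for row in csv.DictReader(f)
                 if row.get("address_type") == "residential"]  # assumed field
    random.seed(seed)
    return random.sample(frame, n)

# Usage (hypothetical file with columns: street, city, state, zip, address_type):
# sample = draw_address_sample("dsf_extract.csv", n=1200)
# Each sampled household could then be mailed an invitation to respond by
# web, with a paper questionnaire in a later follow-up for non-respondents.
```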

Other work of WERA 1001 has positioned this committee to be particularly effective in developing a replacement strategy for the telephone. Several committee members have been studying the effects of visual vs. aural communication on how people answer survey questions, making this committee a leader in such research. For example, numerous experiments on how different visual layouts affect answers have been carried out by Oregon, Florida, Montana, and Washington members of WERA 1001. The results of these experiments provide an essential foundation for developing effective procedures that combine mail and web approaches as a possible replacement for telephone survey methods.

WERA 1001 members, as well as members of its predecessor, W-183, have developed a long tradition of interacting intensively on a wide variety of specific survey design issues, such as response rate improvements (including work at Cornell, Pennsylvania, Iowa, and New Hampshire), replicating experiments and reporting the results to professional audiences in ways that go beyond the individual state work. For example, in August 2007, four members of WERA (Lesser from Oregon, Israel from Florida, Swinford from Montana, and Dillman from Washington) presented papers at the annual meeting of the American Statistical Association in a session devoted to developing more effective open-ended questions (Lesser and Newton, 2007b; Swinford, 2007; Smyth and Dillman, 2007; Israel, 2007). This work needs to be continued, because one of the most promising features of web surveying is the ability to improve the quality of open-ended questions, which were virtually lost for cost reasons in the telephone survey era. It builds upon previous work by WERA members Israel (2005) and Dillman (Christian and Dillman, 2004) that showed ways of improving the quality of answers to open-ended questions. We know of no other researchers in the U.S. who have done as much research aimed at improving the quality of open-ended questions as WERA has accomplished in the last two years.

Members of WERA and its W-183 predecessor have developed a procedure of working jointly on publications that report results from replicating and extending one another's studies. One such outcome is an article being published in December 2007 in the scientific journal Rural Sociology, which includes data from five WERA and W-183 states (New Hampshire, Idaho, Oregon, Pennsylvania, and Washington). It confirms the importance of personalization for improving response rates in general public surveys, but questions its effectiveness in group-specific surveys. This work is quite relevant to committee members' current efforts to find ways of improving the effectiveness of an initial mail contact in persuading sample members to complete a web survey. It is unlikely that these data would have been published had it not been for the replication of studies across states.

Renewal of WERA 1001 will allow current members, as well as new members, to continue, and to initiate, coordinated scientific work across states on the design and conduct of research to improve a variety of survey methods, including but not limited to effective alternatives to the telephone. The committee will also continue its work on topic-focused surveys of importance to rural areas, e.g., environmental and rural recreation surveys by Gentner (National Marine Fisheries Service), Brown and his colleagues (Cornell), Robertson (New Hampshire), and Mertig (Middle Tennessee State University). The addition of Allen (Utah State) also adds a community development perspective to our work that will allow testing of ideas in another setting important to rural people and places.

Objectives

  1. Continue the interaction and collaboration of researchers and extension faculty using sample survey methods, in order to better understand how to assure the conduct of quality sample surveys at a time when use of the telephone is losing its effectiveness for government, private sector, and university surveys.
  2. Explore intensively the use of the postal Delivery Sequence File as a sample source and potential replacement for random digit dialing by telephone, which appears to have lost its effectiveness in obtaining representative samples of U.S. residents.
  3. Encourage and facilitate the joint writing and publication of research results by members of the Coordinating Committee.
  4. Disseminate research findings through teaching and seminars, including Extension in-service training by members of the Coordinating Committee, to reach survey developers and consumers in the land grant system. In addition to the methodological insights generated by the multi-state research and its publications, survey results are usually provided to participants, local officials, and other interested parties at the community level to inform local decision making.

Procedures and Activities

WERA 1001 meets annually for researchers to share their activities and plans for the coming year. All participants are encouraged to present tentative plans for their future studies in order to obtain advice and comment from other participants. One typical outcome of these discussions is to encourage other participants to develop a new test, or to repeat a test conducted by colleagues on the committee, in their own survey work. Previous work under WERA and its W-183 predecessor has shown that members are particularly effective in developing new tests because of their roles in experiment station, extension, and other work settings in helping design surveys. WERA members are often consulted by other Agricultural Experiment Station and Extension employees for improving their survey designs. Opportunities come up each year to do experiments by convincing these professionals that inclusion of an experiment will help them design a better survey. The joint work now underway on visual design layouts, open-ended questions, and use of the postal Delivery Sequence File exemplifies how committee members actively influence others to conduct new experiments not previously planned.

Because survey methodology is changing so rapidly with the decline of the telephone and the desire to be cost-effective through use of the web, it is difficult to anticipate the exact experiments that members of the committee will conduct during the life of the committee. The typical time lag between the development of a new idea and getting it tested in this area of research is several months to a year. Committee members report at the annual meeting, typically held in March of each year. When the results of a new idea tested during the year appear promising, another committee member will find a way to provide a further test (and sometimes an exact replication) the same year. Thus, the committee is quite dynamic in its operation. We expect this philosophy of operation to continue under renewal of the coordinating committee.

Expected Outcomes and Impacts

Throughout the existence of WERA and W-183, committee members have focused on both the creation and application of new knowledge, and this focus will continue if the committee's work is renewed. The following outcomes and impacts are expected from the work of this committee:

  • Introduce members to innovative ideas for improving survey quality being tested by individual members.
  • Critique proposed survey designs and instruments at least annually, and through follow-up among individuals, in order to improve one another's experiments.
  • Coordinate proposed experimental designs and the dissemination of results across states and agencies.
  • Facilitate when appropriate the joint write-up and publication of research results.
  • Update best practices for conducting sample surveys of the general public (especially those in rural areas) which use appropriate technologies.
  • Increase capacity of units in the land grant system for conducting surveys that yield scientifically accurate data for use in assessing needs of client groups, evaluating the effectiveness of extension programs, and understanding issues facing agriculture and Rural America.

Projected Participation

View Appendix E: Participation

Educational Plan

Several members of the committee have Extension appointments. Singletary and Smith in Nevada have provided workshops to other extension faculty on how to use web survey technology and have published survey results on Nevada farms that build on the work of other committee members. Methods developed by committee members have been embedded in the extension evaluation surveys conducted annually in Florida by Israel. Lesser has conducted evaluation surveys for extension audiences (e.g., Master Gardeners in Oregon) and collaborates with many extension faculty each year to conduct surveys. Extension faculty in Oregon have also benefited from a presentation provided to them on how to do a survey.

Work done under WERA 1001 has also contributed to a complete redesign of the questionnaire used in the annual Agricultural Resource Management Survey of 30,000 farmers conducted by the USDA, enabling this federal agency to begin implementation by mail and to reserve the far more costly personal enumeration for late in the data collection process (e.g., Ott and Beckler, 2007; Dillman, Gertseva and Mahon-Haft, 2005). Several workshops taught each year in Washington, D.C. and other locales by a WERA 1001 member have brought WERA 1001's work to audiences of private sector, federal agency, and university surveyors. These short courses will continue with a renewed emphasis on private sector organizations because of the decline in survey quality that has occurred as firms have shifted to inadequate web-only surveys.

Members of the committee regularly publish their work in refereed journals, books, and other publications. WERA 1001 is the only committee within the agricultural experiment station system, as far as we are aware, that is focused on improving survey methods. This relatively small but important group is looked to for national leadership in making its research available to diverse audiences. Much of the value of our work is that members come from a variety of work settings in addition to the Agricultural Experiment Stations, giving the committee national representation in its membership. This small group (typically 12-15 attendees at each year's meeting) and its members' commitment to the common interest of advancing survey methodology have resulted in particularly effective interaction on joint research efforts that continues between our annual meetings.

The issues at stake in the committee's work are large. Because we are in danger of losing one of our most used survey methods, telephone surveying, as people's communication patterns shift in another direction, it is critical that we find new survey methods that allow us to measure important behaviors and attitudes of rural and agricultural audiences. The loss of our main source of point-in-time demographic information on rural places and people, the Census long form, and its replacement by procedures that provide estimates of critical demographic information for rural towns and counties only over a multi-year period, give an urgency to the committee's work that we have not previously experienced. Because of this urgency, we expect that the committee will discuss innovative new ways of reaching audiences more effectively, which will require close collaboration between the extension and research members of the committee.

Organization/Governance

A chair and secretary will be elected annually. The chair will be responsible for developing an agenda for the annual meeting, and facilitating communication among participants throughout the year. The secretary will be responsible for taking minutes and mailing them to the Administrative Advisor and members.

Literature Cited

(References include those cited above plus recent additional work by participants, completed under WERA-1001, that provides selective background for the proposed coordinating committee activities.)

Beebe, T.J., M.E. Davern, D.D. McAlpine, K.T. Call, and T.H. Rockwood. 2005. Increasing Response Rates in a Survey of Medicaid Enrollees: The Effect of a Prepaid Monetary Incentive and Mixed Modes (Mail and Telephone). Medical Care. 43(4):411-414.



Burdge, R.J. and R.A. Robertson. 2006. Social Impact Assessment and the Public Involvement Process. In A Conceptual Approach to Social Impact Assessment. R.J. Burdge (ed.), Middletown, WI: Social Ecology Press. pp. 177-187.



Christian, L. and D.A. Dillman. 2004. The Influence of Symbolic and Graphical Language Manipulations on Answers to Paper Self-Administered Questionnaires. Public Opinion Quarterly. 68(1):57-80.



Connelly, N.A., T.L. Brown, and D.J. Decker. 2003. Factors Affecting Response Rates to Natural-Resource-Focused Mail Surveys: Empirical Evidence of Declining Rates Over Time. Society and Natural Resources. 16:541-549.



Cui, M., F.O. Lorenz, R.D. Conger, J.N. Melby, and C.M. Bryant. 2005. Observer, Self and Partner Reports of Hostile Behaviors in Romantic Relationships. Journal of Marriage and Family. 67:1169-1181.



Dillman, D.A., V. Lesser, R. Mason, J. Carlson, F. Willits, R. Robertson, and B. Burke. 2007. Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies. Rural Sociology. 72(4):632-646.



Dillman, D.A. 2007. Mail and Internet Surveys: The Tailored Design Method, Second Edition, 2007 Update. New York: John Wiley. 523 pp. ISBN: 0-470-03856-x.



Dillman, D.A. 2006. Why Choice of Survey Mode Makes a Difference. Public Health Reports. 121(1):11-13.



Dillman, D.A., A. Gertseva, and T. Mahon-Haft. 2005. Achieving Usability in Establishment Surveys Through the Application of Visual Design Principles. Journal of Official Statistics. 21(2):183-214.



Dillman, D.A. and L.M. Christian. 2005. Survey Mode as a Source of Instability Across Surveys. Field Methods. 17(1):30-52.



ESOMAR. 2006. Panel Research 2006. Proceedings of the European Society for Opinion and Marketing Research, Barcelona, Spain. November 26-28, 2006.

Gessert, C.E., K. Hyer, R.L. Kane, T. Rockwood, A.B. Brassard, K. Desjardins, and R.A. Kane. 2005. Cognitive Impairment and Quality-of-Life: Views of Providers of Long-Term Care Services. Alzheimer Disease & Associated Disorders. 19(2):85-90.



Hartley, T.W. and R.A. Robertson. 2006. Emergence of Multi-Stakeholder Driven Cooperative Research in the Northwest Atlantic: The Case of the Northeast Consortium. Marine Policy. 30(5):580-592.



Hartley, T.W. and R.A. Robertson. 2006. Stakeholder Engagement, Cooperative Fisheries Research, and Democratic Science: The Case of the Northeast Consortium. Human Ecology Review. 13(2):161-171.



Heleski, C.R., A.G. Mertig, and A.J. Zanella. 2006. Stakeholder Attitudes Toward Farm Animal Welfare. Anthrozoos. 19(4):290-307.



Israel, G. 2007. Effects of Answer Space Size on Responses to Open-Ended Questions in Mail Surveys. Unpublished paper presented at the annual conference of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.



Kane, R.L., B. Bershadsky, T. Rockwood, K. Saleh, and N.C. Islam 2005. Visual Analog Scale Pain Reporting Was Standardized. Journal of Clinical Epidemiology. 58(6):618-23.



Kane, R.L., T. Rockwood, K. Hyer, K. Desjardins, A. Brassard, C. Gessert, R. Kane, and C. Mueller. 2006. Nursing Home Staff's Perceived Ability to Influence Quality of Life. Journal of Nursing Care Quality. 21(3):248-55.



Kane, R.L., T. Rockwood, K. Hyer, K. Desjardins, A. Brassard, C. Gessert, and R. Kane. 2005. Rating the Importance of Nursing Home Residents' Quality of Life. Journal of the American Geriatrics Society. 53(12):2076-82.



Kralewski, J., B.E. Dowd, A. Kaissi, A. Curoe, and T. Rockwood. 2005. Measuring the Culture of Medical Group Practices. Health Care Management Review. 30(3):184-93.



Lesser, V.M. and L. Newton. 2007a. Comparison of Delivery Methods in a Survey Distributed by Internet, Mail, and Telephone. Proceedings of the International Statistics Institute Meetings. Lisbon, Portugal.



Lesser, V.M. and L. Newton. 2007b. Effects of Mail Questionnaire Formats on Answers to Open-Ended Questions. Unpublished paper presented at the annual meetings of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.



Lorenz, F.O., K.A.S. Wickrama, and H. Yeh. 2004. Rural Mental Health: Comparing Differences and Modeling Change. In Critical Issues in Rural Health. N. Glasgow, L.W. Morton, and N. Johnson (eds.), Ames, IA: ISU/Blackwell Press. Chapter 7, pp. 75-88.



Lorenz, F.O., K.A.S. Wickrama, and R.D. Conger. 2004. Modeling Continuity and Change in Family Relationships With Panel Data. In Continuity and Change in Family Relationships: Theory, Methods, and Empirical Findings. R.D. Conger, F. O. Lorenz & K.A.S. Wickrama (eds.). Mahwah, NJ: Lawrence Erlbaum. Chapter 2:15-62.



Lorenz, F.O., K.A.S. Wickrama, R.D. Conger, and G.H. Elder, Jr. 2006. The Short Term and Decade Long Effects of Divorce on Women's Midlife Health. Journal of Health and Social Behavior. 47:111-125.



Mason, R. and S. Amer. 2006. A Dual Process that Disables the Persuasive Impact of Mass Media Appeals to Obey Tax Laws. In Law and Psychology. Belinda Brooks-Gordon and Michael Freeman (eds.). New York: Oxford University Press.



Massey, M., S. Newbold, and B. Gentner. 2006. Valuing Water Quality Changes Using a Bioeconomic Model of a Coastal Recreational Fishery. Journal of Environmental Economics and Management. 52(1):482-500.



Munoz-Hernandez, B. and V.M. Lesser. 2005. Adjustment Procedures to Account for Non-Ignorable Missing Data in Environmental Surveys. Environmetrics, 16:1-10.



Ott, K. and D. Beckler. 2007. Incentives in Surveys with Farmers. Paper presented at ICES-III, the Third International Conference on Establishment Surveys. Montreal. June 18, 2007.



Peterson, M.N., A.G. Mertig, and J. Liu. 2006. Effects of Zoonotic Disease Attributes on Public Attitudes Toward Wildlife Management. Journal of Wildlife Management. 70(6):1746-1753.



Redline, C.D., D.A. Dillman, A. Dajani, and M.A. Scaggs. 2003. Improving Navigational Performance in U.S. Census 2000 By Altering the Visual Languages of Branching Instructions. Journal of Official Statistics. 19(4):403-420.



Rockwood, T. and M. Constantine. 2005. Demographic and Psychosocial Factors. In Understanding Health Care Outcomes Research. R.L. Kane (ed.). Gaithersburg, MD: Aspen.



Rookey, B. and D.A. Dillman. 2008. Do Web and Mail Respondents Give Different Answers in Panel Surveys? Unpublished paper prepared for the annual conference of the American Association for Public Opinion Research. New Orleans, LA.



Singletary L. and M. Smith. 2006. Nevada Agriculture Producer Research and Education Needs: Results of 2006 Statewide Needs Assessment. University of Nevada Cooperative Extension, EB-06-02. pp. 118.



Singletary, L., M. Smith, and W. Evans. 2006. Self-Perceived 4-H Leader Competencies and Their Relation to the Skills Youth Learn Through 4-H Youth Development Programs. Journal of Extension. 44(4): Article # 4RIB2.



Smyth, J.D., D.A. Dillman, L.M. Christian, and M.J. Stern. 2006. Comparing Check-All and Forced-Choice Question Formats in Web Surveys. Public Opinion Quarterly. 70(1):66-77.



Smyth, J.D., D.A. Dillman, L.M. Christian, and M.J. Stern. 2006. Effects of Using Visual Design Principles to Group Response Options in Web Surveys. International Journal of Internet Science. 1(1):5-15.



Smyth, J.D. and D.A. Dillman. 2007. Open-Ended Questions in Mail, Web and Telephone Surveys. Unpublished paper presented at the annual meeting of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.



Swinford, S. 2007. How Answer Spaces Affect Answers to Open-Ended Questions in Mail Surveys: Results from Multiple Experiments. Unpublished paper presented at the annual meeting of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.



Stern, M.J. and D.A. Dillman. 2006. Community Participation, Social Ties and Use of the Internet. City and Community. 5(4):409-424.



Wickrama, K.A.S., F.O. Lorenz, R.D. Conger, and G.H. Elder, Jr. 2006. Changes in Family Circumstances and the Physical Health of Married and Recently Divorced Mothers. Social Science and Medicine. 63:123-136.



Yeh, H., F.O. Lorenz, K.A.S. Wickrama, R.D. Conger, and G.H. Elder, Jr. 2006. Relationships Between Sexual Satisfaction, Marital Satisfaction and Marital Instability at Midlife. Journal of Family Psychology. 20:339-343.


Attachments

Land Grant Participating States/Institutions

AZ, FL, IA, IL, MD, MT, NH, NV, NY, OH, OR, TX, UT, WA

Non Land Grant Participating States/Institutions

Middle Tennessee State University, National Marine Fisheries Service/Recreational Survey Dept.