WERA1010: Improving Data Quality from Sample Surveys to foster Agricultural, Community and Development in Rural America

(Multistate Research Coordinating Committee and Information Exchange Group)

Status: Inactive/Terminating

SAES-422 Reports

Annual/Termination Reports:

[01/29/2017] [01/29/2017] [01/23/2017] [10/30/2017] [04/13/2018]

Date of Annual Report: 01/29/2017

Report Information

Annual Meeting Dates: 02/27/2014 - 02/28/2014
Period the Report Covers: 10/01/2013 - 09/30/2014

Participants

Glenn Israel, Chair (Florida) gdisrael@ufl.edu;
Lou Swanson, Administrative advisor (Colorado State) Louis.Swanson@ColoState.edu;
Virginia Lesser, (Oregon) lesser@stat.orst.edu;
Gerard Kyle, Texas A & M, gkyle@tamu.edu;
Billy McKim (Texas A & M) brmckim@tamu.edu;
Bo/David Willford (Texas A & M) bodavid@hotmail.com;
Victoria Pilger, Texas A & M victoriaprlger@tamu.edu;
Mallory Mobly, Texas A & M mallerymobly1992@tamu.edu;
Karin Farias, Texas A & M kariz-frs@neo.tamu.edu;
Rob Robertson (New Hampshire) rob.robertson@unh.edu;
Don Dillman (Washington State) dillman@wsu.edu;
Fred Lorenz (Secretary: Iowa State) folorenz@iastate.edu;
Todd Rockwood (Minnesota) rockw001@umn.edu;
Steve Swinford (Montana State) swinford@montana.edu;
Courtney Flint (Utah State) Courtney.Flint@usu.edu

Brief Summary of Minutes

Accomplishments

See meeting minutes.

Publications

Ackerman, R. A., Kashy, D. A., Donnellan, M. B., Neppl, T., Lorenz, F. O., & Conger, R. D. (2013). "The interpersonal legacy of a positive family climate in adolescence." Psychological Science, 20(2), 243-250 (Tier 1).

Dillman, D. A. and House, C. C., Editors. (2013). Measuring What We Spend: Toward a New Consumer Expenditure Survey. National Research Council Panel on Redesigning the BLS Consumer Expenditure Surveys. Washington, D.C.: The National Academies Press.

Hendee, J. T. and C. G. Flint. 2013. Managing Private Forestlands along the Public-Private Interface of Southern Illinois: Landowner Forestry Decisions in a Multi-Jurisdictional Landscape. Journal of Forest Economics and Policy 34: 47-55.

Hildreth*, L. A., Genschel, U., Lorenz, F. O., & Lesser, V. (2013). A permutation test for a first order response structure in surveys. Structural Equation Modeling, 20, 226-240 (Tier 1).

Israel, G. D. 2013. Combining Mail and E-mail Contacts to Facilitate Participation in Mixed-Mode Surveys. Social Science Computer Review, 31(3), 346-358. Published online November 28, 2012. doi: 10.1177/0894439312464942. At: http://ssc.sagepub.com/content/early/2012/11/26/0894439312464942

Israel, G. D. 2013. Using Mixed-mode Contacts in Client Surveys: Getting More Bang for Your Buck. Journal of Extension, 51(3), article 3FEA1. At: http://www.joe.org/joe/2013june/a1.php

Klabunde, Carrie N., Willis, Gordon B., McLeod, Caroline C., Dillman, Don A., Johnson, Timothy P., Greene, Sarah M., and Brown, Martin L. (2013). Improving the Quality of Surveys of Physicians and Medical Groups: A Research Agenda. Evaluation & the Health Professions, 35(4), 477-506.

*Lannin, D. G., Bittner, K. E., and Lorenz, F. O. (2013). Longitudinal effect of defensive denial on relationship instability. Journal of Family Psychology, 27, 968-977.

Leggette, H. R., McKim, B. R., & Dunsford, D. (2013). A case study of using electronic self-assessment rubrics in a core curriculum writing course. NACTA Journal, 57(2), 2-10.

Masarik, A. S., Conger, R. D., & Lorenz, F. O. (2013). Romantic relationships in early adulthood: Influences of family, personality and relationship cognitions. Personal Relationships, 30, 356-373 (Tier 2).

McKim, B. R., Latham, L., Treptow, E., & Rayfield, J. (2013). A repeated measures study of the short-term influences of high-impact practices on college students' learning styles. NACTA Journal, 57(3a), 122-128.

McKim, B. R., Lawver, R. G., Enns, K., Smith, A. R., & Aschenbrener, M. S. (2013). Developing metrics for effective teaching in extension education: A multi-state factor-analytic and psychometric analysis of effective teaching. Journal of Agricultural Education, 54(2), 143-158. doi: 10.5032/jae.2013.02143

McKim, B. R., Rayfield, J. S., Harlin, J., & Adams, A. (2013). Stress levels of agricultural science cooperating teachers and student teachers: A longitudinal analysis. Career and Technical Education Research, 38(1), 3-17. doi: 10.5328/cter38.1.3

McKim, B. R., & Saucier, P. R. (2013). A 20-year comparison of teachers' self-efficacy of agricultural mechanics laboratory management. Journal of Agricultural Education, 54(1), 153-166. doi: 10.5032/jae.2013.01153

Moore, L., McKim, B. R., & Bruce, J. (2013). Organizational climate of the Association of Leadership Educators. Journal of Leadership Education, 12(2), 88-102.

*Surjadi, F. F., Lorenz, F. O., Conger, R. D., and Wickrama, K. A. S. (2013). Inconsistent parental discipline and relationship quality in young adults: Mediating processes of behavioral problems and attitudinal ambivalence. Journal of Family Psychology, 27, 762-772 (Tier 1).

Stock, M. L., Gibbons, F. X., Gerrard, M., Houlihan, A. E., Weng, C-Y., Lorenz, F. O., & Simons, R. L. (2013). Racial identification, racial composition, and substance use vulnerability among African American adolescents and young adults. Health Psychology, 32, 237-247 (Tier 1).

Wickrama, K. A. S., O'Neal, C., and Lorenz, F. O. (2013). Marital functioning from middle to later years: A life course - stress process framework. Journal of Family Theory & Review, 5, 15-34.

Impact Statements

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Date of Annual Report: 01/29/2017

Report Information

Annual Meeting Dates: 02/19/2015 - 02/20/2015
Period the Report Covers: 10/01/2014 - 09/30/2015

Participants

Glenn Israel, Chair (Florida) gdisrael@ufl.edu;
Lou Swanson, Director of Extension (Colorado State) Louis.Swanson@ColoState.edu;
Billy McKim (Texas A & M) brmckim@tamu.edu;
Don Dillman (Washington State) dillman@wsu.edu;
Fred Lorenz (Secretary: Iowa State) folorenz@iastate.edu;
Todd Rockwood (Minnesota) rockw001@umn.edu;
Steve Swinford (Montana State) swinford@montana.edu;
Zhu, Zhengyuan (Iowa State) zhuz@iastate.edu;
Pina, Manuel (Texas A & M)m-pin@tamu.edu;
Hau Qin (U Missouri) qinh@missouri.edu;
Jackie Hill (Texas A & M)jackie.hill@ag.tamu.edu;
Mobly, Mallory (Texas A & M) mallory.mobly@ag.tamu.edu;
Bishop, Danielle (Texas A & M) bishop.danielle12@gmail.com;
Stewart, Ashley (Texas A & M) ashley.stewart@ag.tamu.edu;
Newberry Milton III (Florida)miltonius3@ufl.edu;
Constantine, Melissa (Minnesota) cons0026@umn.edu

Brief Summary of Minutes

Accomplishments

See meeting minutes.

Publications

Dillman, Don A., Jolene D. Smyth, and Leah Christian. 2014. Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method, 4th edition. Hoboken, NJ: John Wiley.

Stern, Michael J., Ipek Bilgen, and Don A. Dillman. 2014. The State of Survey Methodology: Challenges, Dilemmas, and New Frontiers in the Era of the Tailored Design. Field Methods 26 (August): 284-301.

Edwards, Michelle L., Don A. Dillman, and Jolene D. Smyth. 2014. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly 78(3): 734-750.

Qin, H., C. G. Flint, and A. E. Luloff. 2015. Tracing temporal changes in the human dimensions of forest insect disturbance on the Kenai Peninsula, Alaska. Human Ecology. doi:10.1007/s10745-014-9717-x.

David, M. B., C. G. Flint, L. E. Gentry, M. K. Dolan, G. F. Czapar, R. A. Cooke, and T. Lavaire. 2014. Navigating the socio-bio-geo-chemistry and engineering of nitrogen management in two Illinois tile-drained watersheds. Journal of Environmental Quality. doi:10.2134/jeq2014.01.0036.

DeDecker, J., J. Masiunas, A. S. Davis, and C. G. Flint. 2014. Weed management practice selection among Midwest U.S. organic growers. Weed Science 62: 520-531.

Celio, E., C. G. Flint, P. Schoch, and A. Grêt-Regamey. 2014. Farmers' perception of their decision-making in relation to policy schemes: A comparison of case studies from Switzerland and the United States. Land Use Policy 41: 163-171.

Hendee, J. and C. G. Flint. 2014. Incorporating cultural ecosystem services into forest management strategies for private landowners: An Illinois case study. Forest Science. http://dx.doi.org/10.5849/forsci.13-710.

Maurer, K. M., Stewart, T. W., and Lorenz, F. O. 2014. Direct and indirect effects of fish on invertebrates and tiger salamanders in prairie pothole wetlands. Wetlands 34(4): 735-745.

Qin, H., L. Davis*, M. Mayernik, P. Romero Lankao, J. D'Ignazio, and P. Alston. 2014. Variables as currency: Linking meta-analysis research and data paths in sciences. Data Science Journal 13: 158-171.

Romero-Lankao, P., S. Hughes, H. Qin, J. Hardoy, A. Rosas-Huerta, R. Borquez, and A. Lampis. 2014. Scale, urban risk and adaptation capacity in neighborhoods of Latin American cities. Habitat International 42: 224-235.

Zhou, S., N. Klaer, R. Daley, Z. Zhu, M. Fuller, and A. Smith. 2014. A cross-sampling method for estimating abundance and detectability for aggregated population with varying local abundance. ICES Journal of Marine Science 71(9): 2436-2447.

Impact Statements

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Date of Annual Report: 01/23/2017

Report Information

Annual Meeting Dates: 02/18/2016 - 02/19/2016
Period the Report Covers: 10/01/2015 - 09/30/2016

Participants

Swinford, Steve (Swinford@montana.edu) – Montana State; Lesser, Virginia (lesser@science.oregonstate.edu) – Oregon State; Willits, Fern (FKW@PSU.edu) - Penn State; Qin, Hua (qinh@missouri.edu) – Missouri; Kyle, Gerald (gkyle@tamu.edu) – Texas A&M; van Riper, Carena (cvanripe@illinois.edu) - Illinois; McKibben, Jason – West Virginia University; Yopp, Ashley – Texas A&M; McKim, Billy (brmckim@tamu.edu) – Texas A&M; Israel, Glenn (gdisrael@ufl.edu) – University of Florida; Dillman, Don (DIllman@wsu.edu) – Washington State

Brief Summary of Minutes

The committee began the meeting by discussing the impact of smartphones on survey procedures and data quality. Glenn Israel reported that Qualtrics collects metadata on the platform used by respondents; he found that 20% of respondents used a smartphone and 10% used a tablet to access the 2015 Florida Cooperative Extension customer satisfaction survey on the web. The discussion focused on concerns about error in multi-mode data sets. Additional questions included, “Where are respondents when they are using a mobile device?” and, “Are they focused on the task of completing the instrument or distracted by other things?” Some clues might be found by looking at dropout rates by age, along with data on speed of response. Obtaining responses from young people has always been difficult, and innovations to address this were discussed, including whether there is an opportunity to adjust the framing of surveys to increase buy-in. Topics mentioned included gamification, perceptions of legitimacy, and providing feedback on responses. A similar concern arises when pictures are used on questionnaires. The committee also discussed the problem of people sharing their access ID to an online questionnaire with others, so that one respondent turns into many, particularly when the topic is “politically charged” and ballot-box stuffing occurs.


 


Glenn Israel (Florida) reported on two studies. The first examined how clarifying instructions can improve item response quality for numerical open-ended questions. Two numerical open-ended questions asked respondents about the number of times they had contacted FCES in the past 12 months and the number of years they had used Extension services. The addition of clarifying instructions significantly reduced the percentage of missing and incorrectly formatted responses for both questions. The second study explored the interaction of stem and response order effects on satisfaction rating questions. Israel suggested that respondent heuristics influence the communication: respondents assume that positive response categories will begin at the left for horizontal scales and at the top for vertical scales. The experimental factors were the order of the response categories and the order of the options in the question stem, in a 2-by-2 design. Israel reported clear evidence of response order effects consistent with theories of satisficing and respondent heuristics, but inconclusive evidence of stem order moderating these effects.


 


Hua Qin (Missouri) studied the applicability of using partially correlated longitudinal data to examine community change. Community surveys have been widely used to investigate local residents’ perceptions and behaviors related to natural resource issues, including residents’ attitudes about rapid growth induced by energy development or amenity migration, public perspectives on wildfire and fuel management, and community risk perception and response to forest insect disturbance. Although community can be conceived as a dynamic process of interaction and collective action, most existing community survey research relies on cross-sectional data and is thus unable to capture the temporal dynamics of community processes. Longitudinal analysis has received increasing interest in the recent natural resource social science literature. Trend and panel studies are two typical approaches in longitudinal community survey research. Due to limited sampling frames, research design, and respondent attrition, longitudinal community surveys often involve both matched (paired) and uncorrelated (independent) observations across different waves. Using previous re-survey data on community response to forest insect disturbance in Alaska as an example, this research note shows that the corrected z-test is a more appropriate approach for comparing partially correlated samples than conventional statistical techniques such as the paired and independent t-tests.
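To illustrate the corrected z-test for partially correlated (partially paired) samples, the sketch below implements one common formulation, following Looney and Jones (2003); the function and variable names are illustrative assumptions, not necessarily the exact procedure used in the Alaska re-survey analysis.

```python
import numpy as np
from scipy import stats

def corrected_z(x_paired, y_paired, x_only, y_only):
    """Corrected z-test for partially correlated samples
    (one common formulation, after Looney & Jones, 2003).

    x_paired, y_paired : responses from the respondents observed in both waves
    x_only             : wave-1 responses from respondents not re-surveyed
    y_only             : wave-2 responses from new (unmatched) respondents
    """
    x_paired = np.asarray(x_paired, float)
    y_paired = np.asarray(y_paired, float)
    x_all = np.concatenate([x_paired, np.asarray(x_only, float)])
    y_all = np.concatenate([y_paired, np.asarray(y_only, float)])

    n = len(x_paired)                    # number of matched pairs
    n1, n2 = len(x_all), len(y_all)      # total observations in each wave
    s1, s2 = x_all.var(ddof=1), y_all.var(ddof=1)
    s12 = np.cov(x_paired, y_paired, ddof=1)[0, 1]  # covariance of the pairs

    se = np.sqrt(s1 / n1 + s2 / n2 - 2 * n * s12 / (n1 * n2))
    z = (x_all.mean() - y_all.mean()) / se
    p = 2 * stats.norm.sf(abs(z))        # two-sided p-value
    return z, p
```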


 


Ginny Lesser (Oregon) reported on ODOT multi-mode studies conducted during 2006-2014, covering unit response rates and the percentage of respondents moving to the web, with several sub-studies of design components. A fourth contact in 2012 and 2014 reminded respondents of the web response option, and a 2010 statement that responding by web saves the state money generated considerable discussion of its effects. Comparisons with the American Community Survey showed differences in the percentage of males, income (higher in the web group), and education (higher among web respondents). Lesser also reported on response rates for a monthly customer satisfaction survey, which have been declining since 1994, although changes did occur to the instrument and design over that period. Finally, a 2014 Off-Highway Vehicle Survey had a mail response rate of 31% and a web response rate of 29%. Overall, 56% of total respondents responded by web, with 17% of those using a tablet and 8% a smartphone.


 


Don Dillman (Washington) discussed his work on address-based sampling to request responses over the web from household samples. He sees these methods expanding because the telephone is in rapid decline as a survey mode, the result of it no longer fitting with the culture: people don’t answer phone calls from unknown numbers and parties, and if they do it once, they won’t do it again for follow-ups. In addition, the only good household frame now available for general public surveys is postal service address-based samples. He has just completed his contribution to an AAPOR task force on evaluating the use of U.S. Postal Service address-based samples. That report is now being published, and a follow-up journal manuscript is ready for submission to the Journal of Survey Statistics and Methodology. He is also making a major effort to help improve the communications used to request survey responses in the U.S. Decennial Census as well as the American Community Survey. In October 2015, he discussed current issues and challenges at a National Academies of Sciences seminar on the roll-out of 2020 Decennial Census methods, and he will make a similar presentation on those procedures to a National Academies committee in May 2016.


 


Steve Swinford (Montana) continued work on transportation research as well as applying cultural models to adolescent behavior and child abuse prevention studies. A cannabis study will investigate the perceptions of users and non-users about impairment to drive a motor vehicle after use. A Utah seatbelt safety study will collect local information to inform policy changes. He also conducted two small-scale community needs assessment surveys for Dillon and Valier (Montana cities).


 


Gerald Kyle (Texas) discussed four treatments involving normative appeals in cover letters for a study he will be conducting with Texas boaters as the target audience. The discussion addressed reordering and whether the treatment gets lost in the format. Committee members made suggestions on wording.


 


Billy McKim (Texas) studied heuristic matrix effects with Texas Extension agents. This study measured perceived ability on a task and importance of the task to the job as two dimensions. Having the responses side by side (as opposed to on separate pages) made a difference: respondents used their first answer to answer the second factor. McKim’s Rodeo Austin study was an economic impact and customer satisfaction survey. A total of 2,173 contacts were made and 1,473 usable completed questionnaires were obtained with an iPad intercept methodology, using the offline Qualtrics add-on function. An incentive, a token for concessions, was offered (the study received $10,000 in tokens for the survey). Those receiving the $8 and $6 incentives also spent more time answering questions. McKim also reported on a third study, Reaching the Public: Personas for Marketing Agricultural Organizations to Target Audiences. He discussed how one size does not fit all with incentives, and the same is true of messaging; a “persona” can be used in marketing to increase response. The topic in the study was animal treatment. The team created a number of statements reflecting the values of the organization and used the Q-sort method, in which the respondent places a set number of statements along a continuum to reflect values.

Accomplishments

An agenda was developed and the coordinating committee held its annual meeting in February 2016. Participating members discussed several important topics affecting error in agricultural and rural surveys, including the impact of smartphones on survey procedures and data quality; cultural, technological, and generational challenges and opportunities for survey engagement; and IRB-surveyor issues. In addition, members reported on survey research studies being conducted or planned in their states and provided feedback to others. Members from several states discussed plans for coordinated studies comparing nonprobability samples in online surveys with address-based probability samples using mail and mixed-mode surveys in order to assess the strengths and weaknesses of these technologies.

In addition, committee members have been active in publishing research in journal articles, presenting papers and posters at relevant conferences, and developing educational materials available to Extension professionals and the public during the past year. This includes 6 publications for Extension and outreach audiences and 24 presentations at professional conferences where attendees are members of the target audience for this project. The member from Florida conducted a 4-hour workshop on conducting online surveys for extension professionals, which incorporated research of the coordinating committee. The member from Washington conducted a workshop for approximately 15 individuals at the annual meeting of the Rural Sociological Society in July 2016.

Publications

1. Landon, A. C., van Riper, C. J., Angeli, N. F., Fitzgerald, D. B., & Neam, K. D. 2015. Growing transdisciplinary roots in the Peruvian Amazon: Lessons from the field. The Journal of Transdisciplinary Environmental Studies, 14(1), 2-12.

2. Qin, H. and T. F. Liao. 2015. The association between rural-urban migration flows and urban air quality in China. Regional Environmental Change (in press). doi:10.1007/s10113-015-0865-3

3. Qin, H., P. Romero-Lankao, J. Hardoy, and A. Rosas-Huerta. 2015. Household responses to climate-related hazards in four Latin American cities: A conceptual framework and exploratory analysis. Urban Climate 14(Part 1): 94-110.

4. Qin, H. 2015. Comparing newer and long-time residents' perceptions and actions in response to forest insect disturbance on Alaska's Kenai Peninsula: A longitudinal perspective. Journal of Rural Studies 39: 51-62.

5. Qin, H., C. G. Flint, and A. E. Luloff. 2015. Tracing temporal changes in the human dimensions of forest insect disturbance on the Kenai Peninsula, Alaska. Human Ecology 43(1): 43-59.

6. Wallen, K., Kyle, G., & van Riper, C. J. Carrying capacity and commercial services in the Southern Sierra Nevada. (Prepared for the U.S.D.A. Forest Service.) College Station, TX: Texas AgriLife Research.

7. Schuett, M. A., Kyle, G. T., Dudensing, R., Ding, C., van Riper, C., & Park, J. 2015. Attitudes, behavior, and management preferences of Texas artificial reef users. (Prepared for the Artificial Reef Program, Texas Parks and Wildlife Department.) College Station, TX: Texas AgriLife Research.

Impact Statements

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Date of Annual Report: 10/30/2017

Report Information

Annual Meeting Dates: 02/16/2017 - 02/17/2017
Period the Report Covers: 10/01/2016 - 09/30/2017

Participants

Steven Swinford (swinford@montana.edu), Montana State University; Ginny Lesser (lesser@science.oregonstate.edu), Oregon State University; Bill Stewart (wstewart@illinois.edu), University of Illinois; Carena van Riper (cvanripe@illinois.edu), University of Illinois; Hua Qin (qinh@missouri.edu), University of Missouri; Zhengyuan Zhu (zhuz@iastate.edu), Iowa State University; Melissa Constantine (cons0026@umn.edu), University of Minnesota; Todd Rockwood (rockw001@umn.edu), University of Minnesota; Fern Willits (fkw@psu.edu), Pennsylvania State University; Emily Perdue (emily.perdue@mail.wvu.edu), West Virginia University; Jason McKibben (Jason.mckibben@mail.wvu.edu), West Virginia University; Ashley Yopp (ayopp@tamu.edu), Texas A&M University; Billy McKim (brmckim@tamu.edu), Texas A&M University; Tobin Redwine (tredwine@tamu.edu), Texas A&M University; Gladys Walter (gladysw@tamu.edu), Texas A&M University; Glenn Israel (gdisrael@ufl.edu), University of Florida; Jessica Goldberger (jgoldberger@wsu.edu), Washington State University; Don Dillman (dillman@wsu.edu), Washington State University

Brief Summary of Minutes

Call to order at 8:20 a.m.


The meeting began with introductions and meeting logistics.


A brief history of the group was provided for new attendees, and Chair Glenn Israel mentioned that more details are included in the introduction to the special issue of the Journal of Rural Social Sciences. The group started as a methodology research project, W-183, in the rural social sciences. Joint publication of replicated experiments was the primary initial motivation behind organizing the group.


Israel added that the current WERA-1010 project runs through September 2018. An inquiry about renewal will be posed to Lou Swanson, Administrative Advisor.


Improving Communication with Potential Respondents


The committee began the substantive meeting by discussing theoretical approaches to improving communication with potential respondents. Don Dillman (Washington) discussed plans that were developed for beginning a new line of experimentation in 2017 for evaluating alternative communication designs for encouraging response to surveys.  This experiment is scheduled to be put into the field in March, 2017.  A detailed rationale for designing and implementing this study was presented along with draft materials to be used in the experiment, for review and critique by committee members.


Dillman presented ideas for improving response rates through better communications. First, he noted that RDD is no longer working: response rates are poor and coverage is problematic, with extra questions needed to assess coverage. Web-push surveys are the likely replacement. One can make an initial contact by postal mail (a good contact) and follow up with other modes/requests. In addition, web-push is now better at getting people to use the web (which cuts costs), but it doesn’t always work. The American Community Survey is now all web-push; Dillman cited several other examples of major national surveys using this method, which are also obtaining good response rates.


Dillman discussed what we know about what works and does not work in web-push. We don’t know how improved communications could increase the effectiveness of web-push; we need to understand how and when communication occurs. Communication extends beyond the letters and graphics, and not much work has been done on these elements. There was discussion of what makes a communication sequence effective and what detracts from it. There are many stages in this process, so there is no singular solution to the issue.


Dillman discussed a new approach to communication, noting that there are four aspects to communicate through: presentation, content, letter, and questionnaire. He also reviewed pre-suasion concepts from Robert Cialdini, which inspired the new approach: 1) establishing trust initially can increase compliance later, 2) creating a privileged moment, 3) focal arguments in one’s mind tend to be causal, and 4) utilizing normative appeals to magnetize attention to the focal argument. Dillman asked, “How might these concepts be built into a sequence of letters intended to elicit response?” and “How should one use connective language relevant to the community?” First, the communication should explain why WSU is conducting the study; the second contact extends connections; the third contact emphasizes local questions and provides feedback on results to date; and the fourth contact is a reminder postcard. The concepts were built into the questionnaire in several ways: the cover page connects to the community (two examples, one emphasizing location and the other not); questions on the local community appear up front; and transitions are used to explain questions throughout.


The survey was developed for the two experimental conditions. In addition, the cover letter explained why it was coming to people in West Virginia from WSU. Everyone receives a $2 incentive at Week 1 (evidence indicates that such incentives work); this is not an experimental factor.


A 2x2 design is planned, crossing a persuasion letter versus a standard letter with a persuasion questionnaire versus a standard questionnaire. The research team is aiming for a response rate of about 40%.
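As a rough illustration of how sampled addresses might be randomly allocated to the four cells of such a 2x2 design, the sketch below uses hypothetical placeholder addresses and condition labels; it is not the research team's actual assignment procedure.

```python
import random
from collections import Counter

# Hypothetical address identifiers standing in for a sampled address list.
addresses = [f"address_{i:04d}" for i in range(2000)]

letters = ["standard_letter", "persuasion_letter"]
questionnaires = ["standard_questionnaire", "persuasion_questionnaire"]
cells = [(l, q) for l in letters for q in questionnaires]  # the four 2x2 cells

random.seed(42)            # fixed seed so the allocation is reproducible
random.shuffle(addresses)  # randomize order before systematic allocation

# Rotate through the four cells so each condition receives an equal share.
assignment = {addr: cells[i % len(cells)] for i, addr in enumerate(addresses)}

print(Counter(assignment.values()))  # 500 addresses per experimental condition
```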


Comparison of online opt-in panels with address-based sample surveys


A second area discussed was the use of online panels and how they compare with probability samples. Ginny Lesser (Oregon) talked about one of her studies this year comparing a probability sample and an opt-in panel using the same questionnaire. First, she contacted Knowledge Networks about an online probability sample of n = 2,000, but this was going to cost $45,000.


The “standard” probability sample had 3,750 in the mail group and 3,750 in the web+mail group and used addresses from the USPS Delivery Sequence File. In addition, Lesser experimented with an “I Love Oregon” sticker (state outline, green heart) as an incentive insert; response rates for token incentives are modest, and they did not work well here. A comparison of 4 versus 5 contacts for web+mail showed a 4%-6% bump from the fifth mailing, with the overall rate in the low to mid 20s. Over the past decade, the percentage of respondents going to the web has grown to about half of all completed surveys.


For the opt-in nonprobability panel, Lesser used a Qualtrics commercial panel, recruited from Qualtrics’s business partners to take opt-in surveys. Qualtrics recruits respondents, and Lesser asked for Oregon residents with specific demographics. The panel was matched to the demographic request, people were invited to participate, and Qualtrics gave a “points” incentive that can be cashed in for gift cards. To reach the target of 500 completes, Qualtrics needed to contact 7,250 individuals. Over four contacts, 457 completed the survey, so another sample of 790 was run to reach the 500 completes she had contracted for initially. Qualtrics does not provide the list of people who are contacted. The overall response rate was 6.3%. Lesser noted that Qualtrics removes the “speeders” – people who just click down a column or complete the survey too quickly.


Lesser compared the opt-in panel responses with those from the probability sample, using the point estimate for the nonprobability Qualtrics sample and 95% confidence intervals for the probability sample. She also weighted the responses. Over the 265 questions, 63% of the panel estimates fell outside the confidence limits of the probability sample. Lesser also compared item nonresponse: there was a higher percentage of missing answers in the probability sample. It may be that panel members, in order to earn reward points, cannot leave as many “no answer” responses.
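A minimal sketch of this kind of comparison (checking whether an opt-in panel's point estimate for an item falls inside the 95% confidence interval from the weighted probability sample) is shown below. The data, weights, and the Kish effective-sample-size adjustment are illustrative assumptions, not Lesser's actual estimator or data.

```python
import numpy as np

def weighted_proportion_ci(responses, weights, z=1.96):
    """Weighted proportion and approximate 95% CI for a yes/no survey item.

    responses : array of 0/1 answers from the probability sample
    weights   : survey weights for those respondents
    """
    responses = np.asarray(responses, float)
    weights = np.asarray(weights, float)
    p = np.average(responses, weights=weights)
    # Kish effective sample size to roughly account for unequal weighting
    n_eff = weights.sum() ** 2 / (weights ** 2).sum()
    se = np.sqrt(p * (1 - p) / n_eff)
    return p, (p - z * se, p + z * se)

# Illustrative (made-up) data: probability-sample answers and weights,
# plus an opt-in panel point estimate for the same item.
rng = np.random.default_rng(1)
prob_answers = rng.binomial(1, 0.42, size=800)
prob_weights = rng.uniform(0.5, 2.0, size=800)
panel_estimate = 0.55

p_hat, (lo, hi) = weighted_proportion_ci(prob_answers, prob_weights)
outside = not (lo <= panel_estimate <= hi)
print(f"probability sample: {p_hat:.3f} (95% CI {lo:.3f} to {hi:.3f}); "
      f"panel estimate {panel_estimate:.3f} outside CI: {outside}")
```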


Steve Swinford (Montana) discussed his experiences using Qualtrics panels in several studies at the Center for Health and Safety Culture. He has used them more than 10 times, mostly for pre-testing new items but sometimes for very specific groups. These efforts usually collected about 75 completed surveys and cost about $600 per instance, and results can sometimes be obtained in 6 hours. He also reported a problem with getting rural samples because Qualtrics does not have the database for very specific rural samples, e.g., Utah, rural, aged 18-44.


Swinford reported that the cost was normally about $6-$8 per completion, which is both faster and less expensive than mail. He shared several examples of recent studies:



  • Idaho traffic safety study

    • Used it for pretesting instrument a couple of times

    • We did household mailout and online panel

    • Results were “close” in the end



  • Oregon – health care providers

    • Paid $50-$75 per completion, but we got them

  • Dating violence in high schools

    • Needed 18-year-olds in HS…got 8




Swinford commented that it is not perfect but is another tool to use. He noted that he did not use panels as the primary data collection method, but the results have consistently been close to what is obtained using conventional random samples.


Zhengyuan Zhu (Iowa) reported on the AVMA Pet Ownership and Demographic Survey. He conducted an address-based pilot of dog ownership in 2015, and then two more surveys, the Pet Demographic Survey (PDS) and the Metropolitan Market Survey (MMS), were done online using panels. The PDS is conducted every five years to estimate the percentage of households that have different types of pets. The ISU study for 2017 is using SSI with a goal of 50,000 completes. The MMS focuses on one type of pet, adding detail about that pet type. Zhu looked at the 2012 data (the previous panel data) to discern the sampling and weighting used, as well as to assess the impact of eliminating the split sample design that was previously used. There was a claim that the sampling in 2012 was representative, but no data were given to ISU to verify this, nor was the documentation clear about the procedures used to manipulate the data. It appears that the construction of the panel makes the representativeness questionable. No demographic information was asked in the questionnaire, so it all comes from information in the original profile. Zhu also estimated standard errors for the 2012 data, although there is a lack of certainty about the estimates overall; he did this to arrive at a scheme for doing so in 2017.


Zhu also reported on using a Google survey: it had one question and cost 15 cents per complete. He reported that the estimates were close to the adjusted estimates from a Qualtrics survey. There is no conclusion to draw; he simply noted the estimates that this approach yielded.


Glenn Israel (Florida) reported on an opportunity to field both a Qualtrics nonprobability quota sample and a probability address-based sample for a survey on climate change in Florida. The online Qualtrics survey was completed in 8 days, with 514 respondents completing it in November 2016; he contracted at $5 per complete. With a few exceptions, the online and mail questionnaires were the same. About 800 people accessed the survey to yield the 514 completes.


A mail survey was started about the same time. There was no pre-letter; it had an initial packet with a cover letter, questionnaire, and postage-paid return envelope, followed by a reminder postcard, then a second questionnaire and a third questionnaire to nonrespondents. To date, the response rate is 16.3% on the mail survey with a sample size of 1,500. A second replicate of 500 is in the field using a mixed-mode protocol. Israel will report results at the next meeting.


 


State Reports


Bill Stewart and Carena van Riper (Illinois) shared information about their pursuit of high response rates as part of the Parks and Environmental Behavior Work Group. They reported on research examining sense of place – place making – with a community and belonging theme. The study was conducted in an urban context (Chicago’s south side) and addressed land vacancy. There were 25,000-30,000 vacant lots (a common urban problem), and leaders were attempting to re-develop these spaces; there is hope for the neighborhood and redevelopment. The study was intended to measure residents’ perceived impact of vacant lot buying on their community and, hence, was a social assessment of community engagement. The researchers worked with partners that included several NGOs and neighborhood associations, and the survey of large-lot owners achieved a 71% response rate. Data collection included a first mailing of the questionnaire, a postcard reminder, a second survey, phone calls to non-respondents by the City, and a third replacement survey. About 58% responded before the phone call. Possible reasons for the strong response rate include issue salience; relationships built within the policy chain (an introductory letter from the City, the promise of survey response as a voice in the decision-making process, word of mouth due to focus groups, and phone calling from the City prior to the third wave of questionnaires); and the $1 incentive enclosed in the first questionnaire (respondents basically got their dollar back). Good buzz – word of mouth – was generated during the data collection process.


Stewart and van Riper also discussed a study on community resilience in protected grasslands. Rural communities face challenges to development, and protected grasslands are part of this. They are attempting to understand changes in the social and economic conditions of rural communities near protected grasslands. The study will focus on two counties in Illinois and Iowa, with bison reintroduction as part of the issue. The survey will focus on trade-offs among future growth scenarios and be administered via mail. It will include a stated choice experiment to assess the relative importance of attributes in a design. This will include determining the relative importance of 6 or 9 attributes identified in focus groups and then developing profiles (subgroups) defined by community attachment. Pilot testing is planned for May 2017, and the main data collection for late summer/fall. Discussion focused on acquiring the sample, sampling frame and methods, and estimating nonresponse bias.
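As a rough sketch of how candidate choice profiles can be enumerated from attribute levels before a fractional design is selected, the snippet below uses hypothetical attributes and levels; the study's actual attributes will come from its focus groups.

```python
from itertools import product

# Hypothetical attributes and levels, for illustration only; the study's real
# attributes would come from the focus groups described above.
attributes = {
    "housing_growth": ["none", "modest", "rapid"],
    "grassland_protection": ["current", "expanded"],
    "bison_herd_size": ["small", "large"],
}

# Full factorial enumeration of candidate profiles. A fractional subset would
# normally be selected from these to keep the mail questionnaire short.
profiles = [dict(zip(attributes.keys(), combo))
            for combo in product(*attributes.values())]

print(len(profiles), "candidate profiles")  # 3 x 2 x 2 = 12
print(profiles[0])
```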


 


Friday, February 17, 2017


Call to order at 8:05 a.m.


Steve Swinford (Montana) described a study of alcohol use at university events: SAFE – Substance Abuse Free Environment. He described the methods used and the purpose of the study, which involves multiple surveys conducted in multiple modes. The study was funded internally and originally focused on football tailgating at home games; it was then expanded to all public campus events involving alcohol sales/distribution. The methods included: 1) observation work (done), 2) interviews with key stakeholders, 3) analysis of policies at peer institutions, and 4) survey work with community members. He planned a sample of 1,200 and will use online data collection and mail with four contacts: pre-letter, letter (survey), reminder, and replacement.


Ginny Lesser (Oregon) reported that each year the survey center conducts about 8 surveys for the state of Oregon. The Oregon DOT study measures satisfaction with highway maintenance (since 2000). Lesser noted that general and specific ordering of items has been studied and published. She reported that the study uses a probability sample of about 4,130 licensed drivers (half all mail-back, half web+mail). Lesser used four contacts for mail only and five for web+mail (pre-letter, first mailing, postcard, second mailing, and a third mailing for web+mail only). The response rates were 34% overall, 27% for web+mail with 4 mailings, and 31% for web+mail with 5 mailings. The fifth mailing boosted the response rate, but those responses came back mainly on paper. Lesser also reported on a Control of Litter survey and found no consistent differences between all-mail and web+mail respondents in answers to specific or general questions. She will look at this again in two years, when the same study is repeated. Lesser noted that urban areas responded more via web and rural areas more via mail.


Zhengyuan Zhu (Iowa) reported briefly on the 2015 and 2016 Iowa Nutrient Management Survey and noted that the 2017 version goes out in early March. The focus was on water quality impacts of agricultural nutrients, looking at farmer knowledge, barriers to reducing nutrient loss, and changes over time. The sampling design identified priority watersheds. He reported that the 2015 study had n = 1,746 and a response rate of 47%. Zhu provided a brief description of logistic regression analysis results and discussed the measurement of knowledge.


Hua Qin (Missouri) reported on conducting a systematic review and meta-analysis of survey research. He sampled eleven papers dealing with survey research methods and discussed them broadly. Qin said this provided an opportunity to develop an agricultural/natural resource meta-analysis around survey methods, defining an area that might have enough treatments to analyze. He suggested this may become a future endeavor of the WERA group. Don Dillman noted that there was some previous integrative work on related topics, for example forced-choice versus check-all-that-apply questions.


Todd Rockwood (Minnesota) briefly talked about an issue of knowledge about internal organs. He observed that health is only known with respect to disease and this creates challenges for other measurement approaches.


Fern Willits (Pennsylvania) suggested organizing one or more sessions at the upcoming annual meeting of the Rural Sociological Society this July to share and extend to a larger audience the fruits of our discussions concerning the uses of commercial survey providers. The WERA attendees suggested aiming for two "panel discussion" sessions as follows: 1) "An Overview of Commercial Survey Service Providers," for which representatives of several such service organizations would be invited to describe their services concerning sampling procedures, data collection options, and survey consultation; and 2) a second panel, tentatively titled "Experiences and Lessons Learned Using Commercial Survey Service Providers," which would engage various WERA participants and (hopefully) audience members in sharing the pros and cons of their experiences. Todd Rockwood and Melissa Constantine agreed to work with Willits to try to make this happen.


Willits also described her interest in engaging other WERA members in analysis that would examine the consistency (or not) of observed relationships between survey variables if the analysis were carried out at different stages of data collection/follow-up. There has been considerable research on differences between the characteristics of early and later responders, but she did not know of research that addresses whether observed relationships between or among variables differ depending on whether subjects responded early or later to solicitations for participation. To assess this idea, one needs access to substantive survey data sets that include the date of response for each subject as well as the information on the variables of interest. Analysis would then require examining the relationships between or among selected variables using only cases responding to the first wave of contact, and similar analysis after one or more later waves. She did not have any such data available and requested that anyone who has such information and would be willing to share it (or join forces with her) to pursue such analysis contact her. Ginny Lesser (Oregon) pointed out that proprietary constraints in some studies might be a problem, but these may not be insurmountable.


Jessica Goldberger (Washington) discussed a series of data sources based on survey collections she has worked with over her career. The first was a 2007 survey of certified organic producers, which asked producers about biodegradable plastic mulch and barriers to use of the technology. She is head of the technology adoption working group on a 5-year USDA project. A second survey, of strawberry growers, focused on various benefits and costs of the use of plastic film. A biodegradable option does exist, but it comes with some costs, and none of the products currently meet the standards for organics; some strawberry producers are therefore not able to utilize the technology. Goldberger described the questionnaire design and sample, in which 1,553 growers in 6 states were sampled. The mailing list was purchased from Meister Media (n = 1,357) and supplemented with Oregon and Washington names (n = 196). The data collection used four contacts: pre-letter, first full mailing, reminder postcard, and second full mailing. A web option was provided in all mailings. The initial response rate was about 18%, and phone calls were conducted with 290 non-respondents. There was a higher proportion of ineligibles than anticipated, resulting in an adjusted response rate of about 21% and 227 usable questionnaires. Another farmer survey is planned for Fall 2017, and WERA members discussed how to improve the response rate on this next survey.
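For readers unfamiliar with how an adjusted response rate of this kind is typically computed, the sketch below shows one simple way to remove known ineligibles (and discount cases of unknown eligibility) from the denominator; the figures and the exact adjustment rule are illustrative assumptions, not the study's actual calculation.

```python
def adjusted_response_rate(completes, sample_size, known_ineligible, eligibility_rate=1.0):
    """Response rate after removing ineligible cases from the denominator.

    eligibility_rate discounts the remaining cases of unknown eligibility,
    e.g., using an estimate from a phone follow-up of non-respondents.
    """
    unknown = sample_size - known_ineligible - completes
    denominator = completes + eligibility_rate * unknown
    return completes / denominator

# Purely illustrative figures, not the actual study numbers.
print(round(adjusted_response_rate(completes=200, sample_size=1200,
                                   known_ineligible=250, eligibility_rate=0.9), 3))
```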


Glenn Israel (Florida) reported on follow-up data collection in a presentation titled “Can Clarifying Instructions Influence Response to Numerical Open-ended Questions in Self-administered Surveys?” Last year, he reported that 20% of his respondents were using phones and 10% were using tablets, which led to reformatting the questionnaire to improve navigation. Given this, for the Q10 and Q11 experiment in 2016, specific information on the information obtained from Extension was “mail-merged” into each individual’s questionnaire. The results suggested the instructions helped, but they were not as clear-cut as those for 2015, when the instructions were not individualized.


Glenn Israel also reported on an additional experiment that examined effects of stem and response order on response patterns in satisfaction ratings. A 2x2 experiment manipulated the order of “satisfied” and “dissatisfied” in the question stem and the order of the “very satisfied” to “very dissatisfied” response options. The 2016 experiment tested Q7 (overall satisfaction with the Extension office). A large response order effect was found, a finding that has been replicated in other studies in Florida and Nebraska. In addition, the column format seemed to enhance the magnitude of the effect in 2016. In summary, there is clear evidence of response order effects, and satisfied/dissatisfied items should start with positive answers first, as this is consistent with the heuristics used by respondents.
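One common way to test a response order effect of this kind is a chi-square test of independence between the response-order condition and the distribution of selected categories; the sketch below uses made-up counts purely for illustration and is not Israel's actual analysis.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are the two response-order conditions
# (positive categories listed first vs. negative categories listed first),
# columns are the selected categories from "very satisfied" to "very dissatisfied".
counts = np.array([
    [220, 140, 40, 15, 10],   # positive-first condition
    [180, 150, 55, 25, 15],   # negative-first condition
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```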


The next meeting is scheduled for February 22-23, 2018, in Tucson.


Adjourned at 3:15 pm


 

Accomplishments

An agenda was developed and the coordinating committee held its annual meeting in February 2017. Participating members discussed several important topics affecting error in agricultural and rural surveys, including effective survey communication and comparisons of online opt-in panels with address-based sample surveys. In addition, members reported on survey research studies being conducted or planned in their states and provided feedback to others. Members from several states discussed plans for studies comparing nonprobability samples in online surveys with address-based probability samples using mail and/or mixed-mode surveys in order to assess the strengths and weaknesses of these technologies, assessing the utility of different theories for inviting people to respond to a survey, assessing the order of concepts in question stems and responses for satisfaction items, and assessing the effects of follow-up contacts on sample characteristics and substantive research findings, as well as other topics.

During the year, work was completed on a special issue of the Journal of Rural Social Sciences focusing on survey research methods. Coordinating committee chair Israel led efforts to solicit manuscripts and served as guest editor for the special issue, which contained an introductory article, six research articles, and a commentary. Each article involved committee members as an author, co-author, or reviewer. In addition to the special issue, committee members have been active in publishing research in journal articles, presenting papers and posters at relevant conferences, and developing educational materials available to Extension professionals and the public during the past year. This includes publishing 12 survey methods-related articles, updating 5 publications for Extension and outreach audiences, and making 13 presentations at professional conferences where attendees are members of the target audience for this project. In addition, the member from Florida conducted a 2-hour demonstration workshop on optimizing mixed-mode surveys for respondents using mobile technology at the American Evaluation Association annual meeting and trained 75 Florida Extension professionals on the use of online survey tools, incorporating research of the coordinating committee. Members from Florida, Minnesota, Oregon, and Pennsylvania participated in panel sessions at the annual meeting of the Rural Sociological Society in July 2017. One panel, Contributions and Issues Related to the Use of Commercial Survey Services, involved participants from three firms, while the second panel, Promise and Pitfalls: Experiences and Lessons Learned from Using Commercial Survey Services, was comprised of WERA 1010 members. Attendees at these panel sessions learned about available services for conducting surveys as well as the pros and cons of using some of these services.

Publications

1. Battaglia, M., Dillman, D. A., Frankel, M. R., Harter, R., Buskirk, T. D., McPhee, C. B., DeMatteis, B., Montaquila, J., & Yancey, T. 2016. Sampling, data collection, and weighting procedures for address-based sample surveys. Journal of Survey Statistics and Methodology, 4(4): 476-500.

2. Dillman, D. A. 2016. Moving Survey Methodology Forward in Our Rapidly Changing World: A Commentary. Journal of Rural Social Sciences, 31(3): 160-174.

3. Dillman, D. A., & Edwards, M. L. 2016. Chapter 17: Designing a Mixed-Mode Survey. In Wolf, Christof, Joye, Dominique, Smith, Tom W., and Fu, Yang-chih (eds.), Sage Handbook of Survey Methodology. Thousand Oaks, CA: Sage Publications, pp. 255-268.

4. Dillman, D. A., Hao, F., & Millar, M. M. 2016. Chapter 13: Improving the Effectiveness of Online Data Collection by Mixing Survey Modes. In Fielding, Nigel, Raymond M. Lee, and Grant Blank (eds.), The Sage Handbook of Online Research Methods, 2nd edition. London: Sage Publications, pp. 220-237.

5. Flint, C. G., Mascher, C., Oldroyd, Z., Valle, P. A., Wynn, E., Cannon, Q., Brown, A., & Unger, B. 2016. Public Intercept Interviews and Surveys for Gathering Place-Based Perceptions: Observations from Community Water Research in Utah. Journal of Rural Social Sciences, 31(3): 105-125.

6. Harter, R., Battaglia, M. P., Buskirk, T. D., Dillman, D. A., English, N., Mansour, F., Frankel, M. R., Kennel, T., McMichael, J. P., McPhee, C. B., Montaquila, J., Yancey, T., and Zukerberg, A. L. 2016. Address-based Sampling. American Association for Public Opinion Research Task Force Report, 140 pages. http://www.aapor.org/getattachment/Education-Resources/Reports/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx

7. Israel, G. D. 2016. Advances in Survey and Data Analysis Methods for Rural Social Scientists: An Introduction. Journal of Rural Social Sciences, 31(3): 1-6.

8. Jackson-Smith, D., Flint, C. G., Dolan, M., Trentelman, C. K., Holyoak, G., Thomas, B., and Ma, G. 2016. Effectiveness of the Drop-Off/Pick-Up Survey Methodology in Different Neighborhood Types. Journal of Rural Social Sciences, 31(3): 35-67.

9. Kumar Chaudhary, A., & Israel, G. D. 2016. Influence of Importance Statements and Box Size on Response Rate and Response Quality of Open-ended Questions in Web/Mail Mixed-Mode Surveys. Journal of Rural Social Sciences, 31(3): 140-159.

10. Lesser, Virginia M., Newton, Lydia D., Yang, Daniel K., & Sifneos, Jean C. 2016. Mixed-Mode Surveys Compared with Single Mode Surveys: Trends in Responses and Methods to Improve Completion. Journal of Rural Social Sciences, 31(3): 7-34.

11. Trentelman, C. K., Irwin, J., Petersen, K. A., Ruiz, N., & Szalay, C. S. 2016. The Case for Personal Interaction: Drop-Off/Pick-Up Methodology for Survey Research. Journal of Rural Social Sciences, 31(3): 68-104.

12. Willits, F. K., Theodori, G. L., & Luloff, A. E. 2016. Another Look at Likert Scales. Journal of Rural Social Sciences, 31(3): 126-139.

Impact Statements

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Date of Annual Report: 04/13/2018

Report Information

Annual Meeting Dates: 02/22/2018 - 02/23/2018
Period the Report Covers: 10/01/2017 - 09/30/2018

Participants

1. Ginny Lesser – Oregon State University
2. Shannon Norris –Texas A&M University
3. Lacy Roberts – Texas A&M University
4. Gladys Walter – Texas A&M University
5. Stacy DeWalt – Texas A&M University
6. Katie Dentzman – Washington State University
7. Kenny Wallen – University of Arkansas - Monticello
8. Ashley Yopp – University of Georgia
9. Lou Swanson (Administrative Advisor) – Colorado State University
10. Jan Larson – Iowa State University
11. Natalie Davis – Texas A&M University
12. Billy McKim – Texas A&M University
13. Todd Rockwood – University of Minnesota
14. Fern (Bunny) Willits – Penn State University
15. Yano Prasetyo – University of Missouri
16. Glenn Israel (Chair) – University of Florida
17. Don Dillman – Washington State University

Brief Summary of Minutes

Committee business



  • Katie Dentzman volunteered to serve as the meeting secretary.

  • A brief history of the group was provided for new attendees by Chair Glenn Israel. He described:

    • Formally organized as regional research project W-183; transitioned to WERA 1001 (a coordinating committee) for 5 years; renewed twice as WERA 1010.

    • Several publications have come out of coordinated research projects

      • An entire issue of Rural Sociology dedicated to measurement

      • Testing of incentives in late 90's and early 2000's

      • A number of articles on survey item nonresponse in Survey Practice

      • 2016 special issue of the Journal of Rural Social Sciences



    • One goal for the next two days - look for opportunities to collaborate and coordinate efforts on data collection and publication



  • Registration fee for meeting room was addressed.

  • Plans for dinner at the Skyline Country Club on Thursday evening were discussed.

  • The chair requested that attendees provide materials for the annual report, including a Word document of the state report and presentations made at the meeting.


Comments from Administrative Advisor


Lou Swanson reviewed the renewal process and Glenn Israel reported that the renewal proposal would be completed and submitted by March 1, 2018. Swanson also emphasized the importance for this group of showing impact.



  • How do we impact extension, for example, in the Western Region

  • Demonstrate clearly (ex. for NIFA) our accomplishments


Thematic Issues for discussion


Effective survey communications (updated from 2017) – Don Dillman (WSU)


Dillman commented that the Census Bureau (and all surveys) are having issues with communication and trust. He noted that address-based samples have the best household coverage and that a mail contact can be used to push respondents to the web; evidence suggests mail-only strategies have the highest response rates. Dillman argued for a comprehensive design strategy based on either social exchange or another theory. Social exchange hasn't been tested against another comprehensive design strategy. One candidate is 'Presuasion' theory by Robert Cialdini, which posits 1) establish trust, 2) create a privileged moment, 3) transition from attention to response through linkage, and 4) use magnetizers such as adding mystery to the task of responding.


Dillman said communication occurs via the envelopes, the letters explaining the request, the questionnaire, and any enclosures. This experiment focused on paper questionnaires and 4 letter contacts. The paper questionnaires had different cover pages (generic versus detailed with counties and pictures), different first-question layout and wording (a generic first question versus a magnetized first question about how great it is to live in West Virginia), and different callouts (questions without context versus a callout box with context and a signature). Likewise, the contact letters had different content, with the Presuasion letters being much more personal: explaining the researcher's experience and connection with the area, referencing specific questions in the questionnaire, and mentioning specific discussions with local people. The main differences were in the first paragraph or two of the letters.


The experiment used a 2x2 design crossing letter type with questionnaire type: Presuasion letter with Presuasion questionnaire, Exchange letter with Presuasion questionnaire, Presuasion letter with Exchange questionnaire, and Exchange letter with Exchange questionnaire. The survey instrument, 'What's next for Southern West Virginia?', was prepared in two formats, one based on social exchange and one on Presuasion.


Analysis showed that Presuasion letters decreased response rates compared to Exchange letters, while questionnaire format made no difference: Exchange letter and Exchange questionnaire, 24%; Exchange letter and Presuasion questionnaire, 23%; Presuasion letter and Exchange questionnaire, 19%; Presuasion letter and Presuasion questionnaire, 19%. The letter effect was statistically significant. In addition, there were no significant differences in item nonresponse or in responses to the 'additional comments' question.
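For illustration, a minimal Python sketch of how the letter effect on response rates could be tested follows. Only the four reported response rates come from the presentation; the per-condition sample size and the use of a chi-square test (rather than whatever analysis was actually run) are assumptions.

    # Sketch: chi-square test of response rates across the four 2x2 conditions.
    # The rates (24%, 23%, 19%, 19%) are from the presentation; the per-condition
    # sample size of 500 is a hypothetical placeholder.
    from scipy.stats import chi2_contingency

    n_per_condition = 500  # hypothetical mailings per condition
    rates = {
        "exchange_letter_exchange_q": 0.24,
        "exchange_letter_presuasion_q": 0.23,
        "presuasion_letter_exchange_q": 0.19,
        "presuasion_letter_presuasion_q": 0.19,
    }

    # Build a (condition x responded / did-not-respond) contingency table.
    table = [
        [round(rate * n_per_condition), round((1 - rate) * n_per_condition)]
        for rate in rates.values()
    ]

    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")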


In the following discussion, Kenny Wallen said he did something similar with a web-based contact letter for his dissertation and got no statistically significant differences. It was asked, “Is there an issue of perceived authenticity?” Dillman suggested that further investigation on shaping letters is needed and we should also start trying these ideas with web-push methods. Other observations included:



  • Color questionnaires do not increase response rate and are also more expensive

  • Saying in the cover letter that taking the survey online saves the state money increases response rate

  • The personalization of marketing emails and similar materials is making survey research, and especially Presuasion-style formatting, increasingly difficult.


Online panels for testing items (Todd Rockwood, Minnesota)


A number of rural sociologists are using commercial panel samples. Rockwood described using online panels for question diagnostics, a stage before cognitive interviewing, on topics such as women's bladder health. Many MDs and PhDs were involved in item development and refinement, including numerous direct care providers but only two survey methodologists.


The study focuses on measurement issues: people can generally rate their overall health but are not aware of specific body parts or organs unless there is a health problem with them (the brain and the mouth are exceptions); generally, respondents have no clue what is going on with their bladders. A solution is to ask the right questions so that bladder health can be inferred. The approach uses the notion of the pathological to inform understanding of the normal: rather than asking about the health of one's bladder, ask about the quality of one's bladder compared to others.


The project has to deal with several sources of measurement error, including social desirability, recall error, and respondent-added exceptions. The online panel can help identify sources of error for questions and suggest how to address them; items can then be refined in subsequent cognitive interviews. The experiment involved 2,000 women on a panel who assessed questions with regard to the points below (a sketch of how such ratings might be summarized follows the list):



  • Clear versus vague

  • How much you had to think about the question

  • What different parts of a question mean to you (e.g., 'too frequently', 'have you ever')

  • Are respondents adding exceptions into the question? (e.g., pregnancy)
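As a rough illustration of how such panel-based diagnostics might be summarized before cognitive interviewing, here is a minimal Python sketch. The item names, rating scale, and flagging thresholds are hypothetical; the actual Minnesota instrument and coding scheme are not reproduced here.

    # Sketch: summarize clarity ratings and respondent-added exceptions per item,
    # flagging items that warrant follow-up in cognitive interviews.
    import pandas as pd

    ratings = pd.DataFrame({
        "item": ["Q1", "Q1", "Q2", "Q2", "Q3", "Q3"],  # hypothetical items
        "clarity": [5, 4, 2, 3, 4, 5],                 # 1 = very vague, 5 = very clear
        "added_exception": [0, 0, 1, 1, 0, 0],         # 1 = respondent added an exception
    })

    summary = ratings.groupby("item").agg(
        mean_clarity=("clarity", "mean"),
        pct_exception=("added_exception", "mean"),
    )

    # Hypothetical cutoffs for flagging problem items.
    flagged = summary[(summary.mean_clarity < 3.5) | (summary.pct_exception > 0.25)]
    print(flagged)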


Bunny Willits asked whether we want to know objective or subjective bladder health; what is the point of these survey questions? Rockwood hopes to convince NIH to drop ineffective questions and to interpret the data correctly, given how respondents perceive the questions.


Framing (message) and Response Rates - Kenny Wallen, Carena van Riper, and Elizabeth Golebie


The study targets anglers for a survey of aquatic invasive species in the Great Lakes. Research questions include: How should cover letters be framed to increase response rates? Does framing work and, if so, which frames? If the cover letter is framed to align with cultural values, will response rates go up? How does this play out across multiple contacts? Wallen proposes to use Cultural Value Theory, which has four main categories (Fatalism, Hierarchy, Individualism, and Egalitarianism) concerning how people should be grouped and how they should interact. The experiment randomly assigns four versions of the cover letter, using different language and images:



  • Fatalist – individual needs and survival

  • Egalitarian – save the ecosystem for group/future generations

  • Individualism – make sure you get the most benefit

  • Hierarchical – regulation and preservation of tradition


Wallen plans to compare the value frame of each respondent's cover letter to the respondent's self-reported values. The study will not use incentives or framing within the questionnaire itself.
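A minimal sketch of the random assignment step follows, assuming a simple shuffled round-robin allocation; the sample size, ID scheme, and function name are hypothetical, and only the four frame labels come from the presentation.

    # Sketch: randomly assign sampled anglers to the four cover-letter frames.
    import random

    FRAMES = ["fatalist", "egalitarian", "individualist", "hierarchical"]

    def assign_frames(sample_ids, seed=42):
        """Shuffle the sample and deal IDs round-robin into the four frames,
        keeping group sizes as equal as possible."""
        rng = random.Random(seed)
        ids = list(sample_ids)
        rng.shuffle(ids)
        return {sid: FRAMES[i % len(FRAMES)] for i, sid in enumerate(ids)}

    assignments = assign_frames(range(1, 2001))  # hypothetical sample of 2,000 anglers
    print(sum(1 for f in assignments.values() if f == "egalitarian"))  # ~500 per frame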


During the group's discussion, there was a question of whether the cover letter might influence how respondents self-report their values; that is, the cover letter may have already framed, or primed, their responses. On the other hand, it might be a more tailored, authentic version of 'Presuasion'. In addition, social cognitive theory might prove a better fit than social exchange theory for the study.


Don Dillman said the letters won't make much difference, and even less if the message is hidden halfway into the cover letter; put it in the first paragraph. He also suggested using more direct, less academic language and changing the message in the second and third mailings. Dillman further noted that offering the choice to go online or complete a paper survey will reduce the response rate, and Ginny Lesser recommended using paper only for this population. Glenn Israel suggested that the non-manipulated parts of the cover letter are already value-laden, so it is important to make sure the message is not confounded. Todd Rockwood suggested putting the framing message on the envelope so it has a higher chance of being read; he opined that people do not tend to actually read cover letters, though there was disagreement on this point. Finally, Lou Swanson said 'invasive species' is not a good term; he noted that brown trout and zebra mussels are both invasive but have completely different connotations, so it might be a good idea to add 'such as zebra mussels'.


Ginny Lesser (Oregon)


Lesser reported on a multi-mode study comparing general versus specific question order using a survey on highway conditions. The experiment had two groups: 1) the general question first, then the specific items, and 2) the specific items first, then the general question. The questionnaire asked about highway conditions, with one item on the general condition of highways and other items on specific issues (litter, potholes, etc.). The sample size was 4,000; half received mail only and half received a letter directing them to the web, with mail follow-up for non-respondents. She found a significant effect of question order (asking the general question last produced higher satisfaction ratings) but no effect of age, gender, or delivery mode, except for an interaction with gender: women reported being more satisfied when the general question was last.
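As an illustration, here is a minimal Python sketch of testing the order effect and its interaction with gender on the satisfaction rating. The data are simulated and the variable names are hypothetical; only the design (general-first versus general-last, mail versus web+mail) follows the presentation.

    # Sketch: regression of satisfaction on question order, gender, mode, and age,
    # with an order-by-gender interaction, using simulated data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "general_last": rng.integers(0, 2, n),
        "female": rng.integers(0, 2, n),
        "mode": rng.choice(["mail", "web_mail"], n),
        "age": rng.integers(18, 90, n),
    })
    # Simulated outcome with a small order effect and an order-by-gender interaction.
    df["satisfaction"] = (
        3 + 0.3 * df.general_last + 0.2 * df.general_last * df.female
        + rng.normal(0, 1, n)
    )

    model = smf.ols("satisfaction ~ general_last * female + C(mode) + age", data=df).fit()
    print(model.summary())  # inspect the general_last and general_last:female terms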


In the ensuing discussion, Don Dillman commented that asking the general question last makes sense because the specific questions establish recall first, but Ginny Lesser said that sometimes you want that first impression. Todd Rockwood also disagreed with Don, saying he thinks the general question should be first. Likewise, Bunny Willits said she prefers the general question first because otherwise you are priming respondents too much.


Lesser also reported that she has been tracking response rates for this survey since 1999. The response rate has steadily declined over 16 years, and the decline has been consistent across gender and age.


For the next year, Lesser has several studies planned, including a DMV survey comparing mixed-mode and single-mode approaches that will use a double-sided Spanish/English instrument. She plans to assess order effects, as well as to repeat the probability versus non-probability panel comparison. In the initial study, there was evidence that web first, then mail, is the most cost-efficient approach for probability samples, while the non-probability panel was by far the cheapest.


Applying an herbicide resistance survey in the Pacific Northwest – Katie Dentzman (Washington)



  • Weed management survey

  • Revision of previous survey questions

  • Trying to figure out where to get an effective sampling frame; it may not need to be probabilistic


CSSM Report - Jan Larson (Iowa State)



  • Data Science Services is the biggest part

  • Survey Research Center is a smaller part doing human subjects research

  • Currently doing a LOT of web surveys with mixed success

  • Successfully transferred the faculty ‘hours worked’ study to online administration


    • Renamed from ‘survey’ to ‘report’ online

    • Provide a paper ‘worksheet’

    • Has definitely increased response rate


  • Issues with NASS mean CSSM is getting more business and is swamped with mail surveys

  • CSSM does some telephone surveys, but that work has fallen off significantly


    • Ex.: Iowa Farmland Ownership and Tenure Study


  • A lot of observational and field data collection


    • Ex.: Surveys in driver license offices

    • People tend to be cooperative, yielding quite a few responses in a relatively short time at relatively low cost


  • Mixed field, phone, mail, and device data collection


    • Looking at what people over- or underreport regarding physical activity

    • Testing 24 hour recall



FEBRUARY 23rd


Comparing multi-mode and single-mode probability-based surveys with a non-probability survey - Ginny Lesser (Oregon)


Lesser suggested that adding a non-probability sample (such as panel data) to a probability sample might be a way of dealing with increasing nonresponse. Her research included a probability sample with half in a mail-only group and half in a web+mail mixed-mode group. The population was adults in Oregon, surveyed to assess satisfaction with highway maintenance, and 7,400 addresses were purchased from MSG. The non-probability sample was purchased from Qualtrics, whose panel members are recruited from business partners; Lesser requested Oregon residents with specific demographics. The probability sample obtained a 27% response rate for the mail-only group and 21% for the mail+web group. An “I love Oregon” sticker incentive did nothing to change the response rate. For the non-probability sample, 7,250 invitations were sent to obtain the 500 responses that were paid for. With regard to the results, 66% of panel estimates were outside the confidence intervals of the probability sample. Overall, panel respondents were more positive (i.e., they tended to say they were more satisfied than the non-panel respondents). Panel respondents also were more likely to support ideas to generate new revenue for road maintenance, while the probability sample was more likely to give no answer.
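A minimal Python sketch of the kind of check behind the '66% outside the confidence interval' result is below. The item labels, proportions, and sample sizes are hypothetical placeholders; only the general approach (comparing panel point estimates to the probability sample's confidence intervals) follows the presentation.

    # Sketch: flag panel estimates that fall outside the probability sample's 95% CIs.
    import math

    def proportion_ci(p_hat, n, z=1.96):
        """Normal-approximation 95% confidence interval for a proportion."""
        se = math.sqrt(p_hat * (1 - p_hat) / n)
        return p_hat - z * se, p_hat + z * se

    # (item, probability-sample proportion, probability-sample n, panel proportion)
    items = [
        ("satisfied_with_maintenance", 0.55, 900, 0.63),
        ("supports_new_revenue", 0.40, 900, 0.52),
        ("gave_no_answer", 0.20, 900, 0.10),
    ]

    outside = 0
    for name, p_prob, n_prob, p_panel in items:
        lo, hi = proportion_ci(p_prob, n_prob)
        if not (lo <= p_panel <= hi):
            outside += 1
    print(f"{outside} of {len(items)} panel estimates fall outside the probability CIs")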


Lessons learned while planning and conducting a survey of Florida residents about climate change options


Glenn Israel noted that the UF/IFAS Center for Public Issues Education uses Qualtrics opt-in samples. Qualtrics quota opt-in samples are extremely cheap (about $5 per response) and very fast (500 completes in a week to 10 days). He is comparing these types of samples to a probability address-based sample on the topic of climate change. The Qualtrics non-probability sample requested a Florida population with specific demographics and excluded 'speeders' and those who failed an attention-check question. Of the more than 8,000 panelists contacted, 856 accessed the survey and 514 usable responses were obtained. The probability address-based sample was purchased from SSI for the mail (n=1,500) and mixed-mode (n=500) survey. From this sample, 317 usable responses were collected (a 17% response rate). With regard to the results, the overall item response rate was similar, but the quota sample's responses were more complete. One reason might be that opt-in panelists operate under a norm of completing the entire survey. On the other hand, address-based respondents had a lower response rate to the open-ended comment item at the end of the survey but a much higher median number of words than the opt-in quota sample, and their comments were more substantive. Address-based respondents also were more likely to have a higher score on a true-false climate change knowledge index than the opt-in panel. Finally, the two samples generated different distributions of people along the Six Americas scale. Overall, the results were similar to what Ginny Lesser found: panels are cheap and fast with low item nonresponse, but there are serious concerns about data quality. One might speculate that panelists are less interested in the subject and more interested in getting their monetary reward for being in the panel. Consequently, there is a need to learn about the incentive structure and how panels are recruited.
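For illustration, here is a minimal Python sketch of the kind of screening applied to opt-in panel responses, dropping speeders and attention-check failures. The column names, cutoff rule, and example records are hypothetical; the study's exact screening criteria are not reproduced here.

    # Sketch: drop 'speeders' and attention-check failures from an opt-in panel file.
    import pandas as pd

    df = pd.DataFrame({
        "respondent": [1, 2, 3, 4, 5],
        "duration_seconds": [610, 95, 540, 30, 480],
        "attention_check": ["agree", "agree", "disagree", "agree", "agree"],
    })

    median_duration = df["duration_seconds"].median()
    speeder_cutoff = median_duration / 3  # a common rule of thumb, not the study's rule

    clean = df[
        (df["duration_seconds"] >= speeder_cutoff)
        & (df["attention_check"] == "agree")  # hypothetical correct answer
    ]
    print(f"Kept {len(clean)} of {len(df)} responses after screening")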


Yano Prasetyo – What do rural Missouri residents view as the assets and challenges of their communities?


Yano Prasetyo is working with Hua Qin on comparing perceptions with objective measures of community. They used Qualtrics to survey community leaders via an email sample and used factor analysis to identify important issues for Missouri residents. Respondents generally think their community doesn't need to change, but could if necessary. They love their community, but are concerned about the future. Community issues include jobs, housing, lack of choice, poverty, and drugs.


Effects of follow-up contacts on sample characteristics and substantive research findings in mail surveys: An exploratory analysis – Bunny Willits and Gene Theodori


They focused on mail surveys, which are still the dominant mode of data collection. Willits noted that multiple contacts are one of the most effective ways to increase response rates. Their research questions asked, “Do demographic characteristics predict who responds to the 1st, 2nd, or 3rd mail request to complete a survey?” and “Is the relationship between demographics and attitudes predicted by the mailing to which a person responds?” Willits noted that a number of fields (e.g., psychology and medicine) conduct research without representative samples. She raised the question, “Does representativeness really matter?” and noted that we are often interested in relationships between variables for testing social theories.


Willits reported on data from two surveys. The Pennsylvania study examined knowledge of gas drilling and found the most important predictors in wave 1 were gender, education, and income; in waves 2 and 3, the most important predictors were gender, income, and education. This suggests that the same predictors are important across waves. The Texas study showed no significant demographic differences by wave except for education, with the second wave drawing more responses from low-education respondents. Willits concluded that there is consistency across time in the relationships. She argued that if we are increasing the number of responses but not the diversity of responses, what is the point? We have to know whether we are increasing the diversity of responses, and we need better data to determine whether respondents to different waves answer non-demographic questions differently. Finally, Willits solicited committee members for data sets that record the date of each response so this study can be replicated.
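As an illustration of the wave-by-wave comparison, here is a minimal Python sketch testing whether respondent demographics differ across mailing waves. The data and column names are hypothetical; only the idea of comparing demographics by wave follows the presentation.

    # Sketch: chi-square tests of demographic composition by mailing wave.
    import pandas as pd
    from scipy.stats import chi2_contingency

    df = pd.DataFrame({
        "wave":      [1, 1, 1, 1, 2, 2, 2, 3, 3, 3],
        "gender":    ["F", "M", "F", "M", "F", "F", "M", "M", "F", "M"],
        "education": ["HS", "BA", "BA", "HS", "HS", "HS", "BA", "HS", "HS", "BA"],
    })

    for demo in ["gender", "education"]:
        table = pd.crosstab(df["wave"], df[demo])
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"{demo}: chi2={chi2:.2f}, dof={dof}, p={p:.3f}")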


A picture is worth 150 responses - Stacy DeWalt and Gladys Walter (Texas)


The research explored differences in social media use in personal and professional contexts, with data collection conducted at the Country Radio Seminar. The researchers used Qualtrics to disseminate the survey at the seminar and offered a professional social media headshot as an incentive to participate. A convenience sample of 150 people at the conference took the survey. The researchers used a poster to prominently advertise the incentive, which was a professional-quality headshot, and reported that most participants came to them rather than being directly solicited. Advantages of the incentive included its relatively low cost, the photography experience students gained, and the networking opportunity it provided. Disadvantages included the large amount of time spent editing and emailing photos, the equipment needed, and limits on survey length (it can't be too long). The findings suggest several implications: the method can be useful for collecting data at conferences and industry events, it is important to connect the theme of the survey to the incentive being offered and to ensure the incentive is usable and practical, and the survey should be connected to an interest of the conference organizers.


How do millennials check their mail? - Stacy DeWalt (Texas)


This study used iPads and Qualtrics for an intercept survey of students on campus and obtained 1,700 completes. The researchers found that women tend to check their mail more often than men, that those in the 20-23 age range tend to check mail weekly or daily, and that millennials overwhelmingly check their email on their smartphones.


Sexual orientation and gender questions in rural areas - Billy McKim (Texas) and Ashley Yopp (Georgia)


They noted that the current administration removed sexual orientation and gender identity items from a national aging survey. They led a discussion of how the inclusion or exclusion of these kinds of items might affect response rates in rural communities. Specifically, the question was asked, “How does a binary gender item perform in comparison to the item suggested by the Human Rights Campaign?”


Eye tracking: Do order and configuration matter? - Billy McKim (Texas)


McKim reported on an eye-tracking study of question format that examined items asking about importance then ability versus ability then importance. The two formats for the side-by-side double matrices were importance on the left and ability on the right versus ability on the left and importance on the right. The research subjects were college students, and recruitment is tough when your study is called 'eye-tracking'. Data collection involves recording a lot of information, which results in massive amounts of data; this creates problems with storage and analysis, even with 30 respondents. The analyses examined 1) total focus duration, 2) number of focal points, 3) duration of focus per focal point, and 4) pattern of focal points. The results suggest that a double matrix isn't necessarily less work: respondents go back and check a lot, so it isn't saving them any time. Discussion included what happens when the matrices are on separate pages versus side by side, whether a different visual design might change where people's eyes go and how much work they do on the various matrices, and whether there is a cultural component, such as social media, that is changing how people consume information on a survey.
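A minimal Python sketch of computing the four fixation metrics described above from exported fixation records follows. The column names and example values are hypothetical; real eye-tracker exports are far larger and differ by vendor.

    # Sketch: total focus duration, number of focal points, mean duration per focal
    # point (by area of interest), and the ordered pattern of focal points.
    import pandas as pd

    fixations = pd.DataFrame({
        "respondent":  [1, 1, 1, 1, 2, 2],
        "aoi":         ["importance", "ability", "importance", "ability", "ability", "importance"],
        "start_ms":    [0, 350, 900, 1400, 0, 500],
        "duration_ms": [300, 500, 400, 250, 450, 600],
    })

    metrics = fixations.groupby(["respondent", "aoi"]).agg(
        total_duration_ms=("duration_ms", "sum"),
        n_focal_points=("duration_ms", "count"),
        mean_duration_ms=("duration_ms", "mean"),
    )

    # Pattern of focal points: the sequence of AOIs visited, ordered by start time.
    patterns = (
        fixations.sort_values("start_ms")
        .groupby("respondent")["aoi"]
        .apply(list)
    )
    print(metrics)
    print(patterns)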


Plans for next year’s meeting



  • February 21 and 22 – Thursday and Friday at the Best Western

  • The meeting is conditional on renewal of WERA 1010

Accomplishments

An agenda was developed and the coordinating committee held its annual meeting in February 2018. Participating members discussed several important topics affecting error in agricultural and rural surveys, including effective survey communication and comparisons of online opt-in panels with address-based sample surveys. In addition, members reported on survey research studies being conducted or planned in their states and provided feedback to others. Members from several states discussed plans for studies, including additional comparisons of nonprobability samples in online surveys with address-based probability samples using mail and/or mixed-mode surveys in order to assess the strengths and weaknesses of these technologies, assessing the utility of different theories for inviting people to respond to a survey, assessing the order of concepts in question stems and responses for satisfaction items, and assessing the effects of follow-up contacts on sample characteristics and substantive research findings, as well as other topics.

During the year, committee members were active in publishing research in journal articles, presenting papers and posters at relevant conferences, and developing educational materials available to Extension professionals and the public. This included publishing 3 survey methods-related articles, updating 9 publications for Extension and outreach audiences, and giving 2 presentations at professional conferences whose attendees are members of the target audience for this project. The Extension publications are part of the web-based Savvy Survey Series, in which the 20 papers (incorporating WERA research) have generated over 28,000 visits. Colleagues report using these publications for workshops or professional development in Arkansas and Kentucky. In addition, the member from Florida conducted a 1-day workshop for new Extension professionals on developing quality questionnaires, implementing surveys, and analyzing survey data for approximately 40 persons each year, incorporating WERA 1010 research. Finally, a member from Washington taught a survey methods class of 35 students that makes extensive use of WERA 1010 research.

Publications

1. Dillman, D. A. 2017. The promise and challenge of pushing respondents to the Web in mixed-mode surveys. Survey Methodology, Statistics Canada, Catalogue 12-001-X, Vol. 43, No. 1. Available at: http://www.statcan.gc.ca/pub/12-001-x/2017001/article/14836-eng.pdf.

2. McMaster, H., LeardMann, C. A., Speigle, S., & Dillman, D. A. 2017. An Experimental Comparison of Web-Push vs. Paper-Only Survey Procedures for Conducting an In-Depth Health Survey of Military Spouses. BMC Medical Research Methodology. Available at: https://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-017-0337-1.

3. Newberry, III, M. G., & Israel, G. D. 2017. Comparing Two Web/Mail Mixed-Mode Contact Protocols to a Unimode Mail Survey. Field Methods, 29(3), 281-298. Prepublished June 5, 2017. doi: 10.1177/1525822X17693804.

Impact Statements

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.