Is the National Citizen Survey conducted by the National Research Center valid?

April 11, 2018

The following addresses the question of the scientific validity of the National Citizen Survey, commissioned by the City of Coronado and conducted by the National Research Center - specifically, the potential for multiple responses from the same person, or for responses from people outside the targeted sample.

National Research Center uses best research practices to ensure that survey results are unbiased and that its findings can be trusted. The central concern is the chance that the survey could be completed more than once, thereby undermining the credibility of findings and creating the appearance that the survey is not scientific.

National Research Center President and CEO Tom Miller, who earned a Ph.D. in research and evaluation methods from the University of Colorado, Boulder, appreciates the opportunity to respond:

All questions about survey methods, including the question raised about the possibility of multiple responses, boil down to a single important question: Are the results likely to be a valid representation of the opinions of the adult population of Coronado? A corollary question is: Does the survey process itself create the optimal likelihood of garnering a representative set of community opinions?

National Research Center's survey methods are based on years of research on survey data collection. Company principals have written two books published by the International City/County Management Association, have published research in scholarly and lay journals, and have tested their methods in the field for more than 20 years. The essence of the company's approach is to follow a set of practices proven to maximize the chances of accurate findings:

  • Complete coverage of all dwelling units in a jurisdiction
  • Random/systematic sampling of housing units
  • Unbiased selection of adult residents in selected housing units
  • The anonymity of responses promised whenever feasible
  • Self-addressed and stamped reply envelope
  • Unbiased questions targeted for local government use that keep the survey as short and simple as possible
  • Layout easy to follow
  • Multiple contacts to remind residents to respond and to offer replacement response materials. Importantly, contacting potential respondents multiple times encourages responses from people who may have different opinions or habits than those who would respond after only a single prompt.
  • Multiple data collection modes (mail, web) as needed
  • Signed cover letter from City Manager to evoke civic responsibility
  • Adherence to American Association of Public Opinion Research Transparency Initiative requirements to make public the key survey methods.

The biggest challenge in survey research these days is to get residents to respond. Response rates have fallen in all locations, for every purpose and by every data collection mode. The threat of multiple responses in most local government surveys is minimal because of the time required to complete the survey and the low stakes of the survey questions. Optimal survey methods for garnering the largest and broadest response include making it very easy for residents to participate and ensuring that responses will be anonymous so that answers are honest. Sometimes residents forget or are too busy to complete their survey or they misplace it, so the best survey practice is to give residents multiple opportunities to give their responses. To keep results anonymous, there is no code affixed to the questionnaire that would be required to respond to the survey and no secret identifying marks on the survey so that there is no way to link responses to individuals.

Secret codes can be used to identify residents, and residents know this. In National Research Center's experience, codes affixed to mailed surveys - even in locations not readily visible - often are removed by residents, and codes required for entry to a web or online survey reduce response rates and change responses, because answers no longer are given in anonymity.

Multiple survey responses are unlikely to have any noticeable effect on the results. In the National Research Center's research lab, the company has tested the possibility that residents would accidentally or intentionally respond more than once. By using identifying codes, the company determined that, on average, about 0.5% to 1% of respondents have appeared twice in a sample. But imagine that as many as 5% of a sample comprised duplicate respondents in Coronado - a magnitude never seen by National Research Center. That would mean that 7 or 8 residents out of 300 responding residents ignored the clear instructions to respond only once and instead responded twice. Such an example of "ballot stuffing" would have no noticeable effect on results. Below is an example of that point.

Survey question example: Imagine that the results of the question about public art found 50% of 300 respondents, or 150 respondents, felt there was "too much" public art - with a margin of error ranging from 44% to 56%.

Now assume that the eight residents who completed the survey twice, submitting 16 responses in all (about 5% of the 300), all indicated there was "too much" public art in Coronado, and that their eight duplicate responses were removed. With 142 of 292 respondents believing there was too much public art, the new figure would be 48.6% of respondents saying "too much," with a margin of error from 42.6% to 54.6%. Even with this unlikely magnitude of "double voting," there is no meaningful difference in results. A few "extra" responses do no harm and offset the decrease in response rate expected if a code were required to respond. But what about an individual or group that seeks to submit large numbers of duplicate responses to the anonymous survey? If there were a mass intention to undermine the instructions of the survey - written and signed by the City Manager - the timing and volume of scores of unsought duplicate responses would reveal a signature of untypical dimensions that would be spotted easily by researchers. Such a pattern has not appeared.
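The arithmetic above can be verified with a short calculation. This is a minimal sketch using the standard normal approximation for the 95% margin of error of a sample proportion; the exact figures differ slightly from the rounded ranges quoted in the text.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p with n respondents,
    using the normal approximation z * sqrt(p * (1 - p) / n)."""
    return z * math.sqrt(p * (1 - p) / n)

# Original scenario: 150 of 300 respondents say "too much" public art.
p_before = 150 / 300
moe_before = margin_of_error(p_before, 300)   # roughly 0.057 (±5.7 points)

# After removing the 8 duplicate responses: 142 of 292.
p_after = 142 / 292
moe_after = margin_of_error(p_after, 292)     # also roughly 0.057

print(f"Before: {p_before:.1%} ± {moe_before:.1%}")
print(f"After:  {p_after:.1%} ± {moe_after:.1%}")
```

The point of the example holds: the proportion shifts by less than 1.5 percentage points, well inside either margin of error.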

Below are excerpts from two articles that support National Research Center's point that giving potential survey respondents multiple opportunities to respond is survey research best practice.

1. In: "Survey Completion Rates and Resource Use at Each Step of a Dillman-Style Multi-Modal Survey" by Andrea Hassol, et al. Abt Associates Incorporated, article submitted for publication to "Public Opinion Quarterly":

"In designing data collection strategies, survey researchers must weigh available resources against expected returns…

Considerable research has been conducted on methods for improving response rates to surveys. Several key factors are known to affect response rates, including salience of the survey (Heberlein and Baumgartner, 1978), form of mailing and monetary incentives (Dillman, 1991), and multiple contacts (Linsky, 1975; Dillman, 1991). A response-maximizing approach to multi-modal surveys, as best articulated by Dillman (1978), includes:

  • A respondent-friendly questionnaire
  • Up to five contacts with recipients of the survey: a brief pre-notice letter sent a few days prior to the arrival of the questionnaire; a questionnaire mailing that includes a detailed cover letter; a thank-you/reminder postcard; a replacement questionnaire; and a final contact, possibly by telephone.
  • Inclusion of stamped return envelopes
  • Personalized correspondence
  • A token financial incentive

2. In: "The Effects of Multi-Wave Mailings on the External Validity of Mail Surveys" by Michael Dalecki, et al. Journal of the Community Development Society. 19(1) June 2010. University of Delaware:

Abstract: Survey data, particularly mail questionnaires, are very useful in community development work. With relatively low cost, a practitioner can obtain valid information to determine community needs, support for programs and general attitudes and opinions of local citizens. Low response rates, however, can have serious effects on the validity of the data. Previous research has shown that follow-up mailings are essential to obtaining a high response rate to mail surveys. This paper examines the potential for sample bias if the number of mailings is reduced. Differences between groups responding to three waves of mailings to a statewide Pennsylvania survey (N = 9,957) are examined via log-linear techniques, using continuation ratio models. The results indicate that initial respondents differ from laggard respondents on five demographic characteristics, but differences diminish between early laggard and late laggard respondents. The implication is that single mailings of questionnaires could cause serious threats to the validity of the data. Multiple mailings and other methods to maximize response rates are necessary to improve the quality of survey data for community development work.
