
28 - Standardization and meaning in the survey of linguistically diversified populations: insights from the ethnographic observation of linguistic minorities in 2010 Census interviews

Published online by Cambridge University Press: 05 September 2014

Yuling Pan, US Census Bureau
Stephen Lubkemann, US Census Bureau
Roger Tourangeau, Westat Research Organisation, Maryland
Brad Edwards, Westat Research Organisation, Maryland
Timothy P. Johnson, University of Illinois, Chicago
Kirk M. Wolter, University of Chicago
Nancy Bates, US Census Bureau

Summary

Introduction

Long a destination for millions of immigrants, the United States has included significant groups of non-English speakers in its population throughout most of its history. It therefore showcases many of the challenges that linguistic and cultural differences increasingly pose to survey methodologists in a globalizing world. This chapter draws upon an empirical field study of 2010 US Census Nonresponse Follow-up (NRFU) interviews conducted with respondents with limited (or no) English proficiency (LEPs) in seven immigrant communities. The goal of the study was to examine how linguistic and cultural differences affected access to LEP respondents and the quality of the responses they provided.

In the US context, non-English speakers are likely to be harder to reach and enumerate than their English-speaking counterparts for several reasons. Most obviously, language barriers impose hurdles to effective communication and thus to access. Some LEPs are recent arrivals who may be less familiar with, or less comfortable with, survey practices than English-speaking respondents. Others are reluctant to interact outside their native community because they have tenuous legal status or distrust government authorities. It is therefore reasonable to assume that limited English proficiency makes respondents harder to reach for surveys in the US.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2014


