
References

Published online by Cambridge University Press:  05 August 2012

Mick P. Couper
Affiliation:
University of Michigan, Ann Arbor

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2008



Anderson, A. E., Murphy, E. D., Nichols, E. M., Sigman, R. S., and Willimack, D. K. (2004), “Designing Edits for Electronic Economic Surveys and Censuses: Issues and Guidelines.” Proceedings of the Joint Statistical Meetings of the American Statistical Association. Alexandria, VA, pp. 4912–4919.
Athale, N., Sturley, A., Skoczen, S., Kavanaugh, A., and Lenert, L. (2004), “A Web-Compatible Instrument for Measuring Self-Reported Disease Activity in Arthritis.” Journal of Rheumatology, 31 (2): 223–228.
Baker, R. P., Bradburn, N. M., and Johnson, R. A. (1995), “Computer-Assisted Personal Interviewing: An Experimental Evaluation of Data Quality and Costs.” Journal of Official Statistics, 11 (4): 415–431.
Baker, R. P., and Couper, M. P. (2007), “The Impact of Screen Size and Background Color on Response in Web Surveys.” Paper presented at the General Online Research Conference (GOR'07), Leipzig, March.
Baker, R. P., Couper, M. P., Conrad, F. G., and Tourangeau, R. (2004), “Don't Know and No Opinion Responses in Web Surveys.” Paper presented at the RC33 International Conference on Social Science Methodology, Amsterdam, August.
Baker, R. P., Crawford, S. D., and Swinehart, J. (2004), “Development and Testing of Web Questionnaires.” In Presser, S., Rothgeb, J., Couper, M. P., Lessler, J., Martin, E. A., Martin, J., and Singer, E. (eds.), Methods for Testing and Evaluating Survey Questionnaires. New York: Wiley, pp. 361–384.
Baker-Prewitt, J. (2003), “All Web Surveys are not Created Equal: Your Design Choices Can Impact Results.” Presentation at the SumIT03 Global Market Research Symposium, Montreal, October. www.burke.com.
Bakken, D., and Frazier, C. L. (2006), “Conjoint Analysis: Understanding Consumer Decision Making.” In Grover, R. and Vriens, M. (eds.), The Handbook of Marketing Research. Thousand Oaks, CA: Sage, pp. 288–311.
Bälter, O., and Bälter, K. A. (2005), “Demands on Web Survey Tools for Epidemiological Research.” European Journal of Epidemiology, 20: 137–139.
Bartlett, J. (1919), Familiar Quotations (10th ed.). Boston: Little, Brown.
Bartram, D. (1982), “The Perception of Semantic Quality in Type: Differences Between Designers and Non-Designers.” Information Design Journal, 3 (1): 38–50.
Bates, N., and Nichols, E. (1998), “The Census Bureau WWW Hiring Questionnaire: A Case Study of Usability Testing.” Paper presented at the Joint Statistical Meetings of the American Statistical Association, Dallas, TX, August.
Bayer, L. R., and Thomas, R. K. (2004), “A Comparison of Sliding Scales with Other Scale Types in Online Surveys.” Paper presented at the RC33 International Conference on Social Science Methodology, Amsterdam, August.
Bell, D. S., Mangione, C. M., and Kahn, C. E. (2001), “Randomized Testing of Alternative Survey Formats Using Anonymous Volunteers on the World Wide Web.” Journal of the American Medical Informatics Association, 8 (6): 616–620.
Bentley, M., and Tancreto, J. G. (2006), “Analysis of Self-Response Options and Respondent-Friendly Design from the 2005 National Census Test.” Proceedings of the American Statistical Association, Survey Research Methods Section. Alexandria, VA: ASA, pp. 2755–2762 [CD].
Benway, J. P. (1998), “Banner Blindness: The Irony of Attention Grabbing on the World Wide Web.” Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 463–467.
Benway, J. P., and Lane, D. M. (1998), “Banner Blindness: Web Searchers Often Miss ‘Obvious’ Links.” ITG Newsletter, 1 (3). http://www.internettg.org/newsletter/dec98/banner_blindness.html.
Bergman, L. R., Kristiansson, K.-E., Olofsson, A., and Säfström, M. (1994), “Decentralized CATI versus Paper and Pencil Interviewing: Effects on the Results in the Swedish Labor Force Surveys.” Journal of Official Statistics, 10 (2): 181–195.
Bernard, M., and Mills, M. (2000), “So, What Size and Type of Font Should I Use on My Website?” Usability News, 2 (2). http://psychology.wichita.edu/surl/usabilitynews/2S/font.htm.
Bernard, M., Mills, M., Peterson, M., and Storrer, K. (2001), “A Comparison of Popular Online Fonts: Which Is Best and When?” Usability News, 3 (2). http://psychology.wichita.edu/surl/usabilitynews/3S/font.htm. Accessed March 14, 2008.
Billiet, J., and Loosveldt, G. (1988), “Improvement of the Quality of Responses to Factual Survey Questions by Interviewer Training.” Public Opinion Quarterly, 52 (2): 190–211.
Birnbaum, M. H. (ed.) (2000a), Psychological Experiments on the Internet. San Diego: Academic Press.
Birnbaum, M. H. (2000b), “SurveyWiz and FactorWiz: JavaScript Web Pages That Make HTML Forms for Research on the Internet.” Behavior Research Methods, Instruments, and Computers, 32 (2): 339–346.
Birnbaum, M. H. (2001), Introduction to Behavioral Research on the Internet. Upper Saddle River, NJ: Prentice-Hall.
Birnholtz, J. P., Horn, D. B., Finholt, T. A., and Bae, S. J. (2004), “The Effects of Cash, Electronic, and Paper Gift Certificates as Respondent Incentives for a Web-based Survey of Technologically Sophisticated Respondents.” Social Science Computer Review, 22 (3): 355–362.
Bischoping, K., and Schuman, H. (1992), “Pens and Polls in Nicaragua: An Analysis of the 1990 Preelection Surveys.” American Journal of Political Science, 36 (2): 331–350.
Blackwell, A. F. (2001), “Pictorial Representation and Metaphor in Visual Language Design.” Journal of Visual Languages and Computing, 12: 223–252.
Bogen, K. (1996), “The Effect of Questionnaire Length on Response Rates – a Review of the Literature.” Proceedings of the American Statistical Association, Survey Research Methods Section. Alexandria, VA: American Statistical Association, pp. 1020–1025.
Bosnjak, M., Neubarth, W., Couper, M. P., Bandilla, W., and Kaczmirek, L. (in press), “Prenotification in Web Surveys: The Influence of Mobile Text Messaging versus E-Mail on Response Rates and Sample Composition.” Social Science Computer Review.
Bosnjak, M., and Tuten, T. L. (2001), “Classifying Response Behaviors in Web-Based Surveys.” Journal of Computer-Mediated Communication, 6 (3), http://jcmc.indiana.edu/vol6/issue3/boznjak.html.
Bosnjak, M., and Tuten, T. L. (2002), “Prepaid and Promised Incentives in Web Surveys – An Experiment.” Social Science Computer Review, 21 (2): 208–217.
Bosnjak, M., and Wenzel, O. (2005), “Effects of Two Innovative Techniques to Apply Incentives in Online Access Panels.” Paper presented at the German Online Research Conference, Zurich, March.
Bowker, D., and Dillman, D. A. (2000), “An Experimental Evaluation of Left and Right Oriented Screens for Web Questionnaires.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Portland, OR, May.
Boyarski, D., Neuwirth, C., Forlizzi, J., and Regli, S. H. (1998), “A Study of Fonts Designed for Screen Display.” Proceedings of CHI 98. New York: ACM, pp. 87–94.
Bradburn, N. M. (1978), “Respondent Burden.” Proceedings of the American Statistical Association, Survey Research Methods Section. Alexandria, VA: American Statistical Association, pp. 35–40.
Brenner, M. (1982), “Response-Effects of ‘Role-Restricted’ Characteristics of the Interviewer.” In Dijkstra, W. and Zouwen, J. (eds.), Response Behaviour in the Survey-Interview. London: Academic Press, pp. 131–165.
Brinck, T., Gergle, D., and Wood, S. D. (2002), Usability for the Web: Designing Web Sites that Work. San Francisco: Morgan Kaufmann.
Brophy, S., Hunniford, T., Taylor, G., Menon, A., Roussou, T., and Callin, A. (2004), “Assessment of Disease Severity (in Terms of Function) Using the Internet.” Journal of Rheumatology, 31 (9): 1819–1822.
Brosius, H.-B., Donsbach, W., and Birk, M. (1996), “How Do Text-Picture Relations Affect the Informational Effectiveness of Television Newscasts?” Journal of Broadcasting & Electronic Media, 40: 180–195.
Brown, C. M. (1988), Human-Computer Interface Design Guidelines. Norwood, NJ: Ablex.
Burnside, R. (2000), “Towards Best Practice for Design of Electronic Data Capture Instruments (Methodology Advisory Committee).” Belconnen, ACT: Australian Bureau of Statistics, Research Paper 1352.0.55.036.
Burris, J., Chen, J., Graf, I., Johnson, T., and Owens, L. (2001), “An Experiment in Web Survey Design.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Montreal, Quebec, May.
Carney, R. N., and Levin, J. R. (2002), “Pictorial Illustrations Still Improve Students' Learning From Text.” Educational Psychology Review, 14 (1): 5–26.
Castro, E. (2003), HTML for the World Wide Web, Fifth Edition, with XHTML and CSS. Berkeley, CA: Peachpit Press.
Chesney, T. (2006), “The Effect of Communication Medium on Research Participation Decisions.” Journal of Computer-Mediated Communication, 11 (3): article 10. http://jcmc.indiana.edu/vol11/issue3/chesney.html.
Childers, T. L., and Jass, J. (2002), “All Dressed Up with Something to Say: Effects of Typeface Semantic Associations on Brand Perceptions and Consumer Memory.” Journal of Consumer Psychology, 12 (2): 93–106.
Christian, L. M. (2003), “The Influence of Visual Layout on Scalar Questions in Web Surveys.” Unpublished Master's Thesis. Pullman: Washington State University.
Christian, L. M., and Dillman, D. A. (2004), “The Influence of Graphical and Symbolic Language Manipulations on Responses to Self-Administered Questions.” Public Opinion Quarterly, 68 (1): 57–80.
Christian, L. M., Dillman, D. A., and Smyth, J. D. (2007), “Helping Respondents Get It Right the First Time: The Influence of Words, Symbols, and Graphics in Web Surveys.” Public Opinion Quarterly, 71 (1): 113–125.
Church, A. H. (1993), “Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis.” Public Opinion Quarterly, 57 (1): 62–79.
Clark, R. L., and Nyiri, Z. (2001), “Web Survey Design: Comparing a Multi-Screen to a Single Screen Survey.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Montreal, Quebec, May.
Clayton, R. L., and Werking, G. S. (1998), “Business Surveys of the Future: The World Wide Web as a Data Collection Methodology.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 543–562.
Cockburn, A., McKenzie, B., and JasonSmith, M. (2002), “Pushing Back: Evaluating a New Behaviour for the Back and Forward Buttons in Web Browsers.” International Journal of Human-Computer Studies, 57: 397–414.
Coles, P., and Foster, J. J. (1975), “Typographic Cues as an Aid to Learning from Typewritten Text.” Programmed Learning and Educational Technology, 12: 102–108.
Conrad, F. G., Couper, M. P., Tourangeau, R., and Galesic, M. (2005), “Interactive Feedback Can Improve Quality of Responses in Web Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Miami Beach, May.
Conrad, F. G., Couper, M. P., Tourangeau, R., and Peytchev, A. (2005), “Impact of Progress Indicators on Task Completion: First Impressions Matter.” Proceedings of CHI (Computer Human Interaction) '05, Portland, OR. New York: Association for Computing Machinery.
Conrad, F. G., Couper, M. P., Tourangeau, R., and Peytchev, A. (2006), “Use and Non-Use of Clarification Features in Web Surveys.” Journal of Official Statistics, 22 (2): 245–269.
Conrad, F. G., and Schober, M. F. (2000), “Clarifying Question Meaning in a Household Telephone Survey.” Public Opinion Quarterly, 64 (1): 1–28.
Conrad, F. G., and Schober, M. F. (2005), “Promoting Uniform Question Understanding in Today's and Tomorrow's Surveys.” Journal of Official Statistics, 21 (2): 215–231.
Conrad, F. G., Schober, M. F., and Coiner, T. (2007), “Bringing Features of Human Dialogue to Web Surveys.” Applied Cognitive Psychology, 21: 165–187.
Cook, C., Heath, F., and Thompson, R. L. (2001), “Score Reliability in Web- or Internet-Based Surveys: Unnumbered Graphic Rating Scales Versus Likert-Type Scales.” Educational and Psychological Measurement, 61 (4): 697–706.
Coon, D. (2002), “Challenges to Creating Accessible Web and Internet Surveys.” Paper presented at the Federal CASIC Workshops, Washington, DC, February.
Cooper, A. (1995), About Face: The Essentials of User Interface Design. Foster City, CA: IDG Books.
Couper, M. P. (1994), “Discussion: What Can CAI Learn from HCI?” In Proceedings of the Seminar on New Directions in Statistical Methodology. Washington, DC: Statistical Policy Office, Office of Management and Budget (Statistical Policy Working Paper No. 23), pp. 363–377. http://www.fcsm.gov/working-papers/spwp23index.html.
Couper, M. P. (1999), “The Application of Cognitive Science to Computer Assisted Interviewing.” In Sirken, M. G., Hermann, D. J., Schechter, S., Schwarz, N., Tanur, J. M., and Tourangeau, R. (eds.), Cognition and Survey Research. New York: Wiley, pp. 277–300.
Couper, M. P. (2000), “Web Surveys: A Review of Issues and Approaches.” Public Opinion Quarterly, 64 (4): 464–494.
Couper, M. P. (2005), “Technology Trends in Survey Data Collection.” Social Science Computer Review, 23 (4): 486–501.
Couper, M. P. (2007a), “Whither the Web: Web 2.0 and the Changing World of Web Surveys.” In Trotman, M., et al. (eds.), The Challenges of a Changing World: Proceedings of the Fifth International Conference of the Association for Survey Computing. Berkeley, UK: ASC, pp. 7–16.
Couper, M. P. (2007b), “Technology and the Survey Interview/Questionnaire.” In Schober, M. F. and Conrad, F. G. (eds.), Envisioning the Survey Interview of the Future. New York: Wiley, pp. 58–76.
Couper, M. P., Blair, J., and Triplett, T. (1999), “A Comparison of Mail and E-Mail for a Survey of Employees in Federal Statistical Agencies.” Journal of Official Statistics, 15 (1): 39–56.
Couper, M. P., Conrad, F. G., and Tourangeau, R. (2003), “The Effect of Images on Web Survey Responses.” In Banks, R. et al. (eds.), Survey and Statistical Computing IV: The Impact of Technology on the Survey Process, pp. 343–350.
Couper, M. P., Conrad, F. G., and Tourangeau, R. (2007), “Visual Context Effects in Web Surveys.” Public Opinion Quarterly, 71 (4): 623–634.
Couper, M. P., Hansen, S. E., and Sadosky, S. A. (1997), “Evaluating Interviewer Performance in a CAPI Survey.” In Lyberg, L., Biemer, P., Collins, M., Leeuw, E., Dippo, C., Schwarz, N., and Trewin, D. (eds.), Survey Measurement and Process Quality. New York: Wiley, pp. 267–285.
Couper, M. P., Horm, J., and Schlegel, J. (1997), “Using Trace Files to Evaluate the National Health Interview Survey CAPI Instrument.” Proceedings of the Section on Survey Research Methods, American Statistical Association. Alexandria: ASA, pp. 825–829.
Couper, M. P., Kapteyn, A., Schonlau, M., and Winter, J. (2007), “Noncoverage and Nonresponse in an Internet Survey.” Social Science Research, 36 (1): 131–148.
Couper, M. P., Kenyon, K., and Tourangeau, R. (2004), “Picture This! An Analysis of Visual Effects in Web Surveys.” Public Opinion Quarterly, 68 (2): 255–266.
Couper, M. P., and Lyberg, L. E. (2005), “The Use of Paradata in Survey Research.” Paper presented at the International Statistical Institute, Sydney, Australia, April.
Couper, M. P., and Nicholls, W. L., II (1998), “The History and Development of Computer Assisted Survey Information Collection.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 1–21.
Couper, M. P., Peytchev, A., Little, R. J. A., and Rothert, K. (2005), “Combining Information from Multiple Modes to Evaluate and Reduce Nonresponse Bias.” Alexandria, VA: Proceedings of the Survey Research Methods Section, American Statistical Association [CD].
Couper, M. P., Peytchev, A., Strecher, V. J., Rothert, K., and Anderson, J. (2007), “Following Up Nonrespondents to an Online Weight Management Intervention: Randomized Trial Comparing Mail versus Telephone.” Journal of Medical Internet Research, 9 (2): e16.
Couper, M. P., Singer, E., and Tourangeau, R. (2004), “Does Voice Matter? An Interactive Voice Response (IVR) Experiment.” Journal of Official Statistics, 20 (3): 551–570.
Couper, M. P., Singer, E., Tourangeau, R., and Conrad, F. G. (2006), “Evaluating the Effectiveness of Visual Analog Scales: A Web Experiment.” Social Science Computer Review, 24 (2): 227–245.
Couper, M. P., and Tourangeau, R. (2006), “Taking the Audio out of Audio-CASI?” Paper presented at the European Conference on Quality in Survey Statistics, Cardiff, Wales, April.
Couper, M. P., Tourangeau, R., and Conrad, F. G. (2007), “How the Shape and Format of Input Fields Affect Answers.” Paper presented at the Internet Survey Methodology workshop, Lillehammer, Norway, September.
Couper, M. P., Tourangeau, R., Conrad, F. G., and Crawford, S. D. (2004), “What They See Is What We Get: Response Options for Web Surveys.” Social Science Computer Review, 22 (1): 111–127.
Couper, M. P., Traugott, M., and Lamias, M. (2001), “Web Survey Design and Administration.” Public Opinion Quarterly, 65 (2): 230–253.
Crawford, S. D. (1999), “The Web Survey Choice in a Mixed Mode Data Collection.” Unpublished Paper. Ann Arbor: University of Michigan.
Crawford, S. D., Couper, M. P., and Lamias, M. (2001), “Web Surveys: Perceptions of Burden.” Social Science Computer Review, 19 (2): 146–162.
Crawford, S. D., McCabe, S. E., and Pope, D. (2003), “Applying Web-Based Survey Design Standards.” Journal of Prevention and Intervention in the Community, 29 (1/2): 43–66.
Crawford, S. D., McCabe, S. E., Saltz, B., Boyd, C. J., Freisthler, B., and Paschall, M. J. (2004), “Gaining Respondent Cooperation in College Web-Based Alcohol Surveys: Findings from Experiments at Two Universities.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Phoenix, AZ, May.
Curry, J. (2003), “Complementary Capabilities for Conjoint, Choice, and Perceptual Mapping Web Data Collection.” Paper presented at the Tenth Sawtooth Software Conference, April.
Csikszentmihalyi, M. (1990), Flow: The Psychology of Optimal Experience. New York: Harper & Row.
Dahan, E., and Hauser, J. R. (2002), “The Virtual Customer.” Journal of Product Innovation Management, 19: 332–353.
Dahan, E., and Srinivasan, V. (2000), “The Predictive Power of Internet-Based Product Concept Testing Using Visual Depiction and Animation.” Journal of Product Innovation Management, 17: 99–109.
David, P. (1998), “News Concreteness and Visual-Verbal Association: Do News Pictures Narrow the Recall Gap Between Concrete and Abstract News?” Human Communication Research, 25 (2): 180–201.
de Leeuw, E. D. (2001), “Reducing Missing Data in Surveys: An Overview of Methods.” Quality & Quantity, 35: 147–160.
de Leeuw, E. D. (2005), “To Mix or Not to Mix Data Collection Modes in Surveys.” Journal of Official Statistics, 21 (2): 233–255.
de Leeuw, E. D., Callegaro, M., Hox, J. J., Korendijk, E., and Lensvelt-Mulders, G. (2007), “The Influence of Advance Letters on Response in Telephone Surveys: A Meta-Analysis.” Public Opinion Quarterly, 71 (3): 413–444.
DeMay, C. C., Kurlander, J. L., Lundby, K. M., and Fenlason, K. J. (2002), “Web Survey Comments: Does Length Impact ‘Quality’?” Paper presented at the International Conference on Questionnaire Development, Evaluation and Testing Method, Charleston, SC, November.
Dennis, M., deRouvray, C., and Couper, M. P. (2000), “Questionnaire Design for Probability-Based Web Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Portland, OR, May.
DeRouvray, C., and Couper, M. P. (2002), “Designing a Strategy for Capturing ‘Respondent Uncertainty’ in Web-Based Surveys.” Social Science Computer Review, 20 (1): 3–9.
Deutskens, E., Ruyter, K., Wetzels, M., and Oosterveld, P. (2004), “Response Rate and Response Quality of Internet-Based Surveys: An Experimental Study.” Marketing Letters, 15 (1): 21–36.
DeVoto, J. A. E. (1998), “Seven Mortal Sins of Professional Web Designers.” http://www.jaedworks.com/shoebox/no-cookie/mortal-sins.html.
Dillman, D. A. (1978), Mail and Telephone Surveys: The Total Design Method. New York: Wiley.
Dillman, D. A. (2000), Mail and Internet Surveys: The Tailored Design Method. New York: Wiley.
Dillman, D. A., and Christian, L. M. (2005), “Survey Mode as a Source of Instability in Responses across Surveys.” Field Methods, 17 (1): 30–52.
Dillman, D. A., Redline, C. D., and Carley-Baxter, L. R. (1999), “Influence of Type of Question on Skip Pattern Compliance in Self-Administered Questionnaires.” Proceedings of the American Statistical Association, Survey Research Methods Section. Alexandria, VA: American Statistical Association, pp. 979–984.
Dillman, D. A., and Smyth, J. D. (2007), “Design Effects in the Transition to Web-Based Surveys.” American Journal of Preventive Medicine, 32 (5S): S90–S96.
Dillman, D. A., Smyth, J. D., Christian, L. M., and Stern, M. J. (2003), “Multiple Answer Questions in Self-Administered Surveys: the Use of Check-All-That-Apply and Forced-Choice Question Formats.” Paper presented at the Joint Statistical Meetings of the American Statistical Association, San Francisco, August.
Dondis, D. A. (1973), A Primer of Visual Literacy. Cambridge, MA: MIT Press.
Downes-Le Guin, T., Mechling, J., and Baker, R. P. (2006), “Great Results from Ambiguous Sources.” Paper presented at the ESOMAR Conference on Panel Research '06, Barcelona, Spain.
Duchastel, P. C. (1978), “Illustrating Instructional Texts.” Educational Technology, 18 (11): 36–39.
Duchastel, P. C. (1980), “Research on Illustrations in Text: Issues and Perspectives.” Educational Communication and Technology Journal, 28 (4): 283–287.
Duchastel, P. C., and Waller, R. (1979), “Pictorial Illustration in Instructional Texts.” Educational Technology, 19 (11): 20–25.
Dyson, M. C. (2004), “How Physical Text Layout Affects Reading from Screen.” Behaviour and Information Technology, 23 (6): 377–393.
Edwards, P., Cooper, R., Roberts, I., and Frost, C. (2005), “Meta-Analysis of Randomised Trials of Monetary Incentives and Response to Mailed Questionnaires.” Journal of Epidemiology and Community Health, 59 (11): 987–999.
Ekman, A., Dickman, P. W., Klint, A., Weiderpass, E., and Litton, J.-E. (2006), “Feasibility of Using Web-Based Questionnaires in Large Population-Based Epidemiological Studies.” European Journal of Epidemiology, 21: 103–111.
Elig, T., and Waller, V. (2001), “Internet versus Paper Survey Administration: Impact on Qualitative Responses.” Unpublished Paper. Arlington, VA: Defense Manpower Data Center.
ESOMAR (2005), ESOMAR Guideline on Conducting Market and Opinion Research Using the Internet, updated August 2005. http://www.esomar.org.
Etter, J.-F., Cucherat, M., and Perneger, T. V. (2002), “Questionnaire Color and Response Rates to Mailed Surveys: A Randomized Trial and a Meta-Analysis.” Evaluation and the Health Professions, 25 (2): 185–199.
Fagerlin, A., Wang, C., and Ubel, P. A. (2005), “Reducing the Influence of Anecdotal Reasoning on People's Health Care Decisions: Is a Picture Worth a Thousand Statistics?” Medical Decision Making, 25 (4): 398–405.
Farmer, T. (2000), “Using the Internet for Primary Research Data Collection.” InfoTek Research Group, Inc. http://www.researchinfo.com/library/infotek/index.shtml.
Faubert, J. (1994), “Seeing Depth in Color – More than Just what Meets the Eyes.” Vision Research, 34 (9): 1165–1186.
Fitts, P. M. (1954), “The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement.” Journal of Experimental Psychology, 47: 381–391.
Forsman, G., and Varedian, M. (2002), “Mail and Web Surveys: A Cost and Response Rate Comparison in a Study of Students Housing Conditions.” Paper presented at the International Conference on Improving Surveys, Copenhagen, August.
Fox, J. E., Mockovak, W., Fisher, S., and Rho, C. (2003), “Usability Issues Associated with Converting Establishment Surveys to Web-Based Data Collection.” Paper presented at the FCSM Conference, Arlington, VA, November.
Fox, S. (2005), Digital Divisions. Washington, D.C.: Pew Internet and American Life Project. http://www.pewinternet.org.
Freyd, M. (1923), “The Graphic Rating Scale.” Journal of Educational Psychology, 14: 83–102.
Fricker, R. D., and Schonlau, M. (2002), “Advantages and Disadvantages of Internet Research Surveys: Evidence from the Literature.” Field Methods, 14 (4): 347–365.
Fuchs, M. (2007), “Asking for Numbers and Quantities: Visual Design Effects in Web Surveys and Paper & Pencil Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Anaheim, CA, May.
Fuchs, M., and Couper, M. P. (2001), “Length of Input Field and the Responses Provided in a Self-Administered Survey: A Comparison of Paper and Pencil and a Web Survey.” Paper presented at the International Conference on Methodology and Statistics, Ljubljana, Slovenia, September.
Funke, F. (2005), “Visual Analogue Scales in Online Surveys.” Paper presented at the General Online Research (GOR '05) conference, Zurich, March.
Funke, F., and Reips, U.-D. (2007), “Dynamic Forms: Online Surveys 2.0.” Paper presented at the General Online Research Conference (GOR'07), Leipzig, March.
Gadeib, A., and Kunath, J. (2006), “Virtual Research Worlds – Simulate the Difference! Efficient Concept Testing within Virtual Market Simulations Online.” Paper presented at the General Online Research Conference (GOR '06), Bielefeld, Germany, March.
Galesic, M. (2006), “Dropouts on the Web: Effects of Interest and Burden Experienced During an Online Survey.” Journal of Official Statistics, 22 (2): 313–328.
Galesic, M., Tourangeau, R., Couper, M. P., and Conrad, F. G. (in press), “Eye-Tracking Data: New Insights on Response Order Effects and Other Signs of Cognitive Shortcuts in Survey Responding.” Public Opinion Quarterly.
Galesic, M., Tourangeau, R., Couper, M. P., and Conrad, F. G. (2007), “Using Change to Improve Navigation in Grid Questions.” Paper presented at the General Online Research Conference (GOR'07), Leipzig, March.
Galitz, W. O. (1993), User-Interface Screen Design. Boston: QED.
Garrett, J. J. (2005), “Ajax: A New Approach to Web Applications.” http://www.adaptivepath.com/publications/essays/archives/00385.php.
Gaskell, G. D., O'Muircheartaigh, C. A., and Wright, D. B. (1994), “Survey Questions about the Frequency of Vaguely Defined Events: the Effects of Response Alternatives.” Public Opinion Quarterly, 58 (2): 241–254.
Gillbride, T. J., and Allenby, G. M. (2004), “A Choice Model with Conjunctive, Disjunctive, and Compensatory Screening Rules.” Marketing Science, 23 (3): 391–406.
Giner-Sorolla, R., Garcia, M. T., and Bargh, J. A. (1999), “The Automatic Evaluation of Pictures.” Social Cognition, 17 (1): 76–96.
Glauer, R., and Schneider, D. (2004), “Online-Surveys: Effects of Different Display Formats, Response Orders as Well as Progress Indicators in a Non-Experimental Environment.” Paper presented at the 6th German Online Research Conference, Duisburg-Essen, Germany, March.
Godar, S. H. (2000), “Use of Color and Responses to Computer-Based Surveys.” Perceptual and Motor Skills, 91: 767–770.
Göritz, A. S. (2004), “The Impact of Material Incentives on Response Quantity, Response Quality, Sample Composition, Survey Outcome, and Cost in Online Access Panels.” International Journal of Market Research, 46 (3): 327–345.
Göritz, A. S. (2005), “Incentives in Web-Based Studies: What to Consider and How to Decide.” WebSM Guide No. 2. www.websm.org.
Göritz, A. S. (2006), “Incentives in Web Studies: Methodological Issues and a Review.” International Journal of Internet Science, 1 (1): 58–70.
Gorn, G. J., Chattopadhyay, A., Sengupta, J., and Tripathi, S. (2004), “Waiting for the Web: How Screen Color Affects Time Perception.” Journal of Marketing Research, XLI (May): 215–225.
Gorn, G. J., Chattopadhyay, A., Yi, T., and Dahl, D. W. (1997), “Effects of Color as an Executional Cue in Advertising: They're in the Shade.” Management Science, 43 (10): 1387–1400.
Graber, D. A. (1996), “Say it with Pictures.” Annals of the American Academy of Political and Social Science, 546: 85–96.
Grabinger, R. S., and Osman-Jouchoux, R. (1996), “Designing Screens for Learning.” In Oostendorp, H. and Mul, S. (eds.), Cognitive Aspects of Electronic Text Processing. Norwood, NJ: Ablex, pp. 181–212.
Gräf, L. (2002), “Optimierung von WWW-Umfragen: Three Years After.” Paper presented at the German Online Research conference, Göttingen, May.
Gräf, L. (2005), “Befragung mit neuer Kommunikationstechnik: Online-Umfragen.” Unpublished Paper. Köln, Germany: GlobalPark.
Grandjean, E. (1987), Ergonomics in Computerized Offices. New York: Taylor & Francis.
Green, P. E., and Rao, V. (1971), “Conjoint measurement: A New Approach to Quantify Judgmental Data.” Journal of Marketing Research, 8 (3): 355–363.
Grice, P. (1967), “Utterer's Meaning and Intentions.” In Grice, P. (ed.), Studies in the Way of Words. Cambridge, MA: Harvard University Press, 1989, pp. 86–116.
Groves, R. M. (1989), Survey Errors and Survey Costs. New York: Wiley.
Groves, R. M. (2006), “Nonresponse Rates and Nonresponse Error in Household Surveys.” Public Opinion Quarterly, 70 (5): 646–675.
Groves, R. M., and Couper, M. P. (1998), Nonresponse in Household Interview Surveys. New York: Wiley.
Guéguen, N., and Jacob, C. (2002a), “Social Presence Reinforcement and Computer-Mediated Communication: The Effect of the Solicitor's Photograph on Compliance to a Survey Request Made by E-Mail.” CyberPsychology and Behavior, 5 (2): 139–142.
Guéguen, N., and Jacob, C. (2002b), “Solicitations by E-Mail and Solicitor's Status: A Field Study of Social Influence on the Web.” CyberPsychology and Behavior, 5 (4): 377–383.
Hagenaars, J. A., and Heinen, T. G. (1982), “Effects of Role-Independent Interviewer Characteristics on Responses.” In Dijkstra, W. and Zouwen, J. (eds.), Response Behaviour in the Survey-Interview. London: Academic Press, pp. 91–130.
Hall, R. H., and Hanna, P. (2004), “The Impact of Web Page Text-Background Colour Combinations on Readability, Retention, Aesthetics, and Behavioural Intent.” Behaviour and Information Technology, 23 (3): 183–195.
Hansen, S. E., and Couper, M. P. (2004), “Usability Testing as a Means of Evaluating Computer Assisted Survey Instruments.” In Presser, S., Rothgeb, J., Couper, M. P., Lessler, J., Martin, E. A., Martin, J., and Singer, E. (eds.), Methods for Testing and Evaluating Survey Questionnaires. New York: Wiley, pp. 337–360.
Hansen, S. E., Couper, M. P., and Fuchs, M. (1998), “Usability Evaluation of the NHIS Instrument.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, St. Louis, May.
Haraldsen, G. (2004), “Identifying and Reducing Response Burdens in Internet Business Surveys.” Journal of Official Statistics, 20 (2): 393–410.
Haraldsen, G., Dale, T., Dalheim, E., and Strømme, H. (2002), “Mode Effects in a Mail plus Internet Designed Census.” Paper presented at the International Conference on Improving Surveys, Copenhagen, August.
Haraldsen, G., Kleven, Ø., and Stålnacke, M. (2006), “Paradata Indications of Problems in Web Surveys.” Paper presented at the European Conference on Quality in Survey Statistics, Cardiff, Wales, April.
Harmon, M. A., Westin, E. C., and Levin, K. Y. (2005), “Does Type of Pre-Notification Affect Web Survey Response Rates?” Paper presented at the annual conference of the American Association for Public Opinion Research, Miami Beach, May.
Harrell, L., Rosen, R., Gomes, A., Chute, J., and Yu, H. (2006), “Web Versus Email Data Collection: Experience in the Current Employment Statistics Program.” Proceedings of the Joint Statistical Meetings of the American Statistical Association, Seattle, August. Alexandria, VA: ASA, pp. 3104–3108 [CD].
Harris, D. R. (2002), “In the Eye of the Beholder: Observed Race and Observer Characteristics.” Ann Arbor: University of Michigan, Population Studies Center Research Report 02-522.
Hartley, J., Davies, L., and Burnhill, P. (1977), “Alternatives in the Typographic Design of Questionnaires.” Journal of Occupational Psychology, 50: 299–304.
Hayes, M. H., and Paterson, D. G. (1921), “Experimental Development of the Graphic Rating Method.” Psychological Bulletin, 18: 98–99.
Heerwegh, D. (2003), “Explaining Response Latencies and Changing Answers Using Client-Side Paradata from a Web Survey.” Social Science Computer Review, 21 (3): 360–373.
Heerwegh, D. (2005a), Web Surveys: Explaining and Reducing Unit Nonresponse, Item Nonresponse and Partial Nonresponse. Unpublished Ph.D. thesis. Leuven, Belgium: Katholieke Universiteit Leuven, Faculteit Sociale Wetenschappen.
Heerwegh, D. (2005b), “Effects of Personal Salutations in E-Mail Invitations to Participate in a Web Survey.” Public Opinion Quarterly, 69 (1): 588–598.
Heerwegh, D., and Loosveldt, G. (2002a), “Web Surveys: The Effect of Controlling Survey Access Using PIN Numbers.” Social Science Computer Review, 20 (1): 10–21.
Heerwegh, D., and Loosveldt, G. (2002b), “An Evaluation of the Effect of Response Formats on Data Quality in Web Surveys.” Social Science Computer Review, 20 (4): 471–484.
Heerwegh, D., and Loosveldt, G. (2003), “An Evaluation of the Semiautomatic Login Procedure to Control Web Survey Access.” Social Science Computer Review, 21 (2): 223–234.
Heerwegh, D., and Loosveldt, G. (2006), “An Experimental Study on the Effects of Personalization, Survey Length Statements, Progress Indicators, and Survey Sponsor Logos in Web Surveys.” Journal of Official Statistics, 22 (2): 191–210.
Heerwegh, D., Vanhove, T., Matthijs, K., and Loosveldt, G. (2005), “The Effect of Personalization on Response Rates and Data Quality in Web Surveys.” International Journal of Social Research Methodology, 8 (2): 85–99.
Hembroff, L. A., Rusz, D., Rafferty, A., McGee, H., and Ehrlich, N. (2005), “The Cost-Effectiveness of Alternative Advance Mailings in a Telephone Survey.” Public Opinion Quarterly, 69 (2): 232–245.
Hemsing, W., and Hellwig, J. O. (2006), “The Impact of Visualization of Question Types and Screen Pages on the Answering Behavior in Online Surveys.” Paper presented at the General Online Research Conference (GOR '06), Bielefeld, Germany, March.
Hennessy, D. G. (2002), “An Evaluation of Methods for Testing Internet Survey Questions.” Unpublished Honors Thesis. Wollongong, Australia: University of Wollongong, Department of Psychology.
Hill, D. H. (1994), “The Relative Empirical Validity of Dependent and Independent Data Collection in a Panel Survey.” Journal of Official Statistics, 10: 359–380.
Hogg, A., and Masztal, J. J. (2002), “Drop-Down Boxes, Radio Buttons or Fill-in-the-Blank?” CASRO Journal 2002, pp. 53–55.
Hogg, A., and Miller, J. (2003), “Watch out for Dropouts: Study Shows Impact of Online Survey Length on Research Findings.” Quirk's Marketing Research Review, July/August, article 1137. www.quirks.com.
Hoogendoorn, A. (2001), “Some Techniques for Internet Interviewing.” Proceedings of the Seventh International Blaise Users Conference, Washington D.C., September.
Hoogendoorn, A. (2004), “A Questionnaire Design for Dependent Interviewing that Addresses the Problem of Cognitive Satisficing.” Journal of Official Statistics, 20 (2): 219–232.
Hoogendoorn, A., and Sikkel, D. (2002), “Feedback in Web Surveys.” Paper presented at the International Conference on Improving Surveys, Copenhagen, August.
Horn, R. E. (1998), Visual Language: Global Communication for the 21st Century. Bainbridge Island, WA: MacroVU.
Horrigan, J. B. (2006), Home Broadband Adoption 2006. Washington, D.C.: Pew Internet and American Life Project. http://www.pewinternet.org.
Horrigan, J. B., and Smith, A. (2007), Home Broadband Adoption 2007. Washington, D.C.: Pew, press release June 3, 2007. http://www.pewinternet.org.
Horton, S. (2006), Access by Design: A Guide to Universal Usability for Web Designers. Berkeley, CA: New Riders.
Horton, W. (1991), “Overcoming Chromophobia: A Guide to the Confident and Appropriate Use of Color.” IEEE Transactions on Professional Communication, 34 (3): 160–171.
House, C. C. (1985), “Questionnaire Design with Computer Assisted Telephone Interviewing.” Journal of Official Statistics, 1 (2): 209–219.
House, C. C., and Nicholls, W. L., II (1988), “Questionnaire Design for CATI: Design Objectives and Methods.” In Groves, R. M., Biemer, P. P., Lyberg, L. E., Massey, J. T., Nicholls, W. L., and Waksberg, J. (eds.), Telephone Survey Methodology. New York: Wiley, pp. 421–436.
Howlett, V. (1996), Visual Interface Design for Windows. New York: Wiley.
Iglesias, C. P., Birks, Y. F., and Torgerson, D. J. (2001), “Improving the Measurement of Quality of Life in Older People: The York SF-12.” Quarterly Journal of Medicine, 94: 695–698.
Inside Research (2007), “U.S. Online MR Continues Strong.” January. www.MarketResearch.com.
Itten, J. (2003), The Elements of Color. New York: Wiley.
Jäckle, A. (2004), “Does Dependent Interviewing Really Increase Efficiency and Reduce Respondent Burden?” Colchester: University of Essex (Working Papers of the Institute for Social and Economic Research, paper 2005–11).
Jansen, B. J., Spink, A., and Saracevic, T. (2000), “Real Life, Real Users, and Real Needs: A Study and Analysis of User Queries on the Web.” Information Processing and Management, 36: 207–227.
Jeavons, A. (1998), “Ethology and the Web: Observing Respondent Behaviour in Web Surveys.” Proceedings of the Worldwide Internet Conference, London, February. ESOMAR.
Jenkins, C. R., and Dillman, D. A. (1997), “Towards a Theory of Self-Administered Questionnaire Design.” In Lyberg, L., Biemer, P., Collins, M., Leeuw, E., Dippo, C., Schwarz, N., and Trewin, D. (eds.), Survey Measurement and Process Quality. New York: Wiley, pp. 165–196.
Johnson, R. M. (1987), “Adaptive Conjoint Analysis.” Sawtooth Software Conference on Perceptual Mapping, Conjoint Analysis, and Computer Interviewing. Ketchum, ID: Sawtooth Software, pp. 253–265. http://www.sawtoothsoftware.com.
Johnson, R. M. (2000), “Understanding HB: An Intuitive Approach.” Proceedings of the Sawtooth Software Conference. Ketchum, ID: Sawtooth Software, pp. 195–205. http://www.sawtoothsoftware.com.
Joinson, A. N. (2005), “Audience Power, Personalized Salutation and Responses to Web Surveys.” Paper presented at the ESF Workshop on Internet Survey Methodology, Dubrovnik, Croatia, September.
Joinson, A. N., and Reips, U.-D. (2007), “Personalized Salutation, Power of Sender and Response Rates to Web-Based Surveys.” Computers in Human Behavior, 23 (3): 1372–1383.
Joinson, A. N., Woodley, A., and Reips, U.-D. (2007), “Personalization, Authentication and Self-Disclosure in Self-Administered Internet Surveys.” Computers in Human Behavior, 23: 275–285.
Juran, J. M. (1979), “Basic Concepts.” In Juran, J. M., Gryna, F. M., and Bingham, R. S. (eds.), Quality Control Handbook (3rd ed.). New York: McGraw-Hill, pp. 1–24.
Kaczmirek, L., Neubarth, W., Bosnjak, M., and Bandilla, W. (2004), “Progress Indicators in Filter Based Surveys: Computing Methods and their Impact on Drop Out.” Paper presented at the RC33 International Conference on Social Science Methodology, Amsterdam, August.
Kaczmirek, L., Neubarth, W., Bosnjak, M., and Bandilla, W. (2005), “Progress Indicators in Filter Based Surveys: Individual and Dynamic Calculation Methods.” Paper presented at the General Online Research Conference (GOR05), Zurich, March.
Kaczmirek, L., and Thiele, O. (2006), “Flash, JavaScript, or PHP? Comparing the Availability of Technical Equipment among University Applicants.” Paper presented at the 8th International General Online Research Conference (GOR06), Bielefeld, Germany, March.
Kalbach, J. (2001), “The Myth of 800x600.” Dr. Dobb's Journal, March 16, 2001. http://www.ddj.com/documents/s=2684/nam1012432092/index.html.
Kane, E. W., and Macauley, L. J. (1993), “Interviewer Gender and Gender Attitudes.” Public Opinion Quarterly, 57 (1): 1–28.
Kaplowitz, M. D., Hadlock, T. D., and Levine, R. (2004), “A Comparison of Web and Mail Survey Response Rates.” Public Opinion Quarterly, 68 (1): 94–101.
Karlgren, J., and Franzén, K. (1997), “Verbosity and Interface Design in Information Retrieval.” SICS Technical Report T2000:04. Swedish Institute for Computer Science, Stockholm. http://www.sics.se/~jussi/Artiklar/2000_TR_irinterface/irinterface.html.
Kent, R., and Brandal, H. (2003), “Improving Email Response in a Permission Marketing Context.” International Journal of Market Research, 45 (1): 489–506.
Kenyon, K., Couper, M. P., and Tourangeau, R. (2001), “Picture This! An Analysis of Visual Effects in Web Surveys.” Paper presented at the annual conference of the American Association for Public Opinion Research, Montreal, Canada, May.
Kerwin, J., Levin, K., Shipp, S., Wang, A., and Campbell, S. (2006), “A Comparison of Strategies for Reducing Item Nonresponse in Web Surveys.” Paper presented at the Joint Statistical Meetings of the American Statistical Association, Seattle, August.
Kiesler, S., and Sproull, L. S. (1986), “Response Effects in the Electronic Survey.” Public Opinion Quarterly, 50: 402–413.
Kiousis, S. (2002), “Interactivity: A Concept Explication.” New Media and Society, 4 (3): 355–383.
Kjellström, O., and Bälter, O. (2003), “Design of Follow-up Questions in Web Surveys.” Paper presented at the 25th International Conference on Information Technology Interfaces, Cavtat, Croatia, June.
Koffka, K. (1935), Principles of Gestalt Psychology. New York: Harcourt, Brace, and World.
Kostelnick, C., and Roberts, D. D. (1998), Designing Visual Language. Boston: Allyn and Bacon.
Kraut, R. M., Olson, J., Banaji, M., Bruckman, A., Cohen, J., and Couper, M. P. (2004), “Psychological Research Online: Report of Board of Scientific Affairs' Advisory Group on the Conduct of Research on the Internet.” American Psychologist, 59 (2): 106–117.
Krosnick, J. A. (1991), “Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys.” Applied Cognitive Psychology, 5: 213–236.
Krosnick, J. A., and Alwin, D. F. (1987), “An Evaluation of a Cognitive Theory of Response-Order Effects in Survey Measurement.” Public Opinion Quarterly, 51 (2): 201–219.
Krosnick, J. A., and Fabrigar, L. R. (1997), “Designing Rating Scales for Effective Measurement in Surveys.” In Lyberg, L., Biemer, P., Collins, M., Leeuw, E., Dippo, C., Schwarz, N., and Trewin, D. (eds.), Survey Measurement and Process Quality. New York: Wiley, pp. 141–164.
Krysan, M., and Couper, M. P. (2003), “Race in the Live and Virtual Interview: Racial Deference, Social Desirability, and Activation Effects in Attitude Surveys.” Social Psychology Quarterly, 66 (4): 364–383.
Krysan, M., and Couper, M. P. (2005), “Race-of-Interviewer Effects: What Happens on the Web?” International Journal of Internet Science, 1 (1): 5–16.
Květon, P., Jelínek, M., Vobořil, and Klimusová, H. (2007), “Computer-Based Tests: The Impact of Test Design and Problem of Equivalency.” Computers in Human Behavior, 23 (1): 32–51.
Kypri, K., and Gallagher, S. J. (2003), “Incentives to Increase Participation in an Internet Survey of Alcohol Use: A Controlled Experiment.” Alcohol & Alcoholism, 38 (5): 437–441.
Kypri, K., Gallagher, S. J., and Cashell-Smith, M. L. (2004), “An Internet-Based Survey Method for College Student Drinking Research.” Drug and Alcohol Dependence, 76 (1): 45–53.
Lenert, L. A., Sturley, A., and Watson, M. E. (2002), “iMPACT3: Internet-Based Development and Administration of Utility Elicitation Protocols.” Medical Decision Making, November–December: 464–474.
Lie, H. W., and Bos, B. (2005), Cascading Style Sheets: Designing for the Web (3rd ed.). Upper Saddle River, NJ: Addison-Wesley.
Link, M. W., and Mokdad, A. (2005), “Advance Letters as a Means of Improving Respondent Cooperation in Random Digit Dial Studies: A Multistate Experiment.” Public Opinion Quarterly, 69 (4): 572–587.
Little, R. J. A., and Rubin, D. B. (2002), Statistical Analysis with Missing Data (2nd ed.). New York: Wiley.
Lowney, G. (1998), “But Can They Read It?” MSDN News, July/August. http://www.microsoft.com/mdsn/news/julaug98/access7.htm.
Lozar Manfreda, K., and Vehovar, V. (2002), “Design of Web Survey Questionnaires: Three Basic Experiments.” Journal of Computer Mediated Communication, 7 (3). http://jcmc.indiana.edu/vol7/issue3/vehovar.html.
Lütters, H., Westphal, D., and Heublein, F. (2007), “SniperScale: Graphical Scaling in Data Collection and its Effect on the Response Behaviour of Participants in Online Studies.” Paper presented at the General Online Research Conference (GOR '07), Leipzig, March.
Lynch, P. J., and Horton, S. (1997), Yale Center for Advanced Media WWW Style Manual, 1st ed. http://info.med.yale.edu/caim/manual/.
Lynch, P. J., and Horton, S. (2001), Web Style Guide: Basic Design Principles for Creating Web Sites (2nd ed.). New Haven, CT: Yale University Press.
Lynn, P., Jäckle, A., Jenkins, S. P., and Sala, E. (2006), “The Effects of Dependent Interviewing on Responses to Questions on Income Sources.” Journal of Official Statistics, 22 (3): 357–384.
MacElroy, B. (2000), “Variables Influencing Dropout Rates in Web-Based Surveys.” Quirk's Marketing Research Review, July/August. www.modalis.com.
MacElroy, B., Mikucki, J., and McDowell, P. (2002), “A Comparison of Quality in Open-Ended Responses and Response Rates Between Web-Based and Paper and Pencil Survey Modes.” Journal of Online Research, 1 (1). http://www.ijor.org/.
Magee, C. G., Straight, R. L., and Schwartz, L. (2001), “Conducting Web-Based Surveys: Keys to Success.” The Public Manager, Summer: 47–50.
Marsh, E. E., and White, M. D. (2003), “A Taxonomy of Relationships between Images and Text.” Journal of Documentation, 59 (6): 647–672.
Marshall, P., and Bradlow, E. T. (2002), “A Unified Approach to Conjoint Analysis Methods.” Journal of the American Statistical Association, 97 (459): 674–682.
Martin, E. A., Childs, J. H., DeMaio, T., Hill, J., Reiser, C., Gerber, E., Styles, K., and Dillman, D. A. (2007), Guidelines for Designing Questionnaires for Administration in Different Modes. Washington, D.C.: U.S. Census Bureau.
May, V. A. (1999), “Survey 2000: Charting Communities and Change.” National Geographic, 196 (6): 130–133.
Maynard, D. W., Houtkoop-Steenstra, H., Schaeffer, N. C., and Zouwen, J. (eds.) (2002), Standardization and Tacit Knowledge: Interaction and Practice in the Survey Interview. New York: Wiley.
McCabe, S., Boyd, C., Couper, M. P., Crawford, S. D., and d'Arcy, H. (2002), “Mode Effects for Collecting Alcohol and Other Drug Use Data: Web and US Mail.” Journal of Studies on Alcohol, 63 (6): 755–761.
McCabe, S. E., Diez, A., Boyd, C. J., Nelson, T. F., and Weitzman, E. R. (2006), “Comparing Web and Mail Responses in a Mixed Mode Survey in College Alcohol Use Research.” Addictive Behaviors, 31: 1619–1627.
McCarthy, M. S., and Mothersbaugh, D. L. (2002), “Effects of Typographic Factors in Advertising-Based Persuasion: A General Model and Initial Empirical Tests.” Psychology & Marketing, 19 (7–8): 663–691.
McCloud, S. (1993), Understanding Comics: The Invisible Art. New York: HarperCollins.
McMillan, S. J., and Hwang, J.-S. (2002), “Measures of Perceived Interactivity: An Exploration of the Role of Direction of Communication, User Control, and Time in Shaping Perceptions of Interactivity.” Journal of Advertising, 31 (3): 29–42.
Meadows, K. A., Greene, T., Foster, L., and Beer, S. (2000), “The Impact of Different Response Alternatives on Responders' Reporting of Health-Related Behaviour in a Postal Survey.” Quality of Life Research, 9: 385–391.
Mehta, R., and Sivadas, E. (1995), “Comparing Response Rates and Response Content in Mail Versus Electronic Mail Surveys.” Journal of the Market Research Society, 37 (4): 429–439.
Miller, J. M., and Krosnick, J. A. (1998), “The Impact of Candidate Name Order on Election Outcomes.” Public Opinion Quarterly, 62 (3): 291–330.
Miller, S., and Jarrett, C. (2001), “Should I Use a Drop-down? Four Steps for Choosing Form Elements on the Web.” Unpublished Paper. Leighton Buzzard, England: Effortmark.
Mockovak, W. (2005), “An Evaluation of Different Design Options for Presenting Edit Messages in Web Forms.” Paper presented at the FedCASIC Workshop, Washington, D.C., March.
Mooney, G. M., Rogers, B., and Trunzo, D. (2003), “Examining the Effect of Error Prompting on Item Nonresponse and Survey Nonresponse in Web Surveys.” Paper presented at the annual conference of the American Association for Public Opinion Research, Nashville, TN, May.
Moore, P., and Fitz, C. (1993a), “Gestalt Theory and Instructional Design.” Journal of Technical Writing and Communication, 23 (2): 137–157.
Moore, P., and Fitz, C. (1993b), “Using Gestalt Theory to Teach Document Design and Graphics.” Technical Communication Quarterly, 2 (4): 389–410.
Moreno, R., and Mayer, R. E. (1999), “Cognitive Principles of Multimedia Learning: The Role of Modality and Contiguity.” Journal of Educational Psychology, 91 (2): 358–368.
Murphy, E., and Ciochetto, S. (2006), “Usability Testing of Alternative Design Features for Web-Based Data Collection: Selected Issues, Results, and Recommendations.” Paper presented at FedCASIC, Washington, D.C., March.
Nass, C., Isbister, K., and Lee, E.-J. (2001), “Truth Is Beauty: Researching Embodied Conversational Agents.” In Cassell, J., Sullivan, J., Prevost, S., and Churchill, E. (eds.), Embodied Conversational Agents. Cambridge, MA: MIT Press, pp. 374–402.
Nass, C., Moon, Y., and Carney, P. (1999), “Are People Polite to Computers? Responses to Computer-Based Interviewing Systems.” Journal of Applied Social Psychology, 29 (5): 1093–1110.
Nass, C., Moon, Y., and Green, N. (1997), “Are Machines Gender Neutral? Gender-Stereotypic Responses to Computers with Voices.” Journal of Applied Social Psychology, 27 (10): 864–876.
Nass, C., Robles, E., Bienenstock, H., Treinen, M., and Heenan, C. (2003), “Speech-Based Disclosure Systems: Effects of Modality, Gender of Prompt, and Gender of User.” International Journal of Speech Technology, 6: 113–121.
Neubarth, W. (2006), “Ranking vs. Rating in an Online Environment.” Paper presented at the General Online Research Conference (GOR '06), Bielefeld, Germany, March.
Nielsen, J. (2000), Designing Web Usability. Berkeley, CA: New Riders.
Nielsen, J., and Loranger, H. (2005), Fundamental Guidelines for Web Usability. London: Nielsen Norman Group (Usability Week Tutorial Materials).
Norman, D. A. (1988), The Design of Everyday Things. New York: Doubleday.
Norman, D. A. (2004), Emotional Design: Why We Love (or Hate) Everyday Things. New York: Basic Books.
Norman, K. L. (2001), “Implementation of Conditional Branching in Computerized Self-Administered Questionnaires.” Unpublished Report. College Park: University of Maryland, Institute for Advanced Computer Studies.
Norman, K. L., Friedman, Z., Norman, K., and Stevenson, R. (2001), “Navigational Issues in the Design of Online Self-Administered Questionnaires.” Behaviour and Information Technology, 20 (1): 37–45.
Norman, K. L., and Pleskac, T. (2002), “Conditional Branching in Computerized Self-Administered Questionnaires: An Empirical Study.” Unpublished Paper. College Park: University of Maryland, Institute for Advanced Computer Studies.
Novemsky, N., Dhar, R., Schwarz, N., and Simonson, I. (2007), “Preference Fluency in Choice.” Journal of Marketing Research, 44 (3): 347–356.
Nugent, G. C. (1992), “Pictures, Audio, and Print: Symbolic Representation and Effect on Learning.” Educational Communication and Technology Journal, 30: 163–174.
Nyiri, Z., and Clark, R. L. (2003), “Web Survey Design: Comparing Static and Dynamic Survey Instruments.” Paper presented at the annual conference of the American Association for Public Opinion Research, Nashville, TN, May.
Olsen, R. J. (1992), “The Effects of Computer-Assisted Interviewing on Data Quality.” Working Papers of the European Scientific Network on Household Panel Studies, Paper 36. Colchester, England: University of Essex.
O'Muircheartaigh, C. A. (1997), “Measurement Error in Surveys: A Historical Perspective.” In Lyberg, L., Biemer, P., Collins, M., Leeuw, E., Dippo, C., Schwarz, N., and Trewin, D. (eds.), Survey Measurement and Process Quality. New York: Wiley, pp. 1–25.
Pagendarm, M., and Schaumburg, H. (2001), “Why Are Users Banner-Blind? The Impact of Navigation Style on the Perception of Web Banners.” Journal of Digital Information, 2 (1). http://journals.tdl.org/jodi/article/view/jodi-37/38.
Paivio, A. (1979), Imagery and Verbal Processes. Hillsdale, NJ: Lawrence Erlbaum.
Pearson, J., and Levine, R. A. (2003), “Salutations and Response Rates to Online Surveys.” Paper presented at the Association for Survey Computing Fourth International Conference on the Impact of Technology on the Survey Process, Warwick, England, September.
Peytchev, A. (2005), “How Questionnaire Layout Induces Measurement Error.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Miami Beach, FL, May.
Peytchev, A. (2006), “Participation Decisions and Measurement Error in Web Surveys.” Unpublished PhD Dissertation. Ann Arbor: University of Michigan.
Peytchev, A., Couper, M. P., McCabe, S. E., and Crawford, S. (2006), “Web Survey Design: Paging Versus Scrolling.” Public Opinion Quarterly, 70 (4): 596–607.
Peytchev, A., and Crawford, S. (2005), “A Typology of Real-Time Validations in Web-Based Surveys.” Social Science Computer Review, 23 (2): 235–249.
Piazza, T., and Sniderman, P. M. (1998), “Incorporating Experiments into Computer Assisted Surveys.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 167–184.
Pope, D., and Baker, R. P. (2005), “Experiments in Color for Web-Based Surveys.” Paper presented at the FedCASIC Workshops, Washington, D.C., March.
Porter, S. R., and Whitcomb, M. E. (2003), “The Impact of Lottery Incentives on Student Survey Response Rates.” Research in Higher Education, 44 (4): 389–407.
Porter, S. R., and Whitcomb, M. E. (2005), “E-Mail Subject Lines and Their Effect on Web Survey Viewing and Response.” Social Science Computer Review, 23 (3): 380–387.
Poynter, R. (2001), “A Guide to Best Practice in Online Quantitative Research.” In Westlake, A., Sykes, W., Manners, T., and Rigg, M. (eds.), The Challenge of the Internet: Proceedings of the ASC International Conference on Survey Research Methods. London: Association for Survey Computing, pp. 3–19.
Presser, S., Rothgeb, J., Couper, M. P., Lessler, J., Martin, E. A., Martin, J., and Singer, E. (eds.) (2004), Methods for Testing and Evaluating Survey Questionnaires. New York: Wiley.
Prior, M. (2002), “More Than a Thousand Words? Visual Cues and Visual Knowledge.” Paper presented at the annual meeting of the American Political Science Association, Boston, August.
Raghunathan, T. E., and Grizzle, J. E. (1995), “A Split Questionnaire Survey Design.” Journal of the American Statistical Association, 90 (429): 54–63.
Ramirez, C., Sharp, K., and Foster, L. (2000), “Mode Effects in an Internet/Paper Survey of Employees.” Paper presented at the annual conference of the American Association for Public Opinion Research, Portland, OR, May.
Rasinski, K. A., Mingay, D., and Bradburn, N. M. (1994), “Do Respondents Really ‘Mark All That Apply’ on Self-Administered Questions?” Public Opinion Quarterly, 58: 400–408.
Reber, R., and Schwarz, N. (1999), “Effects of Perceptual Fluency on Judgments of Truth.” Consciousness and Cognition, 8: 338–342.
Redline, C. D., and Dillman, D. A. (2002), “The Influence of Alternative Visual Designs on Respondents' Performance with Branching Instructions in Self-Administered Questionnaires.” In Groves, R. M., Dillman, D. A., Eltinge, J. A., and Little, R. J. A. (eds.), Survey Nonresponse. New York: Wiley, pp. 179–193.
Redline, C., Dillman, D. A., Dajani, A., and Scaggs, M. A. (2003), “Improving Navigational Performance in U.S. Census 2000 by Altering the Visually Administered Languages of Branching Instructions.” Journal of Official Statistics, 19 (4): 403–419.
Redline, C. D., Dillman, D. A., Carley-Baxter, L., and Creecy, R. (2005), “Factors that Influence Reading and Comprehension of Branching Instructions in Self-Administered Questionnaires.” Allgemeines Statistisches Archiv, 89 (1): 29–38.
Reed, B. D., Crawford, S., Couper, M. P., Cave, C., and Haefner, H. K. (2004), “Pain at the Vulvar Vestibule – A Web-Survey.” Journal of Lower Genital Tract Disease, 8 (1): 48–57.
Reips, U.-D. (2001), “The Web Experimental Psychology Lab: Five Years of Data Collection on the Internet.” Behavior Research Methods, Instruments, and Computers, 33 (2): 201–211.
Reips, U.-D. (2002), “Internet-Based Psychological Experimenting: Five Dos and Five Don'ts.” Social Science Computer Review, 20 (3): 241–249.
Reips, U.-D., and Funke, F. (in press), “Interval Level Measurement with Visual Analogue Scales in Internet Based Research: VAS Generator.” Behavior Research Methods.
Reja, U., Lozar Manfreda, K., Hlebec, V., and Vehovar, V. (2002), “Open vs. Closed Questions in Web Surveys.” Paper presented at the International Conference on Methodology and Statistics, Ljubljana, Slovenia, September.
Richman, W. L., Kiesler, S., Weisband, S., and Drasgow, F. (1999), “A Meta-Analytic Study of Social Desirability Distortion in Computer-Administered Questionnaires, Traditional Questionnaires, and Interviews.” Journal of Applied Psychology, 84 (5): 754–775.
Rigden, C. (1999), “‘The Eye of the Beholder’ – Designing for Colour-Blind Users.” British Telecommunications Engineering, 17: 2–6.
Rivers, D. (2000), “Fulfilling the Promise of the Web.” Quirk's Marketing Research Review, February 2000. Online: article 0562. www.quirks.com.
Rivers, D. (2006), “Web Surveys for Health Measurement.” Paper presented at Building Tomorrow's Patient-Reported Outcome Measures: The Inaugural PROMIS Conference, Gaithersburg, MD, September 11–13, 2006.
Robinson, J. P., Neustadtl, A., and Kestnbaum, M. (2002), “Why Public Opinion Polls are Inherently Biased: Public Opinion Differences among Internet Users and Non-Users.” Paper presented at the annual meeting of the American Association for Public Opinion Research, St. Petersburg, FL, May.
Rossett, B. (2006), “The Basics of Section 508.” Paper presented at the Federal Workshop on Computer Assisted Survey Information Collection (FedCASIC), Washington, DC, March.
Rowe, C. L. (1982), “The Connotative Dimensions of Selected Display Typefaces.” Information Design Journal, 1: 30–37.
Schaefer, D. R., and Dillman, D. A. (1998), “Development of a Standard E-Mail Methodology: Results of an Experiment.” Public Opinion Quarterly, 62 (3): 378–397.
Schafer, J. L., and Graham, J. W. (2002), “Missing Data: Our View of the State of the Art.” Psychological Methods, 7 (2): 147–177.
Shneiderman, B. (1992), Designing the User Interface: Strategies for Effective Human-Computer Interaction (2nd ed.). Reading, MA: Addison-Wesley.
Schober, M. F., and Conrad, F. G. (eds.) (2007), Envisioning Survey Interviews of the Future. New York: Wiley.
Schober, M. F., Conrad, F. G., Ehlen, P., and Fricker, S. S. (2003), “How Web Surveys Differ from Other Kinds of User Interfaces.” Proceedings of the Joint Statistical Meetings of the American Statistical Association. Alexandria: ASA, pp. 190–195 [CD].
Schonlau, M., Asch, B. J., and Du, C. (2003), “Web Surveys as Part of a Mixed-Mode Strategy for Populations That Cannot Be Contacted by E-Mail.” Social Science Computer Review, 21 (2): 218–222.
Schonlau, M., Fricker, R. D., and Elliott, M. N. (2002), Conducting Research Surveys via E-Mail and the Web. Santa Monica, CA: RAND.
Schonlau, M., Zapert, K., Simon, L. P., Sanstad, K. H., Marcus, S. M., Adams, J., Spranca, M., Kan, H.-J., Turner, R., and Berry, S. H. (2004), “A Comparison between Responses from a Propensity-Weighted Web Survey and an Identical RDD Survey.” Social Science Computer Review, 22 (1): 128–138.
Schriver, K. A. (1997), Dynamics of Document Design. New York: Wiley.
Schuman, H., and Presser, S. (1979), “The Open and Closed Question.” American Sociological Review, 44: 692–712.
Schuman, H., and Presser, S. (1981), Questions and Answers in Attitude Surveys. New York: Academic Press.
Schwarz, N. (1996), Cognition and Communication: Judgmental Biases, Research Methods and the Logic of Conversation. Hillsdale, NJ: Lawrence Erlbaum.
Schwarz, N., Grayson, C. E., and Knäuper, B. (1998), “Formal Features of Rating Scales and the Interpretation of Question Meaning.” International Journal of Public Opinion Research, 10 (2): 177–183.
Schwarz, N., Knäuper, B., Hippler, H.-J., Noelle-Neumann, E., and Clark, F. (1991), “Rating Scales: Numeric Values May Change the Meaning of Scale Labels.” Public Opinion Quarterly, 55: 618–630.
Schwarz, N., and Sudman, S. (eds.) (1992), Context Effects in Social and Psychological Research. New York: Springer-Verlag.
Schwarz, S., and Reips, U.-D. (2001), “CGI Versus JavaScript: A Web Experiment on the Reversed Hindsight Bias.” In Reips, U.-D. and Bosnjak, M. (eds.), Dimensions of Internet Science. Lengerich, Germany: Pabst Science Publishers, pp. 75–90.
Sethuraman, R., Kerin, R. A., and Cron, W. L. (2005), “A Field Study Comparing Online and Offline Data Collection Methods for Identifying Product Attribute Preferences Using Conjoint Analysis.” Journal of Business Research, 58: 602–610.
Sikkel, D. (1998), “The Individualized Interview.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 147–165.
Silverstein, C., Marais, H., Henzinger, M., and Moricz, M. (1999), “Analysis of a Very Large Web Search Engine Query Log.” SIGIR Forum, 33 (1): 6–12.
Singer, E. (2002), “The Use of Incentives to Reduce Nonresponse in Household Surveys.” In Groves, R. M., Dillman, D. A., Eltinge, J. L., and Little, R. J. A. (eds.), Survey Nonresponse. New York: Wiley, pp. 163–177.
Singer, E., Couper, M. P., Conrad, F. G., and Groves, R. M. (in press), “Risk of Disclosure, Perceptions of Risk, and Concerns about Privacy and Confidentiality as Factors in Survey Participation.” Journal of Official Statistics.
Sinibaldi, J., Crawford, S. D., Saltz, R., and Showen, S. (2006), “Using Interactive Web-Based Maps to Collect Geographical Data in a Student Survey.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Montreal, Canada, May.
Smith, R. M., and Kiniorski, K. (2003), “Participation in Online Surveys: Results from a Series of Experiments.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Nashville, TN, May.
Smith, S. M., Smith, J., and Allred, C. R. (2006), “Advanced Techniques and Technologies in Online Research.” In Grover, R. and Vriens, M. (eds.), The Handbook of Marketing Research. Thousand Oaks, CA: Sage, pp. 132–158.
Smyth, J. D., Dillman, D. A., Christian, L. M., and Stern, M. J. (2004), “How Visual Grouping Influences Answers to Internet Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Nashville, May.
Smyth, J. D., Dillman, D. A., Christian, L. M., and Stern, M. J. (2005), “Comparing Check-All and Forced-Choice Question Formats in Web Surveys: The Role of Satisficing, Depth of Processing, and Acquiescence in Explaining Differences.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Miami Beach, FL, May.
Sperry, S., Edwards, B., Dulaney, R., and Potter, D. E. B. (1998), “Evaluating Interviewer Use of CAPI Navigation Features.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 351–365.
Stanley, N., and Jenkins, S. (2007), “Watch What I Do! Using Graphic Input Controls in Web Surveys.” In Trotman, M. et al. (eds.), The Challenges of a Changing World: Proceedings of the Fifth International Conference of the Association for Survey Computing. Berkeley, England: ASC, pp. 81–92.
Staples, L. (2000), “Typography and the Screen: A Technical Chronology of Digital Typography, 1984–1997.” Design Issues, 16 (3): 19–34.
Statistics Netherlands (1996), Blaise Developer's Guide (Blaise III). Heerlen, The Netherlands: Statistics Netherlands, Department of Statistical Informatics.
Statistics Netherlands (2002), Blaise 4.5 Developer's Guide. Heerlen, The Netherlands: Statistics Netherlands, Methods and Informatics Department.
Stenbjerre, M., and Laugesen, J. N. (2005), “Conducting Representative Online Research.” Proceedings of the ESOMAR Conference on Worldwide Panel Research: Developments and Progress, Budapest, Hungary. Amsterdam: ESOMAR, pp. 369–391 [CD].
Sudman, S., and Bradburn, N. M. (1982), Asking Questions: A Practical Guide to Questionnaire Design. San Francisco: Jossey-Bass.
Szabó, Z. G. (2006), “The Distinction between Semantics and Pragmatics.” In Lepore, E. and Smith, B. (eds.), The Oxford Handbook of Philosophy and Language. Oxford: Oxford University Press.
Tarnai, J., and Allen, T. (2002), “Characteristics of Respondents to a Web Survey of the General Public.” Paper presented at the annual meeting of the American Association for Public Opinion Research, St. Petersburg Beach, FL, May.
Terhanian, G. (2000), “How To Produce Credible, Trustworthy Information through Internet-Based Survey Research.” Paper presented at the annual conference of the American Association for Public Opinion Research, Portland, OR, May.
Thomas, R. K., Bayer, L. R., Johnson, A., and Behnke, S. (2005), “A Comparison of an Online Card Sorting Task to a Rating Task.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Miami Beach, FL, May.
Thomas, R. K., and Couper, M. P. (2007), “A Comparison of Visual Analog and Graphic Rating Scales.” Paper presented at the General Online Research Conference (GOR'07), Leipzig, March.
Thomas, R. K., Lafond, R. C., and Behnke, S. (2003), “Can What We Don't Know (About ‘Don't Know’) Hurt Us? Effects of Item Non-Response.” Paper presented at the annual conference of the American Association for Public Opinion Research, Nashville, May.
Toepoel, V., Das, M., and van Soest, A. (2005), “Design of Web Questionnaires: A Test for Number of Items per Screen.” Tilburg University: CentERdata Discussion Paper No. 2005-114.
Toepoel, V., Das, M., and van Soest, A. (2006), “Design of Web Questionnaires: The Effect of Layout in Rating Scales.” Tilburg University: CentERdata Discussion Paper No. 2006-30.
Tourangeau, R., Couper, M. P., and Conrad, F. G. (2004), “Spacing, Position, and Order: Interpretive Heuristics for Visual Features of Survey Questions.” Public Opinion Quarterly, 68 (3): 368–393.
Tourangeau, R., Couper, M. P., and Conrad, F. G. (2007), “Color, Labels, and Interpretive Heuristics for Response Scales.” Public Opinion Quarterly, 71 (1): 91–112.
Tourangeau, R., Couper, M. P., Galesic, M., and Givens, J. (2004), “A Comparison of Two Web-Based Surveys: Static vs Dynamic Versions of the NAMCS Questionnaire.” Paper presented at the RC33 International Conference on Social Science Methodology, Amsterdam, August.
Tourangeau, R., Couper, M. P., and Steiger, D. M. (2003), “Humanizing Self-Administered Surveys: Experiments on Social Presence in Web and IVR Surveys.” Computers in Human Behavior, 19: 1–24.
Tourangeau, R., Rips, L., and Rasinski, K. (2000), The Psychology of Survey Response. Cambridge, England: Cambridge University Press.
Tourangeau, R., and Smith, T. W. (1996), “Asking Sensitive Questions: The Impact of Data Collection Mode, Question Format, and Question Context.” Public Opinion Quarterly, 60 (2): 275–304.
Trollip, S., and Sales, G. (1986), “Readability of Computer-Generated Fill-Justified Text.” Human Factors, 28: 159–164.
Trouteaud, A. R. (2004), “How You Ask Counts: A Test of Internet-Related Components of Response Rates to a Web-Based Survey.” Social Science Computer Review, 22 (3): 385–392.
Tufte, E. R. (1990), Envisioning Information. Cheshire, CT: Graphics Press.
Tufte, E. R. (2001), The Visual Display of Quantitative Information (2nd ed.). Cheshire, CT: Graphics Press.
Tullis, T. S. (1983), “The Formatting of Alphanumeric Displays: A Review and Analysis.” Human Factors, 25 (6): 657–682.
Tullis, T. S. (1988), “A System for Evaluating Screen Formats: Research and Application.” In Hartson, H. R. and Hix, D. (eds.), Advances in Human-Computer Interaction (Vol. 2). Norwood: Ablex, pp. 214–286.
Turner, C. F., Forsyth, B. H., O'Reilly, J. M., Cooley, P. C., Smith, T. K., Rogers, S. M., and Miller, H. G. (1998), “Automated Self-Interviewing and the Survey Measurement of Sensitive Behaviors.” In Couper, M. P., Baker, R. P., Bethlehem, J., Clark, C. Z. F., Martin, J., Nicholls, W. L., and O'Reilly, J. (eds.), Computer Assisted Survey Information Collection. New York: Wiley, pp. 455–473.
Tuten, T. L. (2005), “Do Reminders Encourage Response but Affect Response Behaviors? Reminders in Web-Based Surveys.” Paper presented at the ESF Workshop on Internet Survey Methodology, Dubrovnik, Croatia, September.
Tuten, T. L., Galesic, M., and Bosnjak, M. (2004), “Effects of Immediate Versus Delayed Notification of Prize Draw Results on Response Behavior in Web Surveys.” Social Science Computer Review, 22 (3): 377–384.
U.S. Department of Health and Human Services (HHS) (2006), Research-Based Web Design & Usability Guidelines. Washington, D.C.: Government Printing Office.
Valdez, P., and Mehrabian, A. (1994), “Effect of Color on Emotions.” Journal of Experimental Psychology: General, 123 (4): 394–409.
van der Horst, W., Snijders, C., and Matzat, U. (2006), “The Effect of Progress Indicators in Online Survey Compliance.” Paper presented at the General Online Research Conference (GOR '06), Bielefeld, Germany, March.
van der Linden, W. J., and Glas, C. A. W. (eds.) (2000), Computerized Adaptive Testing: Theory and Practice. Boston: Kluwer Academic.
van der Molen, W. J. H. (2001), “Assessing Text-Picture Correspondence in Television News: The Development of a New Coding Scheme.” Journal of Broadcasting and Electronic Media, 43 (3): 483–498.
van Heesen, B. (2005), “Online Access Panels and Effects on Response Rates with Different Types of Incentives.” Proceedings of the ESOMAR Conference on Worldwide Panel Research: Developments and Progress, Budapest, Hungary. Amsterdam: ESOMAR, pp. 393–408 [CD].
Vartabedian, A. G. (1971), “The Effect of Letter Size, Case and Generation Method on CRT Display Search Time.” Human Factors, 13 (4): 363–368.
Vehovar, V., Lozar Manfreda, K., and Batagelj, Z. (1999), “Design Issues in WWW Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Portland, OR, May.
Vehovar, V., Lozar Manfreda, K., and Batagelj, Z. (2000), “Design Issues in Web Surveys.” Proceedings of the Survey Research Methods Section of the American Statistical Association, pp. 983–988.
Vehovar, V., Batagelj, Z., Lozar Manfreda, K., and Zaletel, M. (2002), “Nonresponse in Web Surveys.” In Groves, R. M., Dillman, D. A., Eltinge, J. L., and Little, R. J. A. (eds.), Survey Nonresponse. New York: Wiley, pp. 229–242.
von Ahn, L., Blum, M., and Langford, J. (2004), “Telling Humans and Computers Apart Automatically.” Communications of the ACM, 47 (2): 57–60.
Vriens, M., Loosschilder, G. H., Rosbergen, E., and Wittink, D. R. (1998), “Verbal versus Realistic Pictorial Representations in Conjoint Analysis with Design Attributes.” Journal of Product Innovation Management, 15 (5): 455–467.
Wainer, H., Dorans, N. J., Flaugher, R., Green, B. F., Mislevy, R. J., Steinberg, L., and Thissen, D. (2000), Computerized Adaptive Testing: A Primer (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.
Ware, C. (2000), Information Visualization: Perception for Design. San Francisco: Morgan Kaufmann.
Weinman, L. (1996), Designing Web Graphics. Berkeley, CA: New Riders.
Weller, L., and Livingston, R. (1988), “Effects of Color of Questionnaire on Emotional Responses.” Journal of General Psychology, 115 (4): 433–440.
Wertheimer, M. (1938a), “Gestalt Theory.” In Ellis, W. D. (ed.), A Source Book of Gestalt Psychology. New York: Humanities Press, pp. 1–11.
Wertheimer, M. (1938b), “Laws of Organization in Perceptual Forms.” In Ellis, W. D. (ed.), A Source Book of Gestalt Psychology. New York: Humanities Press, pp. 71–88.
Wertheimer, M. (1958), Principles of Perceptual Organization. New York: Van Nostrand.
Wherrett, J. R. (1999), “Issues in Using the Internet as a Medium for Landscape Preference Research.” Landscape and Urban Planning, 45: 209–217.
Wherrett, J. R. (2000), “Creating Landscape Preference Models Using Internet Survey Techniques.” Landscape Research, 25 (1): 79–96.
White, J. V. (1990), Color for the Electronic Age. New York: Watson-Guptill Publications.
Williams, J. R. (1988), “The Effects of Case and Spacing on Menu Option Search Time.” Proceedings of the Human Factors Society 32nd Annual Meeting, pp. 341–343.
Witte, J. C., Amoroso, L. M., and Howard, P. E. N. (2000), “Method and Representation in Internet-Based Survey Tools – Mobility, Community, and Cultural Identity in Survey2000.” Social Science Computer Review, 18 (2): 179–195.
Witte, J. C., Pargas, R. P., Mobley, C., and Hawdon, J. (2004), “Instrument Effects of Images in Web Surveys: A Research Note.” Social Science Computer Review, 22 (3): 363–369.
Wojtowicz, T. (2001), “Designing Lengthy Internet Questionnaires: Suggestions and Solutions.” In Westlake, A., Sykes, W., Manners, T., and Rigg, M. (eds.), The Challenge of the Internet: Proceedings of the ASC International Conference on Survey Research Methods. London: Association for Survey Computing, pp. 25–32.
Yan, T. (2005a), “What They See Is Not What We Intend – Gricean Effects in Web Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Miami Beach, FL, May.
Yan, T. (2005b), Gricean Effects in Self-Administered Surveys. Unpublished Doctoral Dissertation. College Park: University of Maryland.
Yan, T., Conrad, F. G., Tourangeau, R., and Couper, M. P. (2007), “Should I Stay or Should I Go? The Effects of Progress Indicators, Promised Duration, and Questionnaire Length on Completing Web Surveys.” Paper presented at the annual meeting of the American Association for Public Opinion Research, Anaheim, CA, May.
Yan, T., and Tourangeau, R. (2008), “Fast Times and Easy Questions: The Effects of Age, Experience and Question Complexity on Web Survey Response Times.” Applied Cognitive Psychology, 22 (1): 51–68.
Yost, P. R., and Homer, L. E. (1998), “Electronic Versus Paper Surveys: Does the Medium Affect the Response?” Paper presented at the annual meeting of the Society for Industrial/Organizational Psychology, Dallas, TX, April.
Zetie, C. (1995), Practical User Interface Design: Making GUIs Work. New York: McGraw-Hill.

  • References
  • Mick P. Couper, University of Michigan, Ann Arbor
  • Book: Designing Effective Web Surveys
  • Online publication: 05 August 2012
  • Chapter DOI: https://doi.org/10.1017/CBO9780511499371.008