
2014

Margaret E. Roberts (University of California, San Diego), Brandon M. Stewart (Harvard University), Dustin Tingley (Harvard University), Christopher Lucas (Harvard University), Jetson Leder-Luis (California Institute of Technology), Shana Gadarian (Syracuse University), Bethany Albertson (University of Texas at Austin), and David Rand (Yale University) 

Citation

Margaret E. Roberts, Brandon M. Stewart, Dustin Tingley, Christopher Lucas, Jetson Leder-Luis, Shana Gadarian, Bethany Albertson, and David Rand's "Topic models for open ended survey responses with applications to experiments" advances political methodology with a novel Bayesian measurement model for text analysis, by enhancing the utility of open-ended survey responses and by connecting text analysis and topic modeling to causal inference for survey experiments. What is the causal effect of a change in survey frame on the topics mentioned in open-ended responses to questions? The model and the model evaluation and checking methods proposed here enable researchers to answer such questions within a fully Bayesian framework. They engage with the problem of naming topics by reporting exemplar documents for given topics and by proposing a measure of semantic interpretability, both of which allow a scholar to double-check intuitions about what a given topic represents. They show that this approach closely approximates human coding and thus opens new opportunities for innovations in research design in political science. Finally, by simultaneously addressing the literatures on text analysis, causal inference, and survey research, this paper promotes communication between a diverse set of methodological communities. This excellent paper had the unanimous support of the selection committee for the Miller Prize, awarded to the best article published in Political Analysis in 2014.

The paper develops a framework for defining and estimating causal effects in conjoint survey experiments. Whereas survey experimental treatments such as vignettes necessarily bundle many components, conjoint designs independently manipulate distinct elements of those treatments, allowing analysts to assess and compare each element's effect on responses or choices. Rarely are our treatments unidimensional, yet it is often difficult to isolate the influence of particular dimensions or components of a treatment. Conjoint analysis offers a promising tool in this regard, and it has therefore seen growing recent use in political science, in part due to the substantive work of co-authors on this paper. This method should not only improve how we study political questions, but should also apply to numerous other fields.

The paper makes several important methodological contributions that go well beyond what has been accomplished elsewhere, for example in the marketing research literature in which conjoint analysis was initially developed. First, it formally analyzes the causal properties of conjoint analysis using the Neyman-Rubin potential outcomes framework. The authors use this framework to define a causal quantity of interest, the average marginal component effect, which is the marginal effect of each attribute averaged over the joint distribution of the other attributes included in the experiment. They define effects for two different kinds of conjoint experimental designs: forced-choice designs, in which respondents must choose between two profiles (e.g., two candidates), and ratings-based designs, in which respondents express a degree of preference for one or more manipulated profiles. Second, they clearly explain how this effect can be identified under assumptions that either must hold by virtue of the experimental design or are partially testable empirically. The paper then provides a series of diagnostic tests to assess these identification assumptions (for example, to test whether there are indeed no profile-order or attribute-order effects). Importantly, in contrast to existing approaches in econometrics or marketing research that assume a particular behavioral model and define and estimate parameters in terms of that model (approaches which have the advantage of statistical efficiency but the disadvantage that the assumed model must be true), in this paper the identification results allow inferences about causal quantities without resorting to strong functional form or other assumptions. Finally, the authors illustrate the technique using two interesting examples, from studies of voting and immigration, and they offer sample R and Stata scripts as well as a graphical user interface for conjoint survey design.
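A schematic rendering of the average marginal component effect may help fix ideas; the notation below is illustrative rather than taken verbatim from the paper, and the symbols (Y_i, T_{i[-l]}, T_j) are labels introduced here for exposition. For an attribute l changed from level t_0 to level t_1, the quantity described above can be sketched as

\bar{\pi}_l(t_1, t_0) \;=\; \mathbb{E}\!\left[\, Y_i\big(t_1,\, T_{i[-l]},\, T_j\big) \;-\; Y_i\big(t_0,\, T_{i[-l]},\, T_j\big) \,\right],

where Y_i(\cdot) denotes respondent i's potential choice or rating, T_{i[-l]} collects the remaining attributes of the profile, T_j denotes the opposing profile(s), and the expectation is taken over the joint distribution of attribute combinations used in the experiment. Under the randomization built into the design, this averaged contrast is what the paper's estimators target without imposing a behavioral model.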

In sum, the article's merit lies in (1) introducing a simple, design-based approach to the analysis of an important experimental design; and (2) carefully discussing the assumptions, advantages, examples, and especially the limitations of the approach. We believe the paper will be widely used and widely cited, and deservedly so. We are happy to award the prize to this excellent piece of work.

Gosnell Prize