RL-1: What Would It Take to Change Your Inference? Quantifying the Discourse About Causal Inferences in the Social Sciences
Kenneth Frank, Michigan State University
Statistical inferences are often challenged because of potential bias, whether from uncontrolled confounding variables or from nonrandom selection into a sample. We will turn such concerns into questions about how much bias there must be to invalidate an inference. For example, we will transform challenges such as “But the inference of a treatment effect might not be valid because of preexisting differences between the treatment groups” into questions such as “How much bias must there have been due to uncontrolled preexisting differences to make the inference invalid?” By reframing challenges about bias in terms of specific quantities, this course will contribute to scientific discourse about the uncertainty of causal inferences. In Part I, we use Rubin’s causal model to interpret how much bias there must be to invalidate an inference in terms of replacing observed cases with counterfactual cases or with cases from an unsampled population (e.g., Frank et al., 2013).
In Part II, we quantify the robustness of causal inferences in terms of correlations associated with unobserved variables or with unsampled populations (e.g., Frank, 2000). Calculations will be presented using the app http://konfound-it.com, with links to Stata and R modules. In Part III, we extend the approach to nonlinear models, interaction effects, specific study designs (e.g., regression discontinuity), and thresholds for inferences. The format will be a mixture of presentation, individual exploration, and group work. Participants should be comfortable with the general linear model (e.g., multiple regression) and with statistical inference.
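As a rough illustration of the quantities the course works with, the sketch below computes the percent bias needed to invalidate an inference (the framing of Frank et al., 2013) and a correlation-based impact threshold for a confounding variable (the framing of Frank, 2000). The function names, the fixed critical value of 1.96, and the example numbers are illustrative assumptions, not the course materials; the konfound app and modules mentioned above implement the full calculations.

```python
def percent_bias_to_invalidate(estimate, std_err, t_critical=1.96):
    """Fraction of an estimated effect that would have to be due to bias
    for the estimate to fall below the significance threshold
    (threshold = t_critical * std_err); cf. Frank et al. (2013)."""
    threshold = t_critical * std_err
    return 1 - threshold / estimate


def impact_threshold(r_xy, df, t_critical=1.96):
    """Impact threshold for a confounding variable (cf. Frank, 2000):
    how large the product of a confounder's correlations with the
    predictor and the outcome must be to invalidate the inference,
    given the observed correlation r_xy and degrees of freedom df."""
    # Smallest correlation that would still reach statistical significance.
    r_threshold = t_critical / (t_critical**2 + df) ** 0.5
    return (r_xy - r_threshold) / (1 - abs(r_threshold))


# Illustrative numbers, not taken from the course materials:
print(round(percent_bias_to_invalidate(0.5, 0.1), 3))  # 0.608
print(round(impact_threshold(0.3, 100), 3))
```

Reading the first result: about 61% of the estimated effect would have to be bias before the inference of an effect became invalid, which is the kind of specific quantity the course substitutes for an unqualified challenge about confounding.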