Towards a Unified Framework for Robustness and Sensitivity

Thursday, 10 July 2025: 00:00
Location: FSE024 (Faculty of Education Sciences (FSE))
Oral Presentation
Pablo GERALDO BASTIAS, Nuffield College, University of Oxford, United Kingdom
Credible empirical research requires conducting “robustness checks” and/or “sensitivity analyses” as a way to assess how much the conclusions of a study depend on critical assumptions. However, current practice is highly heterogeneous, and there is no agreement on how to conduct and report such analyses in a transparent and coherent way. In this article, I aim to clarify the relationship between different forms of robustness (to model specification, variable inclusion, and observation inclusion), sensitivity (to unobserved confounding and sample selection, among others), and, more generally, uncertainty quantification approaches. I first show the shortcomings of common practice, including the lack of replicability, inconsistent application of tests, and obscure, uninterpretable reporting. Of particular importance are changes in the quantity of interest (the “estimand”) across analyses, which make comparisons between alternative conclusions impossible. I also discuss other forms of assumption-dependency and uncertainty quantification, such as multiverse analysis and computational multi-model analysis, which do not map neatly onto the traditional distinction. However, as they become more computationally tractable, they should be incorporated into a coherent framework. Finally, I discuss the issues with the stepwise approach of testing one assumption at a time, showing ways of assessing sensitivity to multiple assumptions simultaneously. I exemplify this approach by discussing the sensitivity of robustness checks and the robustness of sensitivity analyses.
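As a minimal illustration of the estimand problem the abstract raises (this sketch is mine, not from the presentation, and the simulated data and specifications are hypothetical), compare the treatment coefficient across two regression specifications: when a confounder is omitted versus adjusted for, the two coefficients identify different quantities, so treating them as a "robustness check" of the same estimand is misleading.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Simulated data: a confounder z affects both treatment d and outcome y.
# The true causal effect of d on y is 1.0.
z = rng.normal(size=n)
d = 0.8 * z + rng.normal(size=n)
y = 1.0 * d + 1.5 * z + rng.normal(size=n)

def ols_coef(y, X):
    """OLS coefficients of y on X, with an intercept prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# A tiny "multiverse" of two specifications for the same data.
short = ols_coef(y, d.reshape(-1, 1))[1]         # omits the confounder z
long = ols_coef(y, np.column_stack([d, z]))[1]   # adjusts for z

print(f"short regression: {short:.2f}")  # biased upward by omitted z
print(f"long regression:  {long:.2f}")   # near the true effect of 1.0
```

The two coefficients disagree not because one model is a "failed robustness check" of the other, but because they target different estimands; a coherent framework has to hold the estimand fixed before comparing specifications.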