Inference after model selection

This line of work addresses the mathematical and computational challenges of conducting inference after model selection, where the selection procedure induces complicated underlying geometry. It enables significance testing, for example, after using some of the most popular model selection procedures, such as the lasso with regularization chosen by cross-validation, or forward stepwise with the number of steps chosen by AIC or BIC. The resulting significance tests are often slightly modified versions of the classical tests in regression analysis.

Author

Joshua Loftus

Published

February 17, 2022

Post-selection inference

Software

I’m a co-author of the selectiveInference R package on CRAN.
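The basic phenomenon, and the conditional fix, can be sketched in a short simulation. This is an illustrative toy, not the package's general method: it uses Python rather than R, an orthogonal design, known noise variance, and a single step of forward stepwise selection (pick the variable most correlated with the response). Under the global null, the naive classical p-value for the selected variable is anti-conservative, while a p-value computed from the normal distribution truncated at the selection boundary (here, the second-largest absolute correlation) remains valid:

```python
import numpy as np
from math import erfc, sqrt

def norm_sf(x):
    # Standard normal survival function P(Z > x)
    return 0.5 * erfc(x / sqrt(2.0))

rng = np.random.default_rng(0)
n, p, reps = 100, 10, 5000
naive, selective = [], []

for _ in range(reps):
    # Orthonormal design columns (a simplifying assumption for this toy)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    # Global null: y is pure noise with known sigma = 1
    y = rng.standard_normal(n)
    z = X.T @ y                 # z_j ~ N(0, 1), independent across j
    j = np.argmax(np.abs(z))    # one forward stepwise step: pick the max
    zmax = np.abs(z[j])
    t = np.partition(np.abs(z), -2)[-2]  # second largest |z_j|: truncation point
    # Naive two-sided p-value, ignoring selection
    naive.append(2 * norm_sf(zmax))
    # Selective p-value: |z_j| conditioned on exceeding the truncation point t
    selective.append(norm_sf(zmax) / norm_sf(t))

print("naive rejection rate at 5%:    ", np.mean(np.array(naive) < 0.05))
print("selective rejection rate at 5%:", np.mean(np.array(selective) < 0.05))
```

The naive test rejects far more often than its nominal 5% level, because the maximum of several null statistics is large by construction; the truncated-normal p-value is approximately uniform under the null, so its rejection rate stays near 5%. The published methods below handle the much harder general case (non-orthogonal designs, multiple steps, cross-validation, unknown variance).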

Publications

  • X. Tian, J. R. Loftus, and J. E. Taylor. Selective inference with unknown variance via the square-root LASSO. Biometrika, 2018. [link]

  • J. E. Taylor, J. R. Loftus, and R. J. Tibshirani. Inference in adaptive regression via the Kac-Rice formula. Annals of Statistics, 2016. [link]

  • J. R. Loftus. Selective inference after cross-validation. Preprint.

  • J. R. Loftus and J. E. Taylor. Selective inference in regression models with groups of variables. Preprint.

  • J. R. Loftus and J. E. Taylor. A significance test for forward stepwise model selection. Preprint.