S5E09 Regularized Variable Selection Methods


In today’s episode Greg and Patrick talk about regularization, which includes ridge, LASSO, and elastic net procedures for variable selection within the general linear model and beyond. Along the way they also mention Bowdlerizing, The Family Shakespeare, disturbance in the force, McNeish on his bike, Spandex, C’mon guys wait up, the altar of unbiasedness, Curranizing, shooting arrows, stepwise goat rodeo, volume knobs, Hancockizing, always angry, getting slapped, betting a chicken, mission from God, hypothetico-deductive porpoising, and letting go of truth (which you can’t handle anyway).
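For anyone who wants to see the three penalties the episode names in action, here is a minimal sketch (not from the episode) using Python's scikit-learn on simulated data; the data-generating setup and penalty strengths (alpha, l1_ratio) are arbitrary choices for illustration. Ridge shrinks all coefficients but keeps them nonzero, while LASSO and elastic net can shrink some coefficients to exactly zero, which is what makes them variable selection tools.

```python
# Minimal illustrative sketch: ridge, LASSO, and elastic net on simulated data.
# Penalty strengths and the data-generating setup are arbitrary choices.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.normal(size=(n, p))
# Only the first 3 predictors truly matter; the remaining 17 are noise.
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(scale=1.0, size=n)

# Standardize predictors first: the penalties are sensitive to scale.
X = StandardScaler().fit_transform(X)

models = {
    "ridge": Ridge(alpha=1.0),                    # L2 penalty: shrinks, never zeroes out
    "lasso": Lasso(alpha=0.1),                    # L1 penalty: sets some coefficients to exactly 0
    "enet":  ElasticNet(alpha=0.1, l1_ratio=0.5), # mix of L1 and L2 penalties
}
for name, model in models.items():
    model.fit(X, y)
    kept = np.flatnonzero(np.abs(model.coef_) > 1e-8)
    print(f"{name:5s}: {kept.size} nonzero coefficients -> {kept.tolist()}")
```

In this setup, LASSO and elastic net typically retain only the handful of truly nonzero predictors, while ridge keeps all twenty with shrunken coefficients. In practice the penalty strength would be chosen by cross-validation rather than fixed by hand.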

Related Episodes

  • S4E02: Underachievers, Overachievers, & Maximum Likelihood Estimation
  • S4E01: Ordinary Least Squares: Back Where It All Began
  • S2E18: Regression: Like That Old High School Friend You’ve Outgrown
  • S2E11: The Replication…Dilemma with Samantha Anderson

Suggested Readings

Bauer, D. J., Belzak, W. C., & Cole, V. T. (2020). Simplifying the assessment of measurement invariance over multiple background variables: Using regularized moderated nonlinear factor analysis to detect differential item functioning. Structural Equation Modeling: A Multidisciplinary Journal, 27, 43-55.

Belzak, W., & Bauer, D. J. (2020). Improving the assessment of measurement invariance: Using regularization to select anchor items and identify differential item functioning. Psychological Methods, 25, 673-690.

Chen, S. M., Bauer, D. J., Belzak, W. M., & Brandt, H. (2022). Advantages of spike and slab priors for detecting differential item functioning relative to other Bayesian regularizing priors and frequentist lasso. Structural Equation Modeling: A Multidisciplinary Journal, 29, 122-139.

Desboulets, L. D. D. (2018). A review on variable selection in regression analysis. Econometrics, 6, 45.

Efron, B. (2014). Estimation and accuracy after model selection. Journal of the American Statistical Association, 109, 991-1007.

Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. (2004). Least angle regression. The Annals of Statistics, 32, 407-499.

McNeish, D. M. (2015). Using lasso for predictor selection and to assuage overfitting: A method long overlooked in behavioral sciences. Multivariate Behavioral Research, 50, 471-484.

 
