Okay LASSO is meh, but what makes Ridge good?
What made me fall in love with Ridge
Welcome back, my friends. Today we will talk about Ridge and LASSO again.
Last time, I talked about why I'm not a huge fan of LASSO, and it turned out to be probably the most divisive article I have here; apparently LASSO has a huge fanbase out there (that article is free, btw).
That article mostly talked about the shortcomings of LASSO; I didn't really talk about what makes Ridge good. Of course, I have many articles covering different flavours of Ridge and when it might make sense to use each one of them. But in this article, I will talk about one of my favourite things about Ridge.
This article will go through a lot of things, and will touch on these particular topics among many others:
🧯 If the idea of dropping variables is too extreme, given that in finance we model things that have weak signals, should we just jam everything into Ridge Regression and let it regularise, or should we still try to remove things that we believe have no predictive power? (See the sketch after this list.)
💡We know that in some regression techniques, we are better off dropping some variables even if they are important (check out the below article), but does this hold when we are using Ridge? Or can Ridge just handle it and regularise it properly?
🕶️ We learn in school that Ridge introduces bias in exchange for lower variance, but in the real world Ridge seems to reduce both bias and variance. Why and how?
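To make the first question a bit more concrete before we dive in, here is a minimal, hypothetical sketch (not taken from the earlier article): it simulates a low signal-to-noise setting where every predictor carries a small amount of signal, then compares plain OLS against Ridge with everything jammed in. The data-generating process, the alpha value, and the use of scikit-learn are all my own assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical setup: many weak predictors and a noisy target,
# loosely mimicking the low signal-to-noise regimes common in finance.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
true_beta = rng.normal(0, 0.1, size=p)      # every coefficient is small ("weak signal")
y = X @ true_beta + rng.standard_normal(n)  # noise dominates the signal

# Jam everything in, once without regularisation and once with Ridge.
for name, model in [("OLS", LinearRegression()), ("Ridge", Ridge(alpha=10.0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(f"{name}: CV MSE = {-scores.mean():.3f}")
```

In setups like this, the regularised fit tends to cross-validate better than OLS even though no variable is truly irrelevant, which is the kind of behaviour the questions above are getting at.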