Random Forest, XGBoost, beyond just importing sklearn
I was never really familiar with tree models.
should I embark on a journey of studying tree models (Random Forest/XGB/LightGBM wtv) *only* for 30 days…… can’t say I’m excited about it tho. but I think I’m not excited bcs I don’t even know enough about it….
— quantymacro (@quantymacro) May 14, 2024
So I thought, okay, why not learn about them? After some reading, I realised that tree models are quite powerful and cool, and there are many interesting tricks to enhance their power.
back from vacation my friends. been reading about Random Forest, and damn there are *so many* interesting tricks to get the most out of RF. and all of it require hacking the underlying model. friends don’t let friends import sklearn
— quantymacro (@quantymacro) June 3, 2024
Articles
Ridge Regression x Random Forest

15k special. all who RT & leave email in the form gets access

the article is long. I'm on a journey to study tree models deeply-ish, & I wouldn't do my audience a disservice by robbing them from the opportunity to tag along

ref @OrthogonalAlpha pic.twitter.com/Jy7TePz1Hc

— quantymacro (@quantymacro) June 14, 2024
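The article itself is behind a form, so here is only a hedged guess at the flavour of trick the title hints at, combining ridge regression with a tree: grow a deliberately deep tree with no `max_depth`/`min_samples` tuning, then shrink its leaf values with a ridge penalty on leaf-indicator dummies. This is my own illustrative sketch, not the article's code, and the helper `leaf_dummies` is a name I made up.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeRegressor

# Toy data: 200 training rows, 100 held out.
X, y = make_regression(n_samples=300, n_features=5, noise=20.0, random_state=1)
X_tr, X_te, y_tr, y_te = X[:200], X[200:], y[:200], y[200:]

# Grow a deliberately deep tree: no max_depth / min_samples tuning at all.
deep = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)

# One-hot encode each sample's leaf membership. Every leaf of a fitted tree
# contains at least one training sample, so `leaves` covers all of them.
leaves = np.unique(deep.apply(X_tr))

def leaf_dummies(X):
    return (deep.apply(X)[:, None] == leaves[None, :]).astype(float)

# Shrink the leaf values with ridge instead of pruning: the penalty pulls
# each leaf's prediction toward the intercept, acting as a fast, continuous
# regulariser over tree complexity.
ridge = Ridge(alpha=10.0).fit(leaf_dummies(X_tr), y_tr)

mse_tree = np.mean((deep.predict(X_te) - y_te) ** 2)
mse_ridge = np.mean((ridge.predict(leaf_dummies(X_te)) - y_te) ** 2)
print(f"deep tree MSE: {mse_tree:.1f}, ridge-shrunk MSE: {mse_ridge:.1f}")
```

One appeal of this framing is that `alpha` is a single continuous knob tuned on a fixed design matrix, which is much cheaper than regrowing trees over a grid of depth and leaf-size settings.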

But tree models are still notoriously hard to understand, so one way to get comfortable with them is to represent them as Ordinary Least Squares.
> be me
> only comfortable w linear regression
> learning about trees (RF/XGB/LightGBM) but still feel uncomfortable with it
> learned how to represent any tree model with Ordinary Least Squares
> now I’m more comfortable w trees

— quantymacro (@quantymacro) July 17, 2024

3rd post for the month. dont get used to it tho

It’s confirmed and *guaranteed*: Ridge Regression >> Tree Model
+Python code
🌳Decision Trees as OLS
🔥Blazingly Fast Regularisation - Why I don’t tune max_depth/min_samples for trees
ref:@0xfdf
giving access to some who RT pic.twitter.com/mBw44an0Bc

— quantymacro (@quantymacro) July 23, 2024
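The trees-as-OLS claim can be verified in a few lines, here a minimal sketch of my own (not the article's code): a regression tree predicts the training mean of each leaf, and OLS on one-hot leaf-indicator columns fits exactly those group means, so the two models produce identical predictions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

# A fitted regression tree is piecewise-constant: it predicts the mean of y
# within each leaf. OLS on disjoint one-hot leaf indicators (no intercept)
# also fits per-group means, so it reproduces the tree exactly.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Map each sample to its leaf, then build the indicator (design) matrix.
leaf_ids = tree.apply(X)                                   # shape (n_samples,)
leaves = np.unique(leaf_ids)
D = (leaf_ids[:, None] == leaves[None, :]).astype(float)   # one column per leaf

# OLS: one coefficient per leaf, equal to that leaf's mean of y.
beta, *_ = np.linalg.lstsq(D, y, rcond=None)

ols_pred = D @ beta
tree_pred = tree.predict(X)
print(np.allclose(ols_pred, tree_pred))  # → True
```

Once a tree sits in this design-matrix form, the whole linear-regression toolbox (ridge penalties, coefficient standard errors, and so on) applies to its leaf values.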
