1 min read · from Data Science

I’m really excited to share my latest blog post, where I walk through how to use gradient boosting to fit entire parameter vectors, not just a single target prediction.

I’ve always wanted to explore the idea that boosted trees could fit the full parameter vector of a distribution instead of predicting only a single value per leaf node. Using JAX, I was able to fit a gradient-boosting spline model in which the model learns to predict the spline coefficients that best fit each individual observation. I think this has implications for many of the advanced modeling techniques available to us: survival modeling, causal inference, and probabilistic modeling. I hope this post is helpful for anyone looking to learn more about gradient boosting.
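To make the idea concrete, here is a minimal sketch of my own (not the code from the post) of the core mechanic: each observation gets its own coefficient vector, predictions flow through a fixed basis, JAX's `jax.grad` supplies per-observation gradients of the loss with respect to those vectors, and each boosting round fits a tiny regression stump (standing in for a full tree) to the negative gradient of each coefficient dimension. The feature data, three-term basis, and step-function target are all invented for illustration.

```python
# Sketch: gradient boosting over per-observation parameter vectors, assuming a
# fixed basis and squared-error loss. Stumps stand in for full regression trees.
import jax
import jax.numpy as jnp
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))        # one feature per observation
basis = jnp.array([1.0, 0.5, 0.25])            # fixed 3-term basis (assumed)
true_theta = np.where(X < 0, 1.0, -1.0) * np.ones((n, 3))
y = jnp.asarray(true_theta @ np.asarray(basis))

def loss(theta, y):
    # Squared error of the basis-weighted prediction, summed over observations.
    preds = theta @ basis
    return 0.5 * jnp.sum((preds - y) ** 2)

grad_fn = jax.grad(loss)                       # d loss / d theta, shape (n, 3)

def fit_stump(x, residual):
    """Depth-1 regression tree: one split on x, a constant value per side."""
    best = None
    for s in np.quantile(x, np.linspace(0.1, 0.9, 9)):
        left = x < s
        pred = np.where(left, residual[left].mean(), residual[~left].mean())
        sse = ((residual - pred) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, residual[left].mean(), residual[~left].mean())
    _, s, lv, rv = best
    return lambda xq: np.where(xq < s, lv, rv)

theta = jnp.zeros((n, 3))                      # one coefficient vector per row
lr = 0.5
for _ in range(30):                            # boosting rounds
    g = np.asarray(grad_fn(theta, y))          # per-observation gradients
    # One stump per coefficient dimension, each fit to the negative gradient.
    update = np.column_stack(
        [fit_stump(X[:, 0], -g[:, j])(X[:, 0]) for j in range(3)]
    )
    theta = theta + lr * jnp.asarray(update)

final_mse = float(jnp.mean((theta @ basis - y) ** 2))
print(f"final MSE: {final_mse:.4f}")
```

The point of the sketch is the shape of the loop: because JAX differentiates through whatever sits between the parameter vectors and the loss, the same structure would apply with a real spline basis, a survival likelihood, or a distributional loss in place of the toy squared error.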

submitted by /u/millsGT49


Tagged with

#Gradient Boosting
#Parameter Vectors
#spline model
#boosted trees
#spline coefficients
#Jax
#advanced modeling techniques
#survival modeling
#causal inference
#probabilistic modeling