I cannot reproduce results of quantile regression when using a custom metric or objective. #6062
Comments
Note that with the quantile loss, LightGBM does not fit on the gradients and hessians alone. After each tree is grown, there is an additional step that renews the leaf values directly with the quantile of the residuals.
Thus, although a customized function can calculate the gradients and hessians accordingly, with a customized objective function there is no such renewing step.
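For reference, the gradient/hessian part that a custom objective *can* reproduce looks roughly like this (a sketch for LightGBM's scikit-learn API; the function name and the constant hessian are my assumptions, and the leaf-renewal step described above is deliberately absent):

```python
import numpy as np

def quantile_objective(alpha):
    """Pinball (quantile) loss gradient/hessian for a custom objective.

    Note: this reproduces only the gradient part of the built-in
    'quantile' objective -- the per-leaf renewal step described above
    is NOT performed for custom objectives."""
    def _objective(y_true, y_pred):
        residual = y_true - y_pred
        # dL/dpred: -alpha where under-predicting, (1 - alpha) otherwise
        grad = np.where(residual > 0, -alpha, 1.0 - alpha)
        # pinball loss has zero curvature; a constant hessian is the
        # usual workaround so leaf values reduce to averaged gradients
        hess = np.ones_like(y_pred)
        return grad, hess
    return _objective
```

Passing `quantile_objective(0.9)` as `objective=` to `LGBMRegressor` trains on these gradients, but without the renewal step the results will not match the built-in objective.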
@shiyu1994 Thanks for the information. Is there any way I can implement this extra step when constructing my model in Python?
Would you be open to me making a PR for a new regression objective, something like a quadratic quantile loss function? Such an objective would be very valuable to my company.
I'm happy with that. It would be good to hear opinions from @guolinke @jameslamb @jmoralez |
What target functional Examples for
Another option would be to implement a situation where users could use a custom objective in combination with quantile regression, would that be possible? |
@shiyu1994 Can you explain in more detail what is actually happening in this renewal step? Are the gradients all scaled by a constant depending on the average absolute value of the residuals?

Edit: I actually found a good explanation here: https://jmarkhou.com/lgbqr/. In short, after fitting the tree on the gradients, there is an extra step that calculates the empirical quantile of the residuals of the data points within each leaf. That is, the tree structure is fit using the first-order approximation, but the value each leaf predicts is then computed directly from the quantile of its residuals.
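The renewal step described in that post can be sketched in pure NumPy (`renewed_leaf_values` is a hypothetical helper; the exact residual definition and learning-rate scaling inside LightGBM may differ):

```python
import numpy as np

def renewed_leaf_values(leaf_index, residual, alpha, learning_rate=1.0):
    """For each leaf, compute the alpha-quantile of the residuals of
    the samples routed to that leaf. This is the value the built-in
    quantile objective assigns to the leaf, instead of the usual
    (gradient sum / hessian sum) Newton-step value.

    leaf_index: array mapping each sample to its leaf id in one tree.
    residual:   y_true minus the model's prediction for each sample."""
    values = {}
    for leaf in np.unique(leaf_index):
        in_leaf = leaf_index == leaf
        values[int(leaf)] = learning_rate * np.quantile(residual[in_leaf], alpha)
    return values
```

In Python, the leaf each sample lands in can be obtained with `Booster.predict(X, pred_leaf=True)`, and `Booster.set_leaf_output(tree_id, leaf_id, value)` can overwrite a leaf's value, so a post-hoc renewal along these lines is at least possible in principle; whether it exactly matches the internal step is untested here.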
@robert-robison have you found a way to combine a custom metric with quantile regression?
@HowardRiddiough No. The closest we've been able to come is weighting the gradients by the absolute value of the error. But that amounts to a kind of "squared quantile loss"; it won't give you the same results as the quantile objective. The renewing step cannot be duplicated (as far as I can tell) within a custom objective function. I would love to hear of better solutions if you (or anyone) have them!
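The workaround mentioned above (scaling the pinball gradient by the absolute error) effectively optimizes an asymmetric *squared* loss rather than the true pinball loss. A sketch of what that looks like as a custom objective (the function name and the factor of 2 are my assumptions):

```python
import numpy as np

def squared_quantile_objective(alpha):
    """Asymmetric squared loss: alpha * (y - p)^2 when under-predicting,
    (1 - alpha) * (y - p)^2 when over-predicting. Its gradient equals
    the pinball gradient scaled by 2|residual| -- the 'weight by the
    absolute error' workaround. NOT equivalent to true quantile loss."""
    def _objective(y_true, y_pred):
        residual = y_true - y_pred
        weight = np.where(residual > 0, alpha, 1.0 - alpha)
        grad = -2.0 * weight * residual   # d/dp of weight * residual^2
        hess = 2.0 * weight               # constant curvature per side
        return grad, hess
    return _objective
```

Because this loss is minimized by a weighted mean rather than a quantile, its predictions are systematically different from the built-in quantile objective's.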
Any updates?
Description
I am working on a project where we want to make conservative predictions, i.e. increase the likelihood that the model produces a negative error.
At the moment we are using LightGBM's quantile regression to predict the 90th quantile. I am happy with the results, but I would like to tweak the loss function slightly by introducing a quadratic term that punishes large negative errors disproportionately more than small ones. As mentioned, we are looking to make conservative predictions, but a prediction that is too conservative doesn't deliver any value.
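One way such a loss could be written down (an illustrative sketch only: the exact functional form is not given in the issue, so this particular combination of pinball loss on under-predictions and squared loss on over-predictions is an assumption):

```python
import numpy as np

def conservative_quantile_objective(alpha, beta=1.0):
    """Hypothetical loss: standard pinball term alpha * (y - p) for
    under-predictions (residual > 0), quadratic term
    beta * (1 - alpha) * (y - p)^2 for over-predictions, so that large
    negative errors are punished disproportionately more than small ones."""
    def _objective(y_true, y_pred):
        residual = y_true - y_pred
        grad = np.where(residual > 0,
                        -alpha,                              # pinball side
                        -2.0 * beta * (1.0 - alpha) * residual)  # quadratic side
        # the pinball side has zero curvature; use a constant there
        hess = np.where(residual > 0, 1.0, 2.0 * beta * (1.0 - alpha))
        return grad, hess
    return _objective
```

As the discussion above shows, even with a correct gradient this will not reproduce the built-in quantile objective, because the leaf-renewal step is skipped for custom objectives.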
With that end in mind, I have been trying to reproduce the stock quantile regressor's results; I want to begin by reproducing the stock behaviour so that I have a solid foundation from which to modify the quantile loss function.
When running the examples below you will see that the predictions made by each model do not match. That may be because I haven't constructed the quantile loss function correctly. It may also be that my custom quantile regressor does not calculate the tree output in the same way as the stock quantile regressor. I can see in the regression_objective.hpp file that the RegressionQuantileloss class is doing something with percentiles when calculating the tree output that the standard regression loss may not be doing.

Reproducible example
Environment info
python==3.9.5
numpy==1.23.5
matplotlib==3.7.1
lightgbm==3.3.5
pandas==1.5.3
scikit-learn==1.3.0
I am working on a MacBook with an M1 chip.
Summary
Thanks for taking a look at this issue; I would really appreciate any help. My questions are as follows: