[LightGBM] Regression: how to penalize negative predictions #918
Comments
The simplest way might be to duplicate positive examples in your training set so that they naturally carry more influence during training.
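A minimal sketch of that duplication trick in plain NumPy (the data and the duplication factor `k` are made up for illustration), alongside the usually equivalent and cheaper sample-weight approach:

```python
import numpy as np

# Toy data: three samples; suppose the last one should count three times.
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([10.0, 20.0, 30.0])
k = 3  # illustrative duplication factor

# Trick 1: physically duplicate the important sample so it appears k times.
X_dup = np.vstack([X, np.repeat(X[-1:], k - 1, axis=0)])
y_dup = np.concatenate([y, np.repeat(y[-1:], k - 1)])

# Trick 2: keep the data as-is and pass integer weights instead, e.g.
# model.fit(X, y, sample_weight=w) with LightGBM's sklearn API.
w = np.array([1.0, 1.0, float(k)])
```

For most loss functions, an integer sample weight of `k` contributes the same gradient as `k` physical copies, without inflating the dataset.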
"define (many example for default LightGBM model) and pass a custom regression objective"
I should update this thread: custom objectives have been added recently, so this should be possible now.
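As a hedged sketch of what such a custom objective could look like (the `PENALTY` factor and the function name are my own, not from LightGBM): an asymmetric squared error whose gradient and Hessian are scaled up whenever the prediction is negative. The maths is plain NumPy so it can be checked on its own.

```python
import numpy as np

PENALTY = 10.0  # illustrative: how much harder to punish negative predictions

def asymmetric_l2(y_true, y_pred):
    """Custom objective: squared error, scaled by PENALTY when y_pred < 0.

    Returns (gradient, hessian) of the loss w.r.t. y_pred, which is the
    pair a LightGBM custom objective is expected to return.
    """
    residual = y_pred - y_true
    weight = np.where(y_pred < 0, PENALTY, 1.0)
    grad = 2.0 * weight * residual
    hess = 2.0 * weight
    return grad, hess

# Sanity check: a negative prediction gets a much steeper gradient.
g, h = asymmetric_l2(np.array([1.0, 1.0]), np.array([2.0, -2.0]))
# g[0] = 2*(2-1) = 2.0 ; g[1] = 2*10*(-2-1) = -60.0
```

In recent LightGBM versions a callable like this can be passed to the sklearn API, e.g. `LGBMRegressor(objective=asymmetric_l2)`, but do verify the expected callable signature against the LightGBM version you use.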
Hi @imatiach-msft, could you let me know if I can work on this?
@DefUs3r sure, what do you want to work on specifically? Are you thinking of adding an example of how to penalize negative predictions using a custom loss function?
Hi @brunocous, everything on the link you provided is almost fine, but it could be more attractive with some additional UI features, colour schemes, and images. Also, in your Python learning code, you could give readers an idea of where to start from the basics.
@Mihir-Khandelwal are you commenting on the right thread? I think you might have meant to comment elsewhere?
Can I be assigned this @imatiach-msft?
@pragyasrivastava0805 sure, I just assigned the task to you.
@imatiach-msft I am very new to open source. Could you please walk me through what exactly needs to be done?
@imatiach-msft Sorry, I am new to this; as part of a Microsoft recruiting task, I was assigned to comment a code correction on a GitHub repository with 100+ stars.
I want to ask just for clarity: here, positive and negative samples refer to overestimation and underestimation, correct?
Correct. In the end, I went with just using the quantile regression objective to penalise overestimations more. That worked well enough for my use case. A second trick that helped a lot was assigning samples that are more prone to overestimation a higher sample weight for training. Not perfect solutions, but they do the trick.
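To illustrate why the quantile objective achieves that asymmetry, here is a sketch of the pinball loss in plain NumPy (the `alpha` value is illustrative): with `alpha < 0.5`, overestimates are charged at a rate of `1 - alpha` while underestimates cost only `alpha`, so the fitted model is pushed to under-predict.

```python
import numpy as np

def pinball_loss(y_true, y_pred, alpha):
    """Quantile (pinball) loss, the loss behind LightGBM's 'quantile' objective.

    alpha < 0.5 makes overestimates (y_pred > y_true) cost more than
    underestimates of the same magnitude.
    """
    diff = y_true - y_pred
    return np.where(diff >= 0, alpha * diff, (alpha - 1) * diff)

alpha = 0.3  # illustrative: penalise overestimation more
under = pinball_loss(np.array([10.0]), np.array([8.0]), alpha)   # under by 2
over = pinball_loss(np.array([10.0]), np.array([12.0]), alpha)   # over by 2
# under[0] = 0.3 * 2 = 0.6 ; over[0] = 0.7 * 2 = 1.4
```

In LightGBM itself this corresponds to the built-in `objective='quantile'` together with the `alpha` parameter, so no custom code is needed; the sample-weight trick mentioned above can be combined with it via `sample_weight` in `fit()`.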
Hi,
Thanks for responding; that resonates with me as well. Also, while I was looking at the problem, I optimised the objective function a bit for better results: since at the 50th percentile the quantile loss reduces to MAE, I changed it slightly. Please have a look and let me know what you think (I have submitted the pull request with that function).
Hoping it helps and awaiting your reply.
Thankfully,
Navya
Please assign me this @brunocous @imatiach-msft
Hi, I have created a pipeline using Azure ML for a regression model; see the attached Penalizing the Negative Prediction in a Regression Problem.docx.
I have a simple regression task (using a LightGBMRegressor) where I want to penalize negative predictions more than positive ones. Is there a way to achieve this with the default LightGBM regression objectives (see https://lightgbm.readthedocs.io/en/latest/Parameters.html)? If not, is it somehow possible to define and pass a custom regression objective (an example for the default LightGBM model would be appreciated)?
Thanks in advance!