Thanks for the suggestion. While we consider it, a couple of options to account for different sample importances are:

- Use a different loss function, e.g. MSE if holidays typically show higher target values: MSE penalizes errors on those higher values more heavily than MAE does.
- This is a bit hacky, but you can scale the target by the sample weights yourself and scale the predictions back with the correct weights depending on whether a holiday or a non-holiday is being predicted. The forecasting task then becomes forecasting the importance-weighted time series rather than the actual one.
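The scaling trick above can be sketched as follows. This is a minimal illustration, not the library's API: the `weight` column, the toy values, and the fitting step are all assumptions.

```python
import pandas as pd

# Toy series with a per-timestamp importance weight (e.g. 2.0 on holidays).
df = pd.DataFrame({
    "ds": pd.date_range("2024-01-01", periods=6, freq="D"),
    "y": [10.0, 12.0, 11.0, 30.0, 13.0, 12.0],
    "weight": [1.0, 1.0, 1.0, 2.0, 1.0, 1.0],  # holiday on 2024-01-04
})

# Train on the importance-weighted target instead of the raw target.
df["y_scaled"] = df["y"] * df["weight"]

# ... fit any forecasting model on df["y_scaled"] here ...

# When predicting, divide by the weight of the predicted timestamp
# (2.0 if it is a holiday, 1.0 otherwise) to recover the original scale.
predicted_scaled = 26.0  # hypothetical model output for a holiday timestamp
predicted = predicted_scaled / 2.0
```

Note the caveat from the reply applies: the model's errors are now measured on the scaled series, so holiday errors are implicitly penalized in proportion to the weight.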
Description
To my understanding, horizon weights are currently implemented by passing a fixed horizon_weight list when initializing the loss function.
However, in my case the loss during holidays matters more than the distance between the predicted timestamp and the current timestamp.
I added a column to the input DataFrame representing the weight of each timestamp and tried to modify the base window class to extract the weights for the different time windows. Is there an easier way to add such weights?
Thanks

Use case
Set a different horizon weight for each window by simply adding a weight column to the input DataFrame.
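In effect, the request is for a loss weighted per sample rather than per horizon step. A minimal sketch of such a weighted MAE in plain NumPy, where the weights come straight from a DataFrame column (the `weight` column name and the `weighted_mae` function are illustrative, not part of any library):

```python
import numpy as np
import pandas as pd

def weighted_mae(y_true, y_pred, weights):
    """MAE where each timestamp contributes in proportion to its weight."""
    w = np.asarray(weights, dtype=float)
    err = np.abs(np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float))
    return float(np.sum(w * err) / np.sum(w))

# Weights come straight from a column in the input DataFrame.
df = pd.DataFrame({
    "y": [10.0, 30.0, 12.0],
    "weight": [1.0, 2.0, 1.0],  # the middle timestamp is a holiday
})
y_pred = np.array([11.0, 27.0, 12.0])

# The holiday error (|30 - 27| = 3) is penalized twice as much
# as the same error on an ordinary timestamp would be.
loss = weighted_mae(df["y"], y_pred, df["weight"])
```

A base window class that carries this column alongside the target could then pass the per-window weight slice into the loss, which is roughly what the modification described above attempts.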