
Additional metrics #220

Open
ideusoes opened this issue Dec 21, 2023 · 1 comment

Comments

@ideusoes

As per the AutoTS tutorial, metric_weighting allows the user to choose from a list of available metrics, but there is no option to use a metric that is not already among the dictionary keys.
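For context, here is a minimal sketch of how a metric_weighting dictionary is typically passed to AutoTS. The specific key names and weights below are assumptions drawn from the AutoTS tutorial and should be checked against the keys accepted by your installed version:

```python
# Hypothetical metric_weighting sketch: the weights steer model selection.
# Key names (e.g. 'smape_weighting') are assumed from the AutoTS docs;
# verify them against your installed AutoTS version.
metric_weighting = {
    'smape_weighting': 5,
    'mae_weighting': 2,
    'rmse_weighting': 2,
    'containment_weighting': 0,
    'runtime_weighting': 0.05,
}

# It would then be passed to the AutoTS constructor, e.g.:
# from autots import AutoTS
# model = AutoTS(forecast_length=14, metric_weighting=metric_weighting)
```

The point of the issue is that only metrics with a predefined key in this dictionary can influence model selection.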

I would like to use MASE (Mean Absolute Scaled Error). Is there a plan to allow integration of custom metrics, or perhaps to add metrics requested by users?

Thank you!

@winedarksea
Owner

There is no current option to support custom metrics.
It is on my to-do list, but it is tricky because of how everything is set up.

For your immediate needs:
SMAPE should correlate well with MASE. uwmse is another scaled error that would work. Between those two, you should be able to get model selection that is effectively the same as with MASE.
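The owner's suggestion could be sketched as a weighting that leans on SMAPE (and uwmse, if your AutoTS version exposes it) as a stand-in for MASE. All key names here are assumptions to verify against the metric_weighting keys your installed version accepts:

```python
# Assumed key names; check the metric_weighting options in your AutoTS version.
mase_proxy_weighting = {
    'smape_weighting': 10,  # scale-free, usually ranks models similarly to MASE
    'uwmse_weighting': 5,   # assumed key for the uwmse scaled error mentioned above
    'mae_weighting': 1,
    'runtime_weighting': 0.05,
}
```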

You can access model.initial_results.per_series_mae and scale it fairly easily if you want to view results. For even more custom metrics, you can run model.retrieve_validation_forecasts(models=[list of model ids]) and then calculate whatever you want afterwards.
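Building on that, MASE itself is simple to compute once you have the actuals, the forecasts, and the training series. A self-contained sketch using the standard seasonal-naive scaling; wiring it to the output of retrieve_validation_forecasts (e.g. applying it per series column) is left as an assumption about your data layout:

```python
import numpy as np

def mase(actual, forecast, train, m=1):
    """Mean Absolute Scaled Error: out-of-sample forecast MAE divided by
    the in-sample MAE of a seasonal-naive (lag-m) forecast on the train set."""
    actual, forecast, train = (np.asarray(x, dtype=float)
                               for x in (actual, forecast, train))
    scale = np.mean(np.abs(train[m:] - train[:-m]))
    return np.mean(np.abs(actual - forecast)) / scale

# Example: in-sample naive error is 1.0, forecast MAE is 0.5
print(mase([5, 6], [4, 6], [1, 2, 3, 4]))  # -> 0.5
```

A value below 1 means the model beats a naive lag-m forecast on average, which is the usual way MASE is read.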
