
[Auto] Auto* models can not be used with any config as a subset of default_config #929

Open
yarnabrina opened this issue Mar 14, 2024 · 0 comments


Description

A few models (e.g. TFT) take required positional arguments (e.g. input_size). If a user passes a config to AutoTFT that does not include this key, training fails with the following as the root cause:

TypeError: TFT.__init__() missing 1 required positional argument: 'input_size'

Here's a reproducible example:

import pandas
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoTFT
from ray import tune

from sktime.datasets import load_longley
from sktime.split import temporal_train_test_split

y, X = load_longley()
y_train, y_test, X_train, X_test = temporal_train_test_split(y, X, test_size=4)

# the config below intentionally omits TFT's required `input_size` argument
algorithm = AutoTFT(
    h=4,
    config={"max_steps": tune.choice([5, 10, 15]), "random_seed": tune.choice([0])},
    backend="ray",
)
model = NeuralForecast([algorithm], "A-DEC")

# assemble the long-format dataframe expected by NeuralForecast (unique_id, ds, y, exogenous columns)
train_data = {
    "unique_id": 1,
    "ds": y_train.index.to_timestamp(freq="A-DEC").to_numpy(),
    "y": y_train.to_numpy(),
}
for column in X.columns:
    train_data[column] = X_train[column].to_numpy()

train_dataset = pandas.DataFrame(data=train_data)

model.fit(df=train_dataset)
# RuntimeError: No best trial found for the given metric: loss. This means that no trial has reported this metric, or all values reported for this metric are NaN. To not ignore NaN values, you can set the `filter_nan_and_inf` arg to False.
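
For reference, a possible workaround (a sketch only, not a verified fix) is to include TFT's required constructor arguments, here input_size, in the config; a single-element tune.choice effectively pins the value while keeping the dictionary in the format the tuner expects:

# Sketch of a workaround, not a verified fix: adding the required `input_size`
# key means TFT.__init__ receives all of its positional arguments.
algorithm = AutoTFT(
    h=4,
    config={
        "input_size": tune.choice([4]),  # required positional argument of TFT
        "max_steps": tune.choice([5, 10, 15]),
        "random_seed": tune.choice([0]),
    },
    backend="ray",
)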

It would be helpful if the documentation of the Auto* models listed the mandatory keys that must be present in config.

If it were possible to let users pass a few arguments to the underlying model directly, without tuning, either to __init__ (where tuning is not necessary) or to fit or predict (for example, configuration for PyTorch Lightning's trainer), that would be very useful. A sketch of what this could look like follows.
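
To illustrate the request: the keywords init_kwargs and trainer_kwargs below are invented for this sketch and do not exist in neuralforecast today; they only show the kind of pass-through interface being asked for.

# Hypothetical interface only: `init_kwargs` and `trainer_kwargs` are invented
# names used to illustrate the request, not existing neuralforecast parameters.
algorithm = AutoTFT(
    h=4,
    config={"max_steps": tune.choice([5, 10, 15])},
    backend="ray",
    init_kwargs={"input_size": 4},  # fixed value passed to TFT.__init__, not tuned
    trainer_kwargs={"enable_progress_bar": False},  # forwarded to PyTorch Lightning's Trainer
)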

Link

No response
