
What is the size of embeddings? #31

Open
arittenbach opened this issue Mar 28, 2024 · 1 comment
Labels
FAQ Frequently asked question

Comments

@arittenbach

I'm interested in using embeddings generated by Chronos to train a downstream anomaly detection model. When I use your sample code to generate embeddings for the example time series you provide, I get 144 embedding vectors, which matches the length of that series. However, my test case is a time series of length 300,000, and when I run it through the model I end up with only 512 embedding vectors. Is there an upper bound on the time series length I should be using with this model, or is this the expected output? Thanks so much!

@abdulfatir
Contributor

@arittenbach The current Chronos models were trained with a context length of up to 512, so when you pass a time series longer than that, only the last 512 steps are actually used. See this line:
https://github.com/amazon-science/chronos-forecasting/blob/main/src/chronos/chronos.py#L140
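In effect, the truncation at that line amounts to the following (a minimal sketch; the series values here are illustrative, not from Chronos):

```python
CONTEXT_LENGTH = 512  # maximum context the current Chronos models were trained with

# A long input series, e.g. length 300,000 as in the question above.
series = list(range(300_000))

# Only the most recent CONTEXT_LENGTH steps are kept before tokenization,
# which is why the embedding output has 512 vectors, one per context step.
context = series[-CONTEXT_LENGTH:]
```

So regardless of how long the input is, the model never sees more than the last 512 steps.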

One potential way of extracting embeddings for a long time series is to feed sliding windows into the model.
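A minimal sketch of that sliding-window idea (the window/stride values and the helper name are illustrative, not part of the Chronos API; each window would then be passed to the model's embedding call):

```python
def sliding_windows(series, window=512, stride=256):
    """Yield overlapping windows of `window` steps so every step is covered.

    `window` matches the model's 512-step context limit; `stride` controls
    overlap. A final window anchored at the end of the series is added when
    the stride does not land exactly on the tail.
    """
    n = len(series)
    if n <= window:
        yield series
        return
    for start in range(0, n - window + 1, stride):
        yield series[start:start + window]
    if (n - window) % stride != 0:
        yield series[n - window:]  # cover the remaining tail steps

# Example: a 300,000-step series split into 512-step windows.
series = list(range(300_000))
windows = list(sliding_windows(series))
```

The per-window embeddings can then be aggregated (e.g. averaged over the overlap) depending on what the downstream anomaly detector expects.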

@lostella lostella added the FAQ Frequently asked question label Mar 31, 2024
@lostella lostella changed the title embeddings question What is the size of embeddings? Mar 31, 2024