Replies: 2 comments 2 replies
-
GPU is used by only a few models: TensorFlow (via the sklearn-style regression models written by me), pytorch-forecasting (uses PyTorch), and GluonTS (uses MXNet) are the only ones that come to mind. Installation therefore depends on each of those upstream packages: once TensorFlow, MXNet, and PyTorch are set up to use the GPU in your local environment, AutoTS will utilize the GPU automatically. Go to the source documentation for each and it will guide you through GPU setup. Let me know if you run into issues. I've found MXNet for GluonTS very well optimized for my Ryzen 5950X, so I wouldn't bother with the GPU version, as it isn't any faster on my RTX 3060 12 GB. You might find what I wrote here helpful: https://syllepsis.live/2022/06/19/pytorch-tensorflow-and-mxnet-on-gpu-in-the-same-environment-and-gpu-vs-cpu-performance/ The TL;DR is that because these neural networks are relatively small (compared to image-processing and text monsters; we don't have that scale of data here), and my CPU is a good one, GPUs are often more trouble and cost to set up than they are worth.
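Since GPU use here hinges entirely on whether each upstream framework can see your GPU, a quick way to confirm your local setup is to query all three directly. This is a minimal sketch using the standard availability checks (`torch.cuda.is_available`, `tf.config.list_physical_devices`, `mxnet.context.num_gpus`); any framework that isn't installed is reported as such rather than raising:

```python
def gpu_report():
    """Return {framework: True/False if installed, None if not installed}."""
    report = {}
    try:
        import torch
        report["pytorch"] = torch.cuda.is_available()
    except ImportError:
        report["pytorch"] = None
    try:
        import tensorflow as tf
        report["tensorflow"] = len(tf.config.list_physical_devices("GPU")) > 0
    except ImportError:
        report["tensorflow"] = None
    try:
        import mxnet
        report["mxnet"] = mxnet.context.num_gpus() > 0
    except ImportError:
        report["mxnet"] = None
    return report

if __name__ == "__main__":
    for name, ok in gpu_report().items():
        status = "not installed" if ok is None else ("GPU available" if ok else "CPU only")
        print(f"{name}: {status}")
```

If a framework shows "CPU only" here, AutoTS will still run its models on CPU; it just won't get the GPU speedup.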
-
Thanks for your answer. It's awesome that you're supporting this package to this level. I'm currently running this on a laptop with an i7-10870H (laptop CPU) and a laptop RTX 3080. I'm thinking my next step is to try to get the Intel optimization packages installed and working. Do you still maintain the `pip install autots[additional]` extra? When I originally wrote my code on Colab, I was able to install everything easily with that command and most models worked excellently; however, when trying to set up the environment locally, I got an error using that command, something about a string error. I ended up following your installation guides but am still getting some errors on models; I'll have to investigate these further. The last question I have is: after running a few CSV files with model lists and parameters, is there a way I can combine them into my own "best model list" and only run those? I know you have a "run only one model" option in the documentation; is the method similar?
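To sketch what I mean by combining results: the merging step itself is just CSV manipulation, and the feed-back step I'm imagining uses AutoTS's template mechanism (`export_template` writes best-model templates, `import_template(..., method="only")` restricts a run to those models, per the docs). The AutoTS calls are shown only as comments since I haven't verified this locally:

```python
import pandas as pd

def combine_templates(paths, out_path="best_models.csv"):
    """Concatenate several exported template CSVs and drop duplicate rows."""
    frames = [pd.read_csv(p) for p in paths]
    combined = pd.concat(frames, ignore_index=True).drop_duplicates()
    combined.to_csv(out_path, index=False)
    return combined

# Hypothetical follow-up, restricting a new search to only those models:
# from autots import AutoTS
# model = AutoTS(forecast_length=14)
# model = model.import_template("best_models.csv", method="only")
```

Is something along these lines the intended workflow?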
-
Is there a way to get GPU support working on Windows systems? I noticed the documentation says it is only usable on Linux.