Loading Models Locally #100
Hi.
So if you already have the model locally, you can put it into the appropriate folder for that implementation and it will work.
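The advice above can be sketched as a small helper that prefers an already-downloaded model file over fetching one. This is a minimal illustration, not the project's actual code; the function name, the `.pt` extension, and the directory layout are assumptions:

```python
import os

def resolve_model_path(model_name, model_dir):
    """Return the path to a locally stored model file, or None.

    Hypothetical helper: if the user has already placed e.g.
    models/Whisper/medium.pt in the implementation's folder,
    use it instead of downloading.
    """
    local_path = os.path.join(model_dir, model_name + ".pt")
    if os.path.isfile(local_path):
        return local_path
    return None  # caller falls back to downloading
```

A caller would check the return value and only trigger a download when it is `None`.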
Hi, thanks for getting back to me. I made the recommended changes and still receive the same error. In the terminal, it seems to keep using faster-whisper even after disabling it like you suggested.
Same here; it would be great if we had an option to load local models.
Hi, sorry for the late reply. Usage:
Or if you start the Web UI with
And the Web UI will use the custom path for the models. This is fixed in #154, so I'm closing the issue. If you encounter any problems with these updates, please re-open and let me know!
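A startup option like the one described could be wired roughly as below. This is a sketch only; the flag name `--whisper_model_dir` and its default are assumptions, not necessarily the project's actual CLI:

```python
import argparse
import os

def parse_args(argv=None):
    # Hypothetical CLI: let the user point the Web UI at an
    # existing models directory instead of the default one.
    parser = argparse.ArgumentParser(description="Whisper Web UI")
    parser.add_argument(
        "--whisper_model_dir",
        type=str,
        default=os.path.join("models", "Whisper"),
        help="Directory containing already-downloaded Whisper models",
    )
    return parser.parse_args(argv)
```

The parsed directory would then be passed to whichever loader the UI uses, so no download is attempted when the files are already present.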
Wow, thanks!
Hello, I set this up on a closed network, but I'm having issues with the models. I had the medium model downloaded on my system and placed it in the models directory, but the UI gives an error when I try to generate a subtitle file; I believe this is because it's trying to download the model from the internet. I want to use my already-downloaded models instead of having Whisper download one itself, which can't work for me since I'm on a closed network and can't send requests to the internet. Where should I make changes so that it uses my local models instead of trying to connect to the internet and download one? Any tips? Thanks.
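On a closed network, one approach is to fail fast with a clear message rather than let the backend attempt a download. A minimal sketch, with a hypothetical function name and an assumed `.pt` file layout:

```python
import os

def require_local_model(model_name, model_dir):
    # Hypothetical guard for closed networks: raise a clear error
    # instead of letting the backend try to download the model.
    path = os.path.join(model_dir, model_name + ".pt")
    if not os.path.isfile(path):
        raise FileNotFoundError(
            "Model '%s' not found at %s; download it on a connected "
            "machine and copy it into the models directory." % (model_name, path)
        )
    return path
```

Calling this before the UI's generate step would surface the missing file immediately instead of a network error.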