
support for running through docker #823

Open
qxprakash opened this issue May 7, 2024 · 2 comments

Comments

@qxprakash

Due to the requirement of a specific CUDA version and other libraries, it would be nice to add support for running faster-whisper through a Docker container, as it would save new users all the hassle of setting up the environment.

@DeoLeung

DeoLeung commented May 8, 2024

You can use the base image pytorch/pytorch:2.1.0-cuda12.1-cudnn8-runtime for a CUDA version suitable for you,

then simply install faster-whisper and onnxruntime-gpu and you will be good to go.
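The two steps above can be sketched as a minimal Dockerfile. This is an illustrative example, not an official image: the base-image tag comes from the comment above (adjust it to match your CUDA version), and `transcribe.py` is a hypothetical placeholder for your own application script.

```dockerfile
# Base image suggested above: PyTorch runtime with CUDA 12.1 and cuDNN 8.
# Swap the tag for a different CUDA version if your driver requires it.
FROM pytorch/pytorch:2.1.0-cuda12.1-cudnn8-runtime

# Install faster-whisper and onnxruntime-gpu for GPU inference.
RUN pip install --no-cache-dir faster-whisper onnxruntime-gpu

WORKDIR /app
# Hypothetical application script; replace with your own entry point.
COPY transcribe.py .

ENTRYPOINT ["python", "transcribe.py"]
```

To use the GPU inside the container, run it with the NVIDIA runtime enabled, e.g. `docker run --gpus all <image>` (requires the NVIDIA Container Toolkit on the host).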

@trungkienbkhn
Collaborator

@qxprakash, hello. You can refer to the Dockerfile in this comment.
