
Python performance is too poor; can a C++ inference library with an OpenAI-compatible API be provided? #73

Open
geffzhang opened this issue Feb 12, 2024 · 3 comments

Comments

@geffzhang

We are using bigdl-llm in a production environment, and Python performance is too poor. Can you provide an inference library in C++ and expose an OpenAI-compatible API?
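
For context, an "OpenAI-compatible API" here means serving the model behind the same `/v1/chat/completions` request/response schema that OpenAI's service exposes, so existing OpenAI clients can point at it unchanged. Below is a minimal, hypothetical sketch of such an endpoint in Python, assuming FastAPI and bigdl-llm's Hugging Face-style `AutoModelForCausalLM` loader; the model path, prompt formatting, and generation parameters are illustrative assumptions, not an official server shipped by the project.

```python
# Hypothetical sketch only: a minimal OpenAI-style /v1/chat/completions endpoint
# wrapping bigdl-llm's Hugging Face-compatible loader. Not an official server.
import time
import uuid
from typing import Dict, List

from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoTokenizer
from bigdl.llm.transformers import AutoModelForCausalLM  # bigdl-llm low-bit loader

MODEL_PATH = "meta-llama/Llama-2-7b-chat-hf"  # assumption: any HF-format chat model

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
# load_in_4bit=True requests bigdl-llm's INT4 low-bit optimization
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, load_in_4bit=True)

app = FastAPI()


class ChatRequest(BaseModel):
    model: str
    messages: List[Dict[str, str]]
    max_tokens: int = 256


@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest):
    # Naive prompt construction; a real deployment should use the model's chat template.
    prompt = "\n".join(f"{m['role']}: {m['content']}" for m in req.messages) + "\nassistant:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=req.max_tokens)
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    text = tokenizer.decode(new_tokens, skip_special_tokens=True)
    # Response shaped like OpenAI's chat completion object
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    }
```

Such a wrapper (run with, e.g., `uvicorn server:app`) covers the API shape but not streaming, batching, or error handling, which is where purpose-built C++ inference servers tend to shine.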

@jason-dai
Collaborator


geffzhang commented Feb 14, 2024

This is written in Python; can C++ be added as well? Python's performance is not satisfactory.


jason-dai commented Feb 16, 2024

> This is written in Python; can C++ be added as well? Python's performance is not satisfactory.

Unfortunately, there is no such plan at the moment.
