
How does llama cpp backend work? #10803

Open
chsasank opened this issue Apr 19, 2024 · 3 comments

Comments

@chsasank

I am curious about the latest update where IPEX is apparently working as a backend for llama.cpp. This is a very unusual architecture, and I'm interested to know how it works! Can you point me to the right source files?

@jason-dai
Contributor

@chsasank To clarify, IPEX is not working as a backend for llama.cpp. Many performance-critical operations in IPEX-LLM are implemented in C++ (or SYCL); by exposing a C++ interface to them, we can use these operations in IPEX-LLM as a backend for llama.cpp.
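In other words, the heavy kernels live in a native library and are exposed behind a plain C interface that llama.cpp's C/C++ code can call into. Below is a minimal sketch of that pattern, with a trivial vector-add standing in for a real kernel; the function name `ipex_llm_vec_add` and everything else here are illustrative assumptions, not the actual bigdl-core-cpp API:

```cpp
// Hypothetical sketch: a SYCL kernel wrapped in a C-linkage entry point,
// so a C/C++ caller (e.g. a llama.cpp backend) can dispatch to it without
// seeing any SYCL types. Compile with a SYCL 2020 compiler such as DPC++.

#include <sycl/sycl.hpp>

extern "C" {

// Plain-C entry point: no SYCL types cross the library boundary.
void ipex_llm_vec_add(const float* a, const float* b, float* out, size_t n) {
    // One queue on the default device (e.g. an Intel GPU), reused across calls.
    static sycl::queue q{sycl::default_selector_v};

    // Buffers wrap the host pointers; `buf_out` writes back on destruction.
    sycl::buffer<float, 1> buf_a(a, sycl::range<1>(n));
    sycl::buffer<float, 1> buf_b(b, sycl::range<1>(n));
    sycl::buffer<float, 1> buf_out(out, sycl::range<1>(n));

    q.submit([&](sycl::handler& h) {
        sycl::accessor acc_a(buf_a, h, sycl::read_only);
        sycl::accessor acc_b(buf_b, h, sycl::read_only);
        sycl::accessor acc_out(buf_out, h, sycl::write_only, sycl::no_init);
        // Element-wise kernel executed on the device.
        h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
            acc_out[i] = acc_a[i] + acc_b[i];
        });
    });
    // Buffer destructors synchronize results back to `out` before returning.
}

}  // extern "C"
```

A caller only needs the C declaration of `ipex_llm_vec_add` and can link against the library like any other native dependency, which is what makes this pattern usable from llama.cpp's C-based codebase.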

@chsasank
Author

Can I see the code for this? Where is it implemented in the repo?

@chsasank
Author

It looks like bigdl-core-cpp contains the C++ bindings. Where is the source for this package?
