
LLM in javascript directly? #116

Open · bc opened this issue Feb 23, 2024 · 2 comments
Labels: enhancement (New feature or request)

Comments


bc commented Feb 23, 2024

Thanks for your awesome work here. I have a more architectural question: since deployment is a challenge for non-technical users, is it possible to wrap a llama model directly into the Chrome extension's JS code? Or are there limitations on Chrome extension memory/storage/CPU that would make that difficult or impossible? Thanks!

bc added the enhancement (New feature or request) label on Feb 23, 2024

Thank you for your contribution. We will check and reply to you as soon as possible.

shreyaskarnik (Owner) commented

@bc Thank you for checking out the project. There are a couple of limitations to bundling a llama model into the Chrome extension, memory/storage being the primary one. I also wanted to keep the extension light and able to work with the various models that Ollama supports. I did experiment with embedding models using https://github.com/xenova/transformers.js but ran into GPU/CPU/memory issues, so I relied on the Ollama API instead. Hope the explanation helps.
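
For concreteness, here are two minimal sketches of the approaches mentioned above. Neither is from the extension's actual code; the model names and prompts are illustrative. First, the in-browser route via transformers.js, which downloads and runs model weights inside the page itself, and is where the memory/CPU pressure comes from:

```js
// Minimal sketch: in-browser embeddings with @xenova/transformers.
// The model name is illustrative; weights are fetched and executed locally,
// which is what strains Chrome extension memory/CPU limits.
import { pipeline } from '@xenova/transformers';

const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
const embedding = await extractor('Hello world', { pooling: 'mean', normalize: true });
console.log(embedding.data); // Float32Array of embedding values
```

And the route the extension took instead: calling out to a locally running Ollama server, which keeps the model weights out of the extension entirely. This sketch assumes Ollama is listening on its default port (11434) and that the named model has already been pulled:

```js
// Minimal sketch: generate a completion via the local Ollama REST API.
// Assumes Ollama is running at http://localhost:11434 with "llama2" pulled.
async function generate(prompt) {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama2',  // illustrative; any model Ollama serves will work
      prompt,
      stream: false,    // return one JSON object rather than a token stream
    }),
  });
  const data = await res.json();
  return data.response; // the generated text
}

generate('Summarize this page in one sentence.').then(console.log);
```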
