
MoonDream2 .gguf support #61

Open
Ratinod opened this issue Mar 31, 2024 · 1 comment

Comments


Ratinod commented Mar 31, 2024

Can you add support for the .gguf version of MoonDream2? Currently the .gguf files give an error with the "LLava Clip Loader" node.

moondream2-mmproj-f16.gguf
moondream2-text-model-f16.gguf

The specialized MoonDream2 node gives me a CUDA error (is 8 GB of VRAM not enough?).

Ratinod changed the title from "MoonDream2 gguf support" to "MoonDream2 .gguf support" on Mar 31, 2024
gokayfem (Owner) commented May 29, 2024

They added different classes to llama.cpp for nanollava and moondream; I will add support for them.
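For reference, the classes being referred to are the per-model multimodal chat handlers in llama-cpp-python (e.g. `MoondreamChatHandler` alongside the generic `Llava15ChatHandler`). A minimal sketch of wiring up the two .gguf files from this issue might look like the following; the file locations, context size, and `n_gpu_layers` value are assumptions, not code from this repo, and lowering `n_gpu_layers` is one common workaround for the 8 GB VRAM CUDA error mentioned above.

```python
# Sketch: load the MoonDream2 text model + mmproj pair via llama-cpp-python.
# Assumes the two .gguf files named in this issue sit in the working directory.

MODELS = {
    "text": "moondream2-text-model-f16.gguf",
    "mmproj": "moondream2-mmproj-f16.gguf",
}

def load_moondream2(text_model_path: str, mmproj_path: str, n_gpu_layers: int = -1):
    """Build a Llama instance wired to MoonDream2's vision projector.

    Imports are deferred so the helper can be defined without
    llama-cpp-python installed.
    """
    from llama_cpp import Llama
    from llama_cpp.llama_chat_format import MoondreamChatHandler

    # The handler loads the CLIP/mmproj side; the Llama object loads the text side.
    handler = MoondreamChatHandler(clip_model_path=mmproj_path)
    return Llama(
        model_path=text_model_path,
        chat_handler=handler,
        n_ctx=2048,                 # assumed context size; adjust as needed
        n_gpu_layers=n_gpu_layers,  # reduce (or set 0) if 8 GB VRAM is exceeded
    )
```

Usage would then be `llm = load_moondream2(MODELS["text"], MODELS["mmproj"])` followed by `llm.create_chat_completion(...)` with an image in the message content.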
