
panic on model loading in edgen_rt_llama_cpp #57

Open
toschoo opened this issue Feb 13, 2024 · 1 comment
toschoo commented Feb 13, 2024

Description

This line, very sporadically, causes a panic.

Solution

I suspect the problem is the lazy implementation of UnloadingModel, but I couldn't prove it yet; the bug is simply too rare. If this is indeed the problem, however, a retry should solve the issue.
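
For illustration, here is a minimal sketch of that retry idea. It assumes the loading step can be expressed as a closure returning a `Result` rather than panicking outright; `load_with_retry`, the retry count, and the backoff duration are hypothetical and not part of the edgen API:

```rust
use std::{thread, time::Duration};

/// Hypothetical helper: retry a fallible model-loading closure a bounded
/// number of times, sleeping briefly between attempts so the lazily
/// initialized model state has a chance to settle.
fn load_with_retry<T, E, F>(mut try_load: F, max_retries: usize) -> Result<T, E>
where
    F: FnMut() -> Result<T, E>,
{
    let mut last_err = None;
    for _ in 0..=max_retries {
        match try_load() {
            Ok(model) => return Ok(model),
            Err(e) => {
                last_err = Some(e);
                thread::sleep(Duration::from_millis(100)); // illustrative backoff
            }
        }
    }
    // The loop body always runs at least once, so `last_err` is set here.
    Err(last_err.expect("at least one load attempt was made"))
}
```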

Remark

The code in Whisper is similar, and whatever solution is found for the LLM should also be applied there.

@toschoo toschoo self-assigned this Feb 13, 2024
@toschoo toschoo added the bug Something isn't working label Feb 13, 2024

toschoo commented Feb 15, 2024

I tried for hours, but I cannot reproduce the bug.
