Issues: SciSharp/LLamaSharp

#734  How to better provide system information for LLMs  [question]  opened May 12, 2024 by K1tiK4
#732  Add debug mode of LLamaSharp  [enhancement, good first issue]  opened May 12, 2024 by AsakusaRinne
#731  Add unit test about long context  [good first issue]  opened May 12, 2024 by AsakusaRinne
#727  [BUG]: WSL2 has problem running LLamaSharp with cuda11  [bug]  opened May 10, 2024 by AsakusaRinne
#695  Android Backend  [backend]  opened Apr 25, 2024 by AmSmart
#694  Mamba  opened Apr 25, 2024 by JoaoVictorVP
#693  Namespace should be consistent  opened Apr 24, 2024 by AsakusaRinne
#690  How to rebuild LLamaSharp backends  [question]  opened Apr 24, 2024 by kuan2019
#670  [Proposal] Backend-free support  [discussion, enhancement]  opened Apr 16, 2024 by AsakusaRinne
#668  Debian 12 x LLamaSharp 0.11.2 Crashed Silently  [Upstream]  opened Apr 15, 2024 by kuan2019
#661  IndexOutOfRangeException when calling IKernelMemory.AskAsync()  [bug, good first issue]  opened Apr 11, 2024 by WesselvanGils