
When will AMD GPUs be supported? #274

Open
youcanyouupsb opened this issue Mar 8, 2024 · 3 comments

Comments

@youcanyouupsb

How can AMD hardware be supported?

@ZUIcat

ZUIcat commented Mar 9, 2024

Same question here. Alternatively, it would be great if chatglm.cpp were merged into llama.cpp; I'd really like to run this on AMD.

@li-plus
Owner

li-plus commented Mar 12, 2024

I don't have an AMD GPU on hand, so there's no way for me to test it. When I have time, I'll gradually migrate this project over to llama.cpp.

@youcanyouupsb
Author

llama.cpp does support running on AMD; I've run llama.cpp on AMD successfully. Looking forward to chatglm.cpp supporting it too!
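For context, llama.cpp's AMD path goes through ROCm (hipBLAS). A minimal build sketch, assuming a Linux machine with ROCm installed and a supported AMD GPU; the `LLAMA_HIPBLAS` Makefile flag and the model path below are illustrative of how this is typically done around the time of this thread, not an exact recipe:

```shell
# Sketch: build llama.cpp with AMD GPU acceleration via ROCm/hipBLAS.
# Assumes ROCm is already installed (e.g. under /opt/rocm).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make LLAMA_HIPBLAS=1

# Run inference, offloading layers to the AMD GPU with -ngl
# (model path here is a placeholder).
./main -m models/model.gguf -p "Hello" -ngl 99
```

Since chatglm.cpp shares the ggml backend with llama.cpp, a migration as the owner describes would presumably make this same ROCm path available to chatglm.cpp models.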
