
error: wheels for chatglm.cpp on windows #287

Open
srdevore opened this issue Apr 10, 2024 · 1 comment

@srdevore
pip install chatglm.cpp fails with a wheel build error despite multiple troubleshooting attempts. Details:

Context: I want to use Chinese LLMs in xinference.

Windows machine (local); 100+ GB free space
Python 3.9.12
Installed CMake 3.29.1
Visual Studio 2022, including the "Desktop development with C++" workload (build tools included)
These installs fixed a similar issue with llama-cpp-python, so the configuration doesn't seem to be the problem.

ERROR: Could not build wheels for chatglm.cpp, which is required to install pyproject.toml-based projects
The full output is attached, but the root cause seems to be here:
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
-- The CXX compiler identification is MSVC 19.39.33523.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - failed
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual
chatglm_output.txt

Any help is greatly appreciated!
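The "Detecting CXX compiler ABI info - failed" line in the log above typically means CMake found MSVC but could not run it successfully, which often happens when pip is invoked from a shell that lacks the Visual Studio build environment. A hedged sketch of one common way to retry (assuming a standard VS 2022 install; run from the "x64 Native Tools Command Prompt for VS 2022" so cl.exe is on PATH — this is a general troubleshooting step, not a confirmed fix for this issue):

```shell
REM Run inside "x64 Native Tools Command Prompt for VS 2022"
REM so cl.exe and the MSVC environment are available to CMake.

REM CMAKE_GENERATOR is honored by CMake >= 3.15; pin it explicitly.
set CMAKE_GENERATOR=Visual Studio 17 2022

REM Rebuild without cached artifacts, with verbose output for diagnosis.
pip install --no-cache-dir --verbose chatglm.cpp
```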


BradKML commented May 6, 2024

Apparently the package published on PyPI has not been updated (and it requires a direct download from GitHub)? xorbitsai/inference#1393
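If building from the GitHub source is the workaround, a minimal sketch would look like the following (assuming the upstream repository is li-plus/chatglm.cpp and that it vendors ggml as a git submodule, so a recursive clone is needed — verify the repository URL before using):

```shell
# Clone with submodules so the vendored ggml sources are present.
git clone --recursive https://github.com/li-plus/chatglm.cpp.git
cd chatglm.cpp

# Build and install the Python bindings from the local checkout.
pip install .
```

Equivalently, `pip install "git+https://github.com/li-plus/chatglm.cpp.git"` builds straight from the repository, bypassing the PyPI sdist entirely.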

2 participants