How to use different LLM's and host everything locally. #271
Unanswered
shawnZhang-3 asked this question in Q&A
Replies: 1 comment
-
I also have this issue. Were you able to find the solution?
-
I tried to host everything locally, following the instructions at
https://github.com/arc53/DocsGPT/wiki/How-to-use-different-LLM's#hosting-everything-locally, but it doesn't work.
1. First, I ran the following. Per the wiki: "Use Manifest to host the API locally: just run it in a separate shell. You will need 80 GB of extra space on your disk."

```shell
python3 -m manifest.api.app \
    --model_type huggingface \
    --model_name_or_path bigscience/T0pp \
    --device 0
```
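Before wiring DocsGPT to it, it can help to confirm the Manifest server is actually reachable. A minimal sketch using only the standard library; the address `localhost:5000` is an assumption, so use whatever host and port the server printed on startup:

```python
import urllib.request
import urllib.error

def server_reachable(url, timeout=2):
    """Return True if an HTTP server answers at `url`, else False."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # nothing listening / connection refused / timeout

# Assumed address of the Manifest API started above.
print(server_reachable("http://localhost:5000/"))
```

If this prints `False`, the later steps cannot work regardless of the `.env` contents.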
3. Set the .env file:

```
LLM_NAME=manifest
API_KEY=http://xxx.xxx.xxx.xxx:5000/
EMBEDDINGS_NAME=huggingface_sentence-transformers/all-mpnet-base-v2
EMBEDDINGS_KEY=http://xxx.xxx.xxx.xxx:5000/embed
VITE_API_STREAMING=false
CELERY_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis://localhost:6379/1
MONGO_URI=mongodb://localhost:27017/docsgpt
HUGGINGFACEHUB_API_TOKEN=http://xxx.xxx.xxx.xxx:5000
```
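A quick way to catch typos in step 3 is to parse the `.env` file and check for the keys being configured. A minimal sketch, assuming plain `KEY=VALUE` lines; the required-key set here is taken from the settings above, not from DocsGPT's source:

```python
# Keys the configuration above sets for the local-LLM path (an assumption,
# not an authoritative list from DocsGPT's settings code).
required = {"LLM_NAME", "API_KEY", "EMBEDDINGS_NAME", "EMBEDDINGS_KEY"}

def parse_env(text):
    """Parse simple KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Stand-in for open(".env").read() so the sketch is self-contained.
sample = (
    "LLM_NAME=manifest\n"
    "API_KEY=http://localhost:5000/\n"
    "EMBEDDINGS_NAME=huggingface_sentence-transformers/all-mpnet-base-v2\n"
    "EMBEDDINGS_KEY=http://localhost:5000/embed\n"
)
missing = required - parse_env(sample).keys()
print("missing:", sorted(missing))  # → missing: []
```

Any key listed as missing would explain DocsGPT falling back to a default (OpenAI-based) backend.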
4. Then I ran `docker-compose build && docker-compose up`, and it worked well.
5. When I uploaded a file and pressed "Train", I got:

```
Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)
```

It seems like something OpenAI-related is still being used. I don't know what the problem is or what to do next.
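That error message is LangChain's pydantic validation for its OpenAI wrappers, raised when an OpenAI-backed class is constructed without a key. Whether DocsGPT's training path still constructs one despite `LLM_NAME=manifest` would need checking in the source. As a hypothetical stopgap, the construction-time check appears to verify only that the variable exists, so a dummy value lets it pass; nothing is sent to OpenAI unless an OpenAI code path is actually invoked:

```python
import os

# Hypothetical workaround: satisfy the "Did not find openai_api_key"
# presence check with a placeholder. This only helps if the OpenAI client
# is constructed but never actually called for completions or embeddings.
os.environ.setdefault("OPENAI_API_KEY", "dummy-key-local-only")
print(os.environ["OPENAI_API_KEY"])
```

For Docker Compose, the equivalent would be adding `OPENAI_API_KEY=dummy-key-local-only` to the `.env` file so the containers inherit it.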