
Error on run: 404 returned (运行报错, 提示404) #1385

Closed
he-aook opened this issue Apr 26, 2024 · 31 comments

Labels
question Further information is requested severity:low Minor issues, code cleanup, etc

Comments

@he-aook

he-aook commented Apr 26, 2024

Describe your question

Error on run: 404 returned.

  File "/root/.cache/pypoetry/virtualenvs/opendevin-QzKVoApH-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1012, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

(The same traceback repeats several times in the log.)

Additional context

@he-aook he-aook added the question Further information is requested label Apr 26, 2024
@he-aook
Author

he-aook commented Apr 26, 2024

Is anyone around who can help?

@he-aook
Author

he-aook commented Apr 26, 2024

LLM_MODEL="gpt-35-turbo"
LLM_API_KEY="XXXXXXXXXXXXXXXXXXXXXXXX"
LLM_EMBEDDING_MODEL="azureopenai"
LLM_BASE_URL="https://XXXXXX.openai.azure.com/"
LLM_EMBEDDING_DEPLOYMENT_NAME="gpt35-11XXXXXX-XXXX"
LLM_API_VERSION="1106"
WORKSPACE_BASE="./workspace"
SANDBOX_TYPE="exec"

@he-aook
Author

he-aook commented Apr 26, 2024

INFO: connection open
INFO: 205.198.64.107:0 - "GET /api/configurations HTTP/1.1" 404 Not Found

@ayanjiushishuai
Just solved it. See this:
#1187 (comment)

@enyst
Collaborator

enyst commented Apr 27, 2024

The embedding deployment name doesn't look right (an embedding deployment shouldn't be a GPT model). This is in addition, @ayanjiushishuai, to the other error, the one you experienced and solved. The API version also seems like it should contain more than a single number, though I'm not sure.

LLM_MODEL="gpt-35-turbo"
LLM_API_KEY="XXXXXXXXXXXXXXXXXXXXXXXX"
LLM_EMBEDDING_MODEL="azureopenai"
LLM_BASE_URL="https://XXXXXX.openai.azure.com/"
LLM_EMBEDDING_DEPLOYMENT_NAME="gpt35-11XXXXXX-XXXX"
LLM_API_VERSION="1106"
WORKSPACE_BASE="./workspace"
SANDBOX_TYPE="exec"

@he-aook
Please verify and try:

LLM_API_VERSION=<the full version as you see it in Azure for the model, maybe 1106, maybe gpt3.5-1106, or gpt-35-1106-preview...>
WORKSPACE_BASE=<absolute path>

For embeddings, this is your choice. You can choose to try local:

LLM_EMBEDDING_MODEL='local'
LLM_EMBEDDING_DEPLOYMENT_NAME=<doesn't matter>

Or, alternative:

LLM_EMBEDDING_MODEL='azureopenai'
LLM_EMBEDDING_DEPLOYMENT_NAME=<the name defined in your Azure account for "deployment name" corresponding to the "text-embedding-ada-002" model>

For the chat model:
please use the UI to set the model, as the comment linked above says.

The model is the deployment name defined in your Azure account that corresponds to GPT-3.5-turbo. It might have the same name, so try GPT-3.5-turbo.

@ayanjiushishuai

@enyst You are right.
I have noticed that but didn't mention it.
For me, my LLM_API_VERSION="2024-01-25-preview".
@he-aook You're using Azure for everything, right? See the docs: https://github.com/OpenDevin/OpenDevin/blob/main/docs/guides/AzureLLMs.md

@he-aook
Author

he-aook commented Apr 28, 2024

@ayanjiushishuai OK, I'll give it a try. #1385 (comment)

@he-aook
Author

he-aook commented Apr 28, 2024

@ayanjiushishuai I still can't start a conversation. I'm using Microsoft's models, and both GPT-3.5 and 4.0 are reported as unsupported.

@ayanjiushishuai
What's the error? Can you run LiteLLM locally? Have you updated the code to the latest version? Did you select your configured model in the UI?

@he-aook
Author

he-aook commented Apr 28, 2024

#1385 (comment)
@ayanjiushishuai Let me take a closer look.

@he-aook
Author

he-aook commented Apr 28, 2024

@ayanjiushishuai I selected my own model and am running the latest version, but it still doesn't work.
#1385 (comment)
[screenshot]

@ayanjiushishuai
LLM_MODEL=azure/<your model deployment name> — what you wrote there looks like the embedding model's name.
LLM_EMBEDDING_DEPLOYMENT_NAME should be your embedding model's deployment name; what you wrote there may be your GPT-4 deployment name.
The rest looks fine for now.
Did you test whether LiteLLM runs locally with my example? OpenDevin calls the API through LiteLLM under the hood; if LiteLLM won't run locally, it certainly won't work inside OpenDevin either.

@he-aook
Author

he-aook commented Apr 28, 2024

@ayanjiushishuai
#1385 (comment)
[screenshot]

How do I run LiteLLM locally?

@ayanjiushishuai
That's an import problem. Is litellm missing from your venv/conda environment?
Run pip install litellm to check whether it's installed.

@he-aook
Author

he-aook commented Apr 28, 2024

@ayanjiushishuai I have it installed.
[screenshot]
(base) [root@iZrj9f205caqao6ghsyor0Z OpenDevin-2]# pip3 install litellm
Looking in indexes: http://mirrors.cloud.aliyuncs.com/pypi/simple/
Requirement already satisfied: litellm in /usr/local/python312/lib/python3.12/site-packages (1.35.31)
Requirement already satisfied: aiohttp in /usr/local/python312/lib/python3.12/site-packages (from litellm) (3.9.5)
Requirement already satisfied: click in /usr/local/python312/lib/python3.12/site-packages (from litellm) (8.1.7)
Requirement already satisfied: importlib-metadata>=6.8.0 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (7.1.0)
Requirement already satisfied: jinja2<4.0.0,>=3.1.2 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (3.1.3)
Requirement already satisfied: openai>=1.0.0 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (1.23.6)
Requirement already satisfied: python-dotenv>=0.2.0 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (1.0.1)
Requirement already satisfied: requests<3.0.0,>=2.31.0 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (2.31.0)
Requirement already satisfied: tiktoken>=0.4.0 in /usr/local/python312/lib/python3.12/site-packages (from litellm) (0.6.0)
Requirement already satisfied: tokenizers in /usr/local/python312/lib/python3.12/site-packages (from litellm) (0.19.1)
Requirement already satisfied: zipp>=0.5 in /usr/local/python312/lib/python3.12/site-packages (from importlib-metadata>=6.8.0->litellm) (3.18.1)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/python312/lib/python3.12/site-packages (from jinja2<4.0.0,>=3.1.2->litellm) (2.1.5)
Requirement already satisfied: anyio<5,>=3.5.0 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (4.3.0)
Requirement already satisfied: distro<2,>=1.7.0 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (1.9.0)
Requirement already satisfied: httpx<1,>=0.23.0 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (0.27.0)
Requirement already satisfied: pydantic<3,>=1.9.0 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (2.7.1)
Requirement already satisfied: sniffio in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (1.3.1)
Requirement already satisfied: tqdm>4 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (4.66.2)
Requirement already satisfied: typing-extensions<5,>=4.7 in /usr/local/python312/lib/python3.12/site-packages (from openai>=1.0.0->litellm) (4.11.0)
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/python312/lib/python3.12/site-packages (from requests<3.0.0,>=2.31.0->litellm) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/python312/lib/python3.12/site-packages (from requests<3.0.0,>=2.31.0->litellm) (3.6)
Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/python312/lib/python3.12/site-packages (from requests<3.0.0,>=2.31.0->litellm) (2.2.1)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/python312/lib/python3.12/site-packages (from requests<3.0.0,>=2.31.0->litellm) (2024.2.2)
Requirement already satisfied: regex>=2022.1.18 in /usr/local/python312/lib/python3.12/site-packages (from tiktoken>=0.4.0->litellm) (2024.4.16)
Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/python312/lib/python3.12/site-packages (from aiohttp->litellm) (1.3.1)
Requirement already satisfied: attrs>=17.3.0 in /usr/local/python312/lib/python3.12/site-packages (from aiohttp->litellm) (23.2.0)
Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/python312/lib/python3.12/site-packages (from aiohttp->litellm) (1.4.1)
Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/python312/lib/python3.12/site-packages (from aiohttp->litellm) (6.0.5)
Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/python312/lib/python3.12/site-packages (from aiohttp->litellm) (1.9.4)
Requirement already satisfied: huggingface-hub<1.0,>=0.16.4 in /usr/local/python312/lib/python3.12/site-packages (from tokenizers->litellm) (0.22.2)
Requirement already satisfied: httpcore==1.* in /usr/local/python312/lib/python3.12/site-packages (from httpx<1,>=0.23.0->openai>=1.0.0->litellm) (1.0.5)
Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/python312/lib/python3.12/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->openai>=1.0.0->litellm) (0.14.0)
Requirement already satisfied: filelock in /usr/local/python312/lib/python3.12/site-packages (from huggingface-hub<1.0,>=0.16.4->tokenizers->litellm) (3.13.4)
Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/python312/lib/python3.12/site-packages (from huggingface-hub<1.0,>=0.16.4->tokenizers->litellm) (2024.3.1)
Requirement already satisfied: packaging>=20.9 in /usr/local/python312/lib/python3.12/site-packages (from huggingface-hub<1.0,>=0.16.4->tokenizers->litellm) (24.0)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/python312/lib/python3.12/site-packages (from huggingface-hub<1.0,>=0.16.4->tokenizers->litellm) (6.0.1)
Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/python312/lib/python3.12/site-packages (from pydantic<3,>=1.9.0->openai>=1.0.0->litellm) (0.6.0)
Requirement already satisfied: pydantic-core==2.18.2 in /usr/local/python312/lib/python3.12/site-packages (from pydantic<3,>=1.9.0->openai>=1.0.0->litellm) (2.18.2)
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

@zhonggegege
> (quoting the pip install output above)

If it still fails with the configuration above, the simplest fix is to rebuild the conda environment. I've tried many times, and the easiest approach is to use docker run directly; the key is in how the startup parameters are passed, where there seems to be some confusion I don't fully understand (I haven't dug deeper into the errors there). You can use this as a reference:

docker run \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -e LLM_MODEL="openai/lm-studio" \
    -e SANDBOX_TYPE=exec \
    -e LLM_BASE_URL="http://192.168.0.93:1234/v1" \
    -e LLM_MODEL="openai/bartowski/CodeQwen1.5-7B-GGUF" \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal=host-gateway \
    ghcr.io/opendevin/opendevin:0.4.0

Key points: the -e LLM_MODEL flags (note LLM_MODEL is passed twice above), and most importantly, selecting the same model as LLM_MODEL in the web UI — that's where mistakes are easy. I'm running a local LLM, so adjust accordingly. Another option is development mode with make build.

@ayanjiushishuai
@he-aook Sorry, I didn't notice earlier. Your problem actually isn't a missing install: a local file of yours has the same name as the library, so Python imported your own file and couldn't find completion. Rename the file, run the script again, and confirm your configuration is correct.
If it's correct and the frontend starts normally, just pick your model name in the UI.
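The shadowing failure described above can be demonstrated with the standard library alone. This is a hedged sketch: a throwaway file named json.py stands in for a local litellm.py, and the stdlib json module stands in for the real litellm package.

```python
import os
import sys
import tempfile

# Demonstrate how a local file named after a library shadows the installed
# package on import (the cause of the "cannot import name 'completion'" error).
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "json.py"), "w") as f:
        f.write("note = 'local shadow, not the real module'\n")
    sys.path.insert(0, d)
    sys.modules.pop("json", None)   # drop any cached import
    import json                     # resolves to the local file first
    shadowed = not hasattr(json, "dumps")  # the real json module has dumps
    sys.path.pop(0)
    sys.modules.pop("json", None)   # restore: the next import gets stdlib json

print("shadowed:", shadowed)  # shadowed: True
```

Renaming the local file (or removing its directory from sys.path) makes the real package win the import again, which is exactly the fix suggested above.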

@he-aook
Author

he-aook commented Apr 29, 2024

@ayanjiushishuai
#1385 (comment)
[screenshot]
Should I continue running this file? I don't quite understand what exactly to do here.

@ayanjiushishuai
@he-aook This error suggests a problem with your parameter configuration. Check whether the values in your Azure account match what you configured.
Looking at your earlier screenshot, some parameters may be missing.
#1385 (comment)

import os
from litellm import completion

## set ENV variables
os.environ["AZURE_API_KEY"] = ""
os.environ["AZURE_API_BASE"] = ""       ## missing in your config
os.environ["AZURE_API_VERSION"] = ""    ## missing in your config

# azure call
response = completion(
    model = "azure/<your_deployment_name>", 
    messages = [{ "content": "Hello, how are you?","role": "user"}]
)

If this example runs locally, fill the same values into OpenDevin's parameters and it should work.

@he-aook
Author

he-aook commented Apr 29, 2024 via email

@he-aook
Author

he-aook commented Apr 29, 2024

@ayanjiushishuai
(base) [root@iZrj9f205caqao6ghsyor0Z python]# cat a.py
import os
from litellm import completion

# set ENV variables
os.environ["OPENAI_API_KEY"] = "XXXXXXXXXXXXXX"
os.environ["AZURE_API_BASE"] = "https://ai-XXXXX-dev.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "0125-Preview"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# azure call
response = completion(
    model = "azure/gpt4-1106-test",
    messages = [{ "content": "Hello, how are you?","role": "user"}]
)
print(response)
(base) [root@iZrj9f205caqao6ghsyor0Z python]#

Running it with this configuration also errors out (executed with Python 3.11).

@ayanjiushishuai
os.environ["AZURE_API_VERSION"] = "0125-Preview"
The format of this value is wrong; it should look like 2024-01-25-preview.
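The expected shape of that value can be sanity-checked up front. This is a hedged sketch: it assumes Azure OpenAI api-version strings are a YYYY-MM-DD date, optionally suffixed with -preview (e.g. "2024-01-25-preview"); the regex is our assumption, not an official validator.

```python
import re

# Rough shape check for an Azure OpenAI api-version string:
# a full YYYY-MM-DD date, optionally followed by "-preview".
API_VERSION = re.compile(r"^\d{4}-\d{2}-\d{2}(-preview)?$", re.IGNORECASE)

def looks_like_api_version(value: str) -> bool:
    return API_VERSION.match(value) is not None

print(looks_like_api_version("2024-01-25-preview"))  # True
print(looks_like_api_version("0125-Preview"))        # False: not a full date
print(looks_like_api_version("1106"))                # False: not a date at all
```

A check like this would have flagged both "1106" and "0125-Preview" before any request was sent.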

@he-aook
Author

he-aook commented Apr 30, 2024 via email

@rbren rbren added the severity:low Minor issues, code cleanup, etc label May 2, 2024
@SmartManoj
Contributor

https://litellm.vercel.app/docs/providers/azure
set AZURE_API_KEY instead of OPENAI_API_KEY

@he-aook
Author

he-aook commented May 6, 2024 via email

@he-aook
Author

he-aook commented May 6, 2024

@SmartManoj No matter how I configure and debug it, it still errors:
File "/root/.cache/pypoetry/virtualenvs/opendevin-QzKVoApH-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1012, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

ERROR:root:<class 'KeyError'>: "Please set 'LLM_API_KEY' in config.toml or .env."

config.toml
LLM_MODEL="azure/gpt4-test-ncus-0125"
#LLM_API_KEY="aaad"
AZURE_API_KEY="aaa"
LLM_EMBEDDING_MODEL="azureopenai"
LLM_BASE_URL="https://ai-test-dev.openai.azure.com/"
LLM_EMBEDDING_DEPLOYMENT_NAME="gpt-4"
LLM_API_VERSION="2024-01-25-Preview"
WORKSPACE_BASE="./workspace"
SANDBOX_TYPE="exec"

(base) [root@iZrj9f205caqao6ghsyor0Z python]# cat a.py
from litellm import completion
import os

# set ENV variables
os.environ["AZURE_API_KEY"] = "PPPPPPP"
os.environ["AZURE_API_BASE"] = "https://ai-test-dev.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-01-25-Preview"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# azure call
response = completion(
    model = "azure/gpt4-1106-test",
    messages = [{ "content": "Hello, how are you?","role": "user"}]
)
print(response)

@SmartManoj
Contributor

SmartManoj commented May 6, 2024

ERROR:root:<class 'KeyError'>: "Please set 'LLM_API_KEY' in config.toml or .env."

Please set LLM_API_KEY instead of AZURE_API_KEY, because LLM_API_KEY is a required variable.
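The KeyError in the log above can be mimicked with a small sketch. This is hypothetical: require_key and its message mirror the error text, but they are illustrative, not OpenDevin's actual config loader. The point is that the app reads LLM_API_KEY specifically, so setting AZURE_API_KEY alone never satisfies it.

```python
import os

# Hypothetical sketch of a required-variable check like the one behind
# the "Please set 'LLM_API_KEY'" error above.
def require_key(name: str) -> str:
    value = os.environ.get(name)
    if not value:
        raise KeyError(f"Please set '{name}' in config.toml or .env.")
    return value

os.environ["AZURE_API_KEY"] = "aaa"       # present, but not what is read
os.environ.pop("LLM_API_KEY", None)

try:
    require_key("LLM_API_KEY")
except KeyError as err:
    print(err.args[0])  # Please set 'LLM_API_KEY' in config.toml or .env.

os.environ["LLM_API_KEY"] = "example-key"  # placeholder value
print(require_key("LLM_API_KEY"))          # example-key
```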

@he-aook
Author

he-aook commented May 6, 2024

@SmartManoj
File "/root/miniforge3/lib/python3.11/site-packages/litellm/utils.py", line 8760, in exception_type
raise APIError(
litellm.exceptions.APIError: AzureException - Missing credentials. Please pass one of api_key, azure_ad_token, azure_ad_token_provider, or the AZURE_OPENAI_API_KEY or AZURE_OPENAI_AD_TOKEN environment variables.
(base) [root@iZrj9f205caqao6ghsyor0Z python]# cat a.py
from litellm import completion
import os

# set ENV variables
os.environ["LLM_API_KEY"] = "LLLLLL"
os.environ["AZURE_API_BASE"] = "https://ai-test-dev.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-01-25-Preview"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# azure call
response = completion(
    model = "azure/gpt4-1106-test",
    messages = [{ "content": "Hello, how are you?","role": "user"}]
)
print(response)

Still not right.

@SmartManoj
Contributor

SmartManoj commented May 6, 2024

Please add full traceback.

api_key is already passed. Could you also set AZURE_OPENAI_API_KEY to the same value as LLM_API_KEY?

@he-aook
Author

he-aook commented May 8, 2024

@SmartManoj I'm giving up; I really can't figure it out.

@SmartManoj
Contributor

#LLM_API_KEY="aaad"

Please uncomment this and run.

@he-aook he-aook closed this as completed May 8, 2024
6 participants