GPT-4o: model_not_found #604

Closed
tanders opened this issue May 16, 2024 · 5 comments
Labels
question Further information is requested

Comments


tanders commented May 16, 2024

Issue

Since switching to Aider v0.35.0, I cannot use the new default model. Regardless of what I input as a prompt, aider always crashes and I get a long stack trace that includes the following:
litellm.llms.openai.OpenAIError: Error code: 403 - {'error': {'message': 'Project proj_uFBGSkr9FAbFof6HIsCIszAP does not have access to model gpt-4o', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

What does this mean?

For completeness: other models work fine, as before, but perhaps GPT-4o is not quite ready to be the default yet?

Below is the full stack trace.

[Some arbitrary prompt]

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Traceback (most recent call last):
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/llms/openai.py", line 427, in completion
    raise e
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/llms/openai.py", line 345, in completion
    return self.streaming(
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/llms/openai.py", line 528, in streaming
    response = openai_client.chat.completions.create(**data, timeout=timeout)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 590, in create
    return self._post(
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: Error code: 403 - {'error': {'message': 'Project proj_uFBGSkr9FAbFof6HIsCIszAP does not have access to model gpt-4o', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/main.py", line 1055, in completion
    raise e
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/main.py", line 1028, in completion
    response = openai_chat_completions.completion(
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/llms/openai.py", line 433, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 403 - {'error': {'message': 'Project proj_uFBGSkr9FAbFof6HIsCIszAP does not have access to model gpt-4o', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.local/bin/aider", line 8, in <module>
    sys.exit(main())
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/aider/main.py", line 408, in main
    coder.run()
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/aider/coders/base_coder.py", line 500, in run
    list(self.send_new_user_message(new_user_message))
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/aider/coders/base_coder.py", line 681, in send_new_user_message
    yield from self.send(messages, functions=self.functions)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/aider/coders/base_coder.py", line 809, in send
    hash_object, completion = send_with_retries(model, messages, functions, self.stream)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/aider/sendchat.py", line 71, in send_with_retries
    res = litellm.completion(**kwargs)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/utils.py", line 3222, in wrapper
    raise e
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/utils.py", line 3116, in wrapper
    result = original_function(*args, **kwargs)
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/main.py", line 2228, in completion
    raise exception_type(
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/utils.py", line 9283, in exception_type
    raise e
  File "/root/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/litellm/utils.py", line 8028, in exception_type
    raise NotFoundError(
litellm.exceptions.NotFoundError: OpenAIException - Error code: 403 - {'error': {'message': 'Project proj_uFBGSkr9FAbFof6HIsCIszAP does not have access to model gpt-4o', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Model: gpt-4o
API Base: https://api.openai.com
Messages: [{'role': 'system', 'content': "Act as an expert software developer.\nAlways use best practices when
root@d2704da8fca9:/app#

Version and model info

Aider v0.35.0
Models: openai/gpt-4o with diff edit format, weak model gpt-3.5-turbo
Git repo: .git with 34 files
Repo-map: using 1024 tokens

paul-gauthier added the question label May 16, 2024
paul-gauthier (Owner) commented
Thanks for trying aider and filing this issue.

The error is "Project proj_uFBGSkr9FAbFof6HIsCIszAP does not have access to model gpt-4o", which sounds like your OpenAI account/project does not have access to gpt-4o.

Can you access it via the openai playground?
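
Besides the playground, one quick way to check this is to list the models your API key/project can actually see. The snippet below is only a sketch, assuming the openai Python package (v1.x) and that OPENAI_API_KEY points at the same key/project aider is using:

# Sketch: list the models visible to this API key/project (assumes openai v1.x
# and OPENAI_API_KEY set to the same key/project that aider uses).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

available = sorted(m.id for m in client.models.list())
print("\n".join(available))

if "gpt-4o" not in available:
    print("\nThis key/project cannot see gpt-4o, which would match the 403 above.")

If gpt-4o is missing from that list, the fix is on the OpenAI account/project side rather than in aider.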

ccasselman commented

I get the same thing. How do I "access it via the openai playground"? I just set up today and have replaced the API key three times.

tanders (Author) commented May 17, 2024

"sounds like your OpenAI account/project does not have access to gpt-4o"

Indeed, I can confirm that this problem is not Aider-related, so feel free to close this issue.

"Thanks for trying aider and filing this issue."

Thanks for coming back so quickly.

In fact, I should have thanked you in my initial post for making aider available. It is a really helpful teammate for me!

tanders (Author) commented May 17, 2024

How do I "access it via the openai playground"?

Log into your OpenAI account and have a look at this:
https://platform.openai.com/playground
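
Alternatively, you can check from the same environment aider runs in by sending one tiny request to gpt-4o; it will either succeed or reproduce the 403 directly. Again just a sketch, assuming the openai Python package (v1.x) and the same OPENAI_API_KEY aider uses:

# Sketch: send a minimal request to gpt-4o to see whether this key/project
# gets the same 403 / model_not_found that aider reported.
from openai import OpenAI, PermissionDeniedError

client = OpenAI()  # reads OPENAI_API_KEY from the environment

try:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=1,
    )
    print("gpt-4o is reachable:", resp.choices[0].message.content)
except PermissionDeniedError as e:
    print("No access to gpt-4o from this key/project:", e)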

paul-gauthier (Owner) commented
I'm going to close this issue for now, but feel free to add a comment here and I will re-open it, or file a new issue any time.
