[Bug]: the app seems to hang unexpectedly #1420
Comments
Can you please try:
That worked for the pass-through of the variable. However, it now starts the task but then immediately quits the server after Step 1.
INFO: 10.30.10.6:58371 - "GET /api/litellm-models HTTP/1.1" 200 OK
==============
15:21:43 - PLAN
==============
15:21:49 - PLAN
I'm not sure why it would hang on such a task. I just tried with the same version, 0.4.0, and GPT-3.5, with the same prompt, and it worked in 3 steps, including its initial step.
Edited to add: also, what operating system are you running on?
Thank you @assertion ! ❤️ The fix is on main, @robertherbaugh if you wish to try it, it should behave more reasonably.
Is there an existing issue for the same bug?
Describe the bug
`OPENAI_API_KEY` is not passed as expected when following the startup instructions. Additionally, when passing `-e OPENAI_API_KEY`, the Docker image still does not pick it up.
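For reference, `docker run -e VAR` with no value forwards the variable from the host shell (so it must be exported there first), while `-e VAR=value` sets it explicitly. A sketch of both forms — the key and image tag below are placeholders, not the project's actual values:

```shell
# Forward OPENAI_API_KEY from the host shell; it must be exported first:
export OPENAI_API_KEY="sk-..."          # placeholder key
docker run -p 3000:3000 \
    -e OPENAI_API_KEY \
    ghcr.io/opendevin/opendevin:0.4.0   # placeholder image tag; use the one from the README

# Or set the value inline on the command line:
docker run -p 3000:3000 \
    -e OPENAI_API_KEY="sk-..." \
    ghcr.io/opendevin/opendevin:0.4.0
```

A common pitfall with the first form is running `OPENAI_API_KEY=... docker run ...` without `export`: the variable is then visible to `docker` itself but, depending on the shell, may not be what you intended to forward.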
Current Version
Installation and Configuration
Model and Agent
No response
Reproduction Steps
Logs, Errors, Screenshots, and Additional Context
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO: 10.30.10.6:49745 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 10.30.10.6:49745 - "GET /index.html HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /assets/index-CZQzs2DR.css HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /assets/index-D59teWsw.js HTTP/1.1" 200 OK
03:10:35 - opendevin:ERROR: auth.py:31 - Invalid token
03:10:35 - opendevin:INFO: listen.py:74 - Invalid or missing credentials, generating new session ID: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO: 10.30.10.6:49745 - "GET /api/auth HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /locales/en-US/translation.json HTTP/1.1" 404 Not Found
INFO: 10.30.10.6:49746 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /favicon-32x32.png HTTP/1.1" 200 OK
INFO: ('10.30.10.6', 49747) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI2OWFhZTEwYy1hMjRlLTRmNGYtYTZiYS1mOTkzNTI2ZDFlYzIifQ.Z94inVNiUAws0YyhMvqeY5vROlV-Ha8547CjU9ACsdk" [accepted]
INFO: connection open
Starting loop_recv for sid: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO: 10.30.10.6:49746 - "GET /api/refresh-files HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/agents HTTP/1.1" 200 OK
03:10:36 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM gpt-3.5-turbo
03:10:36 - opendevin:INFO: llm.py:51 - Initializing LLM with model: gpt-3.5-turbo
03:10:37 - opendevin:INFO: ssh_box.py:353 - Container stopped
03:10:37 - opendevin:INFO: ssh_box.py:373 - Mounting workspace directory: /home/st-dev-autodev9000/devin-new
03:10:38 - opendevin:INFO: ssh_box.py:396 - Container started
03:10:39 - opendevin:INFO: ssh_box.py:413 - waiting for container to start: 1, container status: running
03:10:39 - opendevin:INFO: ssh_box.py:178 - Connecting to root@host.docker.internal via ssh. If you encounter any issues, you can try
ssh -v -p 44717 root@host.docker.internal
with the password '88f35f3a-eac7-43f6-9742-27a703381668' and report the issue on GitHub.
==============
STEP 0
03:11:20 - PLAN
03:11:20 - opendevin:ERROR: agent_controller.py:102 - Error in loop
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1010, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 983, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 422, in completion
raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/opendevin/controller/agent_controller.py", line 98, in _run
finished = await self.step(i)
^^^^^^^^^^^^^^^^^^
File "/app/opendevin/controller/agent_controller.py", line 211, in step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/agenthub/monologue_agent/agent.py", line 218, in step
resp = self.llm.completion(messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 289, in wrapped_f
return self(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 379, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/opendevin/llm/llm.py", line 78, in wrapper
resp = completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2977, in wrapper
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2875, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2137, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8665, in exception_type
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7431, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
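The root cause in the traceback is simply that `OPENAI_API_KEY` is unset inside the container. A minimal pre-flight guard — a sketch, not part of the project — that fails fast with a readable message instead of the nested litellm/openai traceback:

```shell
# Hypothetical startup check; the variable name comes from the traceback above.
check_openai_key() {
    if [ -z "${OPENAI_API_KEY:-}" ]; then
        echo "OPENAI_API_KEY is not set inside the container" >&2
        return 1
    fi
    echo "OPENAI_API_KEY is present"
}
```

Running `docker exec <container> printenv OPENAI_API_KEY` is another quick way to confirm whether the variable actually reached the container.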