
[Bug]: the app seems to hang unexpectedly #1420

Open
robertherbaugh opened this issue Apr 28, 2024 · 5 comments
Labels
bug Something isn't working severity:low Minor issues, code cleanup, etc

Comments

@robertherbaugh

Is there an existing issue for the same bug?

Describe the bug

OPENAI_API_KEY is not passed to the container as expected when following the startup instructions. Even when passing -e OPENAI_API_KEY explicitly, the Docker image still does not pick it up.

Current Version

ghcr.io/opendevin/opendevin:0.4.0

Installation and Configuration

export LLM_API_KEY="sk-..."
export WORKSPACE_BASE="$(pwd)/workspace-directory"
docker run \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal=host-gateway \
    ghcr.io/opendevin/opendevin:0.4.0
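Note: with the bare -e LLM_API_KEY form, Docker forwards the variable only if it is exported in the shell that runs docker run; if the export was missed or happened in a different shell, nothing reaches the container. One way to confirm what the container actually received (the container ID is whatever docker ps reports):

docker exec <container-id> env | grep -iE 'llm|openai'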

Model and Agent

No response

Reproduction Steps

  1. Export the LLM API key.
  2. Export the workspace directory.
  3. Run the Docker startup command provided on GitHub.

Logs, Errors, Screenshots, and Additional Context

INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO: 10.30.10.6:49745 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 10.30.10.6:49745 - "GET /index.html HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /assets/index-CZQzs2DR.css HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /assets/index-D59teWsw.js HTTP/1.1" 200 OK
03:10:35 - opendevin:ERROR: auth.py:31 - Invalid token
03:10:35 - opendevin:INFO: listen.py:74 - Invalid or missing credentials, generating new session ID: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO: 10.30.10.6:49745 - "GET /api/auth HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /locales/en-US/translation.json HTTP/1.1" 404 Not Found
INFO: 10.30.10.6:49746 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /favicon-32x32.png HTTP/1.1" 200 OK
INFO: ('10.30.10.6', 49747) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI2OWFhZTEwYy1hMjRlLTRmNGYtYTZiYS1mOTkzNTI2ZDFlYzIifQ.Z94inVNiUAws0YyhMvqeY5vROlV-Ha8547CjU9ACsdk" [accepted]
INFO: connection open
Starting loop_recv for sid: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO: 10.30.10.6:49746 - "GET /api/refresh-files HTTP/1.1" 200 OK
INFO: 10.30.10.6:49746 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO: 10.30.10.6:49745 - "GET /api/agents HTTP/1.1" 200 OK
03:10:36 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM gpt-3.5-turbo
03:10:36 - opendevin:INFO: llm.py:51 - Initializing LLM with model: gpt-3.5-turbo
03:10:37 - opendevin:INFO: ssh_box.py:353 - Container stopped
03:10:37 - opendevin:INFO: ssh_box.py:373 - Mounting workspace directory: /home/st-dev-autodev9000/devin-new
03:10:38 - opendevin:INFO: ssh_box.py:396 - Container started
03:10:39 - opendevin:INFO: ssh_box.py:413 - waiting for container to start: 1, container status: running
03:10:39 - opendevin:INFO: ssh_box.py:178 - Connecting to root@host.docker.internal via ssh. If you encounter any issues, you can try ssh -v -p 44717 root@host.docker.internal with the password '88f35f3a-eac7-43f6-9742-27a703381668' and report the issue on GitHub.

==============
STEP 0

03:11:20 - PLAN

03:11:20 - opendevin:ERROR: agent_controller.py:102 - Error in loop
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in init
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1010, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 983, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 422, in completion
raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in init
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/app/opendevin/controller/agent_controller.py", line 98, in _run
finished = await self.step(i)
^^^^^^^^^^^^^^^^^^
File "/app/opendevin/controller/agent_controller.py", line 211, in step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/agenthub/monologue_agent/agent.py", line 218, in step
resp = self.llm.completion(messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 289, in wrapped_f
return self(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 379, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 314, in iter
return fut.result()
^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 382, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/opendevin/llm/llm.py", line 78, in wrapper
resp = completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2977, in wrapper
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2875, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2137, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8665, in exception_type
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7431, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
openai_client = OpenAI(
^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

@robertherbaugh robertherbaugh added the bug Something isn't working label Apr 28, 2024
@enyst
Collaborator

enyst commented Apr 28, 2024

Can you please try:
-e LLM_API_KEY=$LLM_API_KEY \
instead, in the docker command?
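Applied to the command from the report, the full startup would look like this (a sketch of the same command with the key passed by value):

export LLM_API_KEY="sk-..."
export WORKSPACE_BASE="$(pwd)/workspace-directory"
docker run \
    -e LLM_API_KEY=$LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal=host-gateway \
    ghcr.io/opendevin/opendevin:0.4.0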

@robertherbaugh
Author

That worked for passing the variable through. However, it now starts the task but the server quits immediately after Step 1.

INFO: 10.30.10.6:58371 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO: 10.30.10.6:58373 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO: 10.30.10.6:58373 - "GET /api/agents HTTP/1.1" 200 OK

==============
STEP 0

15:21:43 - PLAN
Can you write me a python scrip that will print numbers 0-100
15:21:49 - ACTION
CmdRunAction(command='ls', background=False, action='run')
15:21:49 - OBSERVATION

==============
STEP 1

15:21:49 - PLAN
Can you write me a python scrip that will print numbers 0-100

@enyst
Collaborator

enyst commented Apr 28, 2024

I'm not sure why it would hang on such a task. I just tried with the same version (0.4.0) and GPT-3.5, with the same prompt, and it worked in 3 steps, including the initial ls and the final action=finish. Did Docker quit? Can you inspect the container? Otherwise, can you restart it and see if it hangs again?

Edited to add: also, what operating system are you running on?

@enyst enyst added the severity:low Minor issues, code cleanup, etc label Apr 29, 2024
@enyst enyst changed the title [Bug]: OPENAI_API_KEY issue [Bug]: the app seems to hang unexpectedly Apr 29, 2024
@assertion
Contributor

assertion commented Apr 29, 2024

This PR fixes the app hanging when an exception is thrown during agent_controller step execution: #1445
@enyst
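
For context on the hang itself: the loop in agent_controller died when a step raised, leaving the client waiting forever. The general shape of such a fix is to catch per-step exceptions and surface them as an error instead of letting the loop die silently. A minimal sketch of that pattern (the callables here are illustrative, not the actual code from #1445):

import traceback

async def run_agent_loop(step, report_error, max_iterations=100):
    # Drive the agent loop; a step that raises is reported back to the
    # client instead of killing the loop, which is what looked like a hang.
    for i in range(max_iterations):
        try:
            finished = await step(i)
        except Exception:
            await report_error(traceback.format_exc())
            return
        if finished:
            return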

@enyst
Collaborator

enyst commented Apr 29, 2024

Thank you @assertion ! ❤️

The fix is on main; @robertherbaugh, if you wish to try it, it should behave more reasonably.
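
To pick it up before the next release, you can rebuild from source, or pull a newer image; for example (assuming a main-tagged image is published for this repo):

docker pull ghcr.io/opendevin/opendevin:main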
