
local llama3 but log shows gpt-3.5-turbo #1407

Closed
zlw123 opened this issue Apr 27, 2024 · 5 comments
Labels
question Further information is requested

Comments


zlw123 commented Apr 27, 2024

Using llama3 locally; OpenDevin started in Docker with:

docker run \
    --add-host host.docker.internal=host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=D:/opendevin/workspace \
    -v D:/opendevin/workspace:/opt/workspace_base \
    -v D:/opendevin/workspace/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main

error:

root@555c6b64db3f:/app/logs# cat opendevin_2024-04-27.log
12:11:49 - opendevin:ERROR: auth.py:31 - Invalid token
12:11:49 - opendevin:INFO: listen.py:75 - Invalid or missing credentials, generating new session ID: d59448f4-6979-4cbe-9c3e-a34f3c4bea4f
12:11:50 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM gpt-3.5-turbo
12:11:50 - opendevin:INFO: llm.py:52 - Initializing LLM with model: gpt-3.5-turbo
12:11:50 - opendevin:ERROR: ssh_box.py:69 - Please check Docker is running using docker ps.
12:11:50 - opendevin:ERROR: agent.py:155 - Error creating controller: Error while fetching server API version: ('Connection aborted.', ConnectionRefusedError(111, 'Connection refused'))
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 793, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 496, in _make_request
    conn.request(
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 400, in request
    self.endheaders()
  File "/usr/local/lib/python3.12/http/client.py", line 1331, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.12/http/client.py", line 1091, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.12/http/client.py", line 1035, in send
    self.connect()
  File "/app/.venv/lib/python3.12/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
ConnectionRefusedError: [Errno 111] Connection refused
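
Note that the docker.sock mount in the run command above binds a path under the Windows workspace rather than the Docker daemon socket, which is likely why the connection to /var/run/docker.sock inside the container is refused. For comparison, the commonly used form of that mount is sketched below (assuming Docker Desktop exposes the daemon socket at /var/run/docker.sock; treat the exact path on Windows as an assumption):

# Sketch of the usual daemon-socket mount, replacing the workspace path used above.
-v /var/run/docker.sock:/var/run/docker.sock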

zlw123 added the question label Apr 27, 2024
Collaborator

rbren commented Apr 27, 2024

@zlw123 you need to set the model in the settings modal in the UI. There's a gear wheel in the bottom right
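
If you would rather pin the model when starting the container, it can also be passed as an environment variable; LLM_MODEL below is taken from the project's documented run options, so treat the variable name as an assumption for your particular build:

# Sketch: set the model at container start instead of via the settings modal.
# LLM_MODEL is assumed to be honored by this image; the other flags mirror the original command.
docker run \
    -e LLM_MODEL="ollama/llama3" \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    ghcr.io/opendevin/opendevin:main   # plus the remaining mounts and ports from the original command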

Contributor

isavita commented Apr 27, 2024

@zlw123 You will need to use ollama/llama3 as the model name.

  1. The gear is in the bottom right.
[Screenshot 2024-04-27 at 16 21 19]
  2. The model field is in the settings modal in the UI. You might need to type ollama/llama3 because it does not appear in the list; it also does not currently work with plain llama3, you need the full ollama/llama3.
[Screenshot 2024-04-27 at 16 20 55]
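
If ollama/llama3 still does not behave, a quick way to confirm that Ollama is reachable from the container and actually serving the model (assuming Ollama's default port 11434 and the host.docker.internal mapping from the run command above) is:

# List the models the local Ollama server exposes; llama3 should appear in the response.
curl http://host.docker.internal:11434/api/tags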

Collaborator

enyst commented Apr 27, 2024

Can you please confirm that you have used llama3 this way, @isavita?

Contributor

isavita commented Apr 27, 2024

@enyst
This is from my terminal log

INFO:     connection open
Starting loop_recv for sid: 1120a483-1c24-427d-b1fc-a06942e70053
INFO:     192.168.65.1:64810 - "GET /api/refresh-files HTTP/1.1" 200 OK
15:20:23 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM ollama/llama3
15:20:23 - opendevin:INFO: llm.py:52 - Initializing LLM with model: ollama/llama3
15:20:23 - opendevin:INFO: ssh_box.py:353 - Container stopped
15:20:23 - opendevin:WARNING: ssh_box.py:365 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
15:20:23 - opendevin:INFO: ssh_box.py:373 - Mounting workspace directory: /Users/isavita/code/workspace
15:20:23 - opendevin:INFO: ssh_box.py:396 - Container started
15:20:24 - opendevin:INFO: ssh_box.py:413 - waiting for container to start: 1, container status: running
15:20:25 - opendevin:INFO: ssh_box.py:178 - Connecting to root@host.docker.internal via ssh. If you encounter any issues, you can try `ssh -v -p 53053 root@host.docker.internal` with the password 'e9510b7f-aa92-490a-92e6-494f9461658d' and report the issue on GitHub.
INFO:     192.168.65.1:64810 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     192.168.65.1:64814 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     192.168.65.1:64814 - "GET /api/agents HTTP/1.1" 200 OK


==============
STEP 0

15:21:48 - PLAN
write me working snake game
15:22:02 - ACTION
FileReadAction(path='/workspace/snake_game.js', start=0, end=-1, thoughts='', action='read')
15:22:02 - OBSERVATION
File not found: /workspace/snake_game.js



==============
STEP 1

15:22:02 - PLAN
write me working snake game
15:22:09 - ACTION
AgentThinkAction(thought='I should probably install a package manager like npm to manage my dependencies for the snake game. I can then use it to install any necessary packages.', action='think')


==============
STEP 2

15:22:09 - PLAN
write me working snake game
15:22:17 - ACTION
AgentThinkAction(thought="I should probably start by installing Node.js, since it's a popular choice for building JavaScript applications. And then I can use npm to install the necessary packages for the snake game.", action='think')


==============
STEP 3

15:22:17 - PLAN
write me working snake game
15:22:24 - ACTION
AgentThinkAction(thought='I should probably start by setting up Node.js on my system. I can use `apt` to install it.', action='think')

....

Collaborator

rbren commented May 2, 2024

Looks like this was solved!

rbren closed this as completed May 2, 2024