
Make maximum monologue length configurable #1312

Open
lowlyocean opened this issue Apr 23, 2024 · 2 comments
Labels: question (Further information is requested), severity:low (Minor issues, code cleanup, etc.)

Comments


lowlyocean commented Apr 23, 2024

Describe your question

Can anyone else reproduce this odd behavior when running ollama/codeqwen:7b-chat-v1.5-q4_0 ?

The first MonologueAgent prompt has too many thoughts for it to succeed. It just prints out something like:
101011 10 ▅ ▅10

This is the last thought it can tolerate (while still printing a correctly formatted JSON response):

  {
    "action": "think",
    "args": {
      "thought": "Very cool. Now to accomplish my task."
    }
  },

As soon as the next one is added, it generates that nonsense output:

  {
    "action": "think",
    "args": {
      "thought": "I'll need a strategy. And as I make progress, I'll need to keep refining that strategy. I'll need to set goals, and break them into sub-goals."
    }
  },

Additional context

Using the Docker image, launched from WSL on Windows 11. Ollama version 0.1.32.

@lowlyocean lowlyocean added the question Further information is requested label Apr 23, 2024

lowlyocean commented Apr 23, 2024

Is there already a way to expose agent-specific parameters in the UI, or would I need to do a build to adjust the parameter below (which I suspect may be too high for this local LLM)?

MAX_MONOLOGUE_LENGTH = 20000

Edit: it looks like there's already a proposal to expose these in the UI:
#1135 (comment)

rbren (Collaborator) commented Apr 25, 2024

Definitely! We could also add this to config.py so it can be set with an env var. Feel free to open a PR!
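A minimal sketch of the change suggested here: read the limit from an environment variable in config.py, falling back to the current hard-coded default. The variable name `MAX_MONOLOGUE_LENGTH` and the default of 20000 come from the snippet quoted above; the helper function is illustrative and not the project's actual config.py API.

```python
import os

# Default taken from the hard-coded constant quoted earlier in the thread.
DEFAULT_MAX_MONOLOGUE_LENGTH = 20000


def get_max_monologue_length() -> int:
    """Return the monologue length limit, overridable via an env var.

    Hypothetical helper: reads MAX_MONOLOGUE_LENGTH from the environment
    and falls back to the default when unset or malformed.
    """
    raw = os.environ.get("MAX_MONOLOGUE_LENGTH")
    if raw is None:
        return DEFAULT_MAX_MONOLOGUE_LENGTH
    try:
        return int(raw)
    except ValueError:
        # A malformed value falls back to the default rather than crashing.
        return DEFAULT_MAX_MONOLOGUE_LENGTH


# Module-level constant, resolved once at import time as in the original code.
MAX_MONOLOGUE_LENGTH = get_max_monologue_length()
```

With this in place, a user running a smaller local model could lower the limit without rebuilding, e.g. by launching the container with `-e MAX_MONOLOGUE_LENGTH=4000` (assuming Docker passes the variable through).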

@rbren rbren changed the title from "MonologueAgent initial thought list -- too large for CodeQwen?" to "Make maximum monologue length configurable" May 2, 2024
@rbren rbren added the severity:low Minor issues, code cleanup, etc label May 2, 2024