[FEAT]: Ollama Agent support #1242

Closed
timothycarambat opened this issue Apr 30, 2024 · 2 comments · Fixed by #1270
Comments

@timothycarambat
Member

What would you like to see?

Add support for agents when using Ollama (or the built-in AnythingLLM LLM on desktop).

@masonjames

Have you used the Desktop app? You can download it, then go to Settings > LLM Preferences and choose Ollama under LLM Provider.

From there, point it to the base URL (typically http://127.0.0.1:11434) and choose your model.
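For reference, a quick way to confirm that base URL is reachable before selecting it in LLM Preferences is to query Ollama's model-listing endpoint. This is a minimal sketch (Node 18+ with its global fetch, run via ts-node or compiled); the base URL is the default mentioned above and should be adjusted if your Ollama instance is bound elsewhere.

```ts
// Minimal connectivity check against a local Ollama server (Node 18+, global fetch).
// OLLAMA_BASE_URL matches the default mentioned above; change it if yours differs.
const OLLAMA_BASE_URL = "http://127.0.0.1:11434";

async function listOllamaModels(): Promise<string[]> {
  // /api/tags is Ollama's endpoint for listing locally pulled models.
  const res = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status} - is it running?`);
  }
  const body = (await res.json()) as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}

listOllamaModels()
  .then((names) => console.log("Models Ollama can serve:", names))
  .catch((err) => console.error("Could not reach Ollama:", err));
```

If the list comes back non-empty, the same base URL should work in the LLM Provider settings, and any of the listed model names can be selected there.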

@timothycarambat
Member Author

timothycarambat commented May 2, 2024

Yes, I know, I built the app 👍

Joking aside, this issue is for tracking support for the @agent directive when using Ollama, which isn't yet supported.
