Replies: 3 comments 4 replies
-
Hi @fxrobin, support for OLLAMA via REST can be added. It would be awesome if you could contribute it!
-
I just noticed that initial support for OLLAMA was merged. The OLLAMA API has a concept called "context": a part of the response that can be fed back into subsequent requests to maintain a kind of chat memory. Does this conflict with the langchain4j ChatMemory concept? The Javadoc states that "Since language models do not keep the state of the conversation, it is necessary to provide all previous messages on every interaction with the language model". Does this mean we should avoid adding state (such as the OLLAMA context) to (Chat)LanguageModels?
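To illustrate the design the Javadoc describes: because the model itself is treated as stateless, something on the client side has to hold the conversation and resend it on every call, which is the job ChatMemory plays. Below is a minimal, hypothetical sketch (not the actual langchain4j API; `WindowMemory` and `statelessModel` are invented names) of a sliding-window memory feeding a stateless model, similar in spirit to a message-window chat memory:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Hypothetical sketch, NOT the langchain4j API: shows why a chat memory
// must resend all prior messages to a model that keeps no state.
public class ChatMemorySketch {

    // A stateless "model": it only sees what is in the current request.
    static String statelessModel(List<String> messages) {
        return "reply to " + messages.size() + " message(s)";
    }

    // A sliding-window memory; evicts the oldest messages when full.
    static class WindowMemory {
        private final int capacity;
        private final Deque<String> messages = new ArrayDeque<>();

        WindowMemory(int capacity) { this.capacity = capacity; }

        void add(String message) {
            messages.addLast(message);
            while (messages.size() > capacity) {
                messages.removeFirst(); // drop the oldest turn
            }
        }

        List<String> snapshot() { return List.copyOf(messages); }
    }

    public static void main(String[] args) {
        WindowMemory memory = new WindowMemory(4);
        memory.add("user: hello");
        // Every call sends the whole remembered history, because the
        // model itself remembers nothing between requests.
        String answer = statelessModel(memory.snapshot());
        memory.add("assistant: " + answer);
        System.out.println(answer); // prints: reply to 1 message(s)
    }
}
```

Under this design, provider-specific state such as the OLLAMA "context" blob would live outside the model abstraction, since the abstraction assumes each request carries everything the model needs.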
-
First of all, fantastic work, congratulations :-) Secondly, I have a question: Ollama recently added support for tools, and this is already implemented in LangChain. It would be interesting to include this functionality in LangChain4j. Is there any work on it yet? https://python.langchain.com/v0.2/docs/integrations/chat/ollama_functions/
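For readers unfamiliar with the feature being requested: "tools" (function calling) means the model does not execute code itself; it returns the name of a registered tool plus an argument, and the application dispatches the call and feeds the result back. A minimal, hypothetical sketch of that dispatch step (invented names, not the langchain4j or Ollama API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch, NOT the langchain4j API: the core of tool /
// function calling. The application keeps a registry of named tools
// and executes whichever one the model asks for.
public class ToolCallSketch {

    private final Map<String, Function<String, String>> tools = new HashMap<>();

    void registerTool(String name, Function<String, String> tool) {
        tools.put(name, tool);
    }

    // Dispatch a tool call that the model requested in its response,
    // e.g. something like {"tool": "shout", "argument": "hello"}.
    String execute(String toolName, String argument) {
        Function<String, String> tool = tools.get(toolName);
        if (tool == null) {
            throw new IllegalArgumentException("unknown tool: " + toolName);
        }
        return tool.apply(argument);
    }

    public static void main(String[] args) {
        ToolCallSketch registry = new ToolCallSketch();
        registry.registerTool("shout", s -> s.toUpperCase());
        // Pretend the model asked to call "shout" with "hello".
        System.out.println(registry.execute("shout", "hello")); // prints: HELLO
    }
}
```

The result of `execute` would then be sent back to the model as an extra message so it can produce the final answer.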
-
LangChain supports Ollama, but langchain4j does not so far.
Ollama seems like a great AI engine, so when will support for it be available?
Furthermore, I could contribute if there were a "how to contribute" document and a technical overview of langchain4j.
Best regards.