When I use ChromaDB_VectorStore.__init__, there is no way to configure it to use an Ollama embedding model, something like embeddings = OllamaEmbeddings(base_url="http://127.0.0.1:8000", model="znbang/bge:large-zh-v1.5-f32").
Ollama now supports both LLMs and embedding models.
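For reference, this is roughly what I would like to be able to hand to the vector store (a minimal sketch; I am assuming OllamaEmbeddings here comes from langchain_community.embeddings, and the URL and model name are just the example values above):

# Minimal sketch of the desired setup. Assumes langchain_community is installed;
# base_url and model are placeholder values from the example above.
from langchain_community.embeddings import OllamaEmbeddings

# Embedding client that talks to a local Ollama server.
embeddings = OllamaEmbeddings(
    base_url="http://127.0.0.1:8000",
    model="znbang/bge:large-zh-v1.5-f32",
)

# There is currently no obvious config key for passing this object
# into ChromaDB_VectorStore.__init__.
vector = embeddings.embed_query("hello world")  # returns a list of floats
print(len(vector))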
You can pass ChromaDB's OllamaEmbeddingFunction through the embedding_function key of the vector-store config, for example:

from vanna.base import VannaBase
from vanna.ollama import Ollama
from vanna.chromadb import ChromaDB_VectorStore
from chromadb.utils.embedding_functions import OllamaEmbeddingFunction
from time import sleep
import os
import traceback

# Create the OllamaEmbeddingFunction instance
ollama_embedding_function = OllamaEmbeddingFunction(
    model_name="mxbai-embed-large:latest",
    # model_name="dztech/bge-large-zh:v1.5",
    url="http://17.2.2.2.2:11343/api/embeddings",
)

class MyVanna(ChromaDB_VectorStore, Ollama):
    def __init__(self, vsconfig=None, oconfig=None):
        ChromaDB_VectorStore.__init__(self, config=vsconfig)
        Ollama.__init__(self, config=oconfig)

# Vector-store config (embeddings) and Ollama config (LLM)
vsconfig = {
    "embedding_function": ollama_embedding_function,
    "n_results_sql": 5,
    "n_results_documentation": 5,
    "n_results_ddl": 3,
    "path": "/home/cusc/script/vanna/vannadb",
}

oconfig = {
    "model": "qwen:32b-chat-v1.5-q4_0",
    "ollama_host": "http://17.2.2.2.2:11343",
}

if __name__ == "__main__":
    vn = MyVanna(vsconfig=vsconfig, oconfig=oconfig)
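With the instance above, the usual Vanna workflow should then run retrieval through the Ollama embedding model. A hedged sketch of that follow-up usage (the SQLite path, DDL, and documentation strings below are placeholders I made up; vn.connect_to_sqlite, vn.train, and vn.ask are Vanna's standard methods):

# Continuing from the MyVanna instance created above.
# The database path and training data here are placeholder examples.
vn.connect_to_sqlite("/home/cusc/script/vanna/example.db")

# Training data is embedded via the OllamaEmbeddingFunction set in vsconfig.
vn.train(ddl="CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
vn.train(documentation="The users table stores one row per registered user.")

# Retrieval uses the Ollama embeddings; SQL generation uses the Ollama LLM.
print(vn.ask("How many users are there?"))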