
query_engine issue #231

Answered by fcakyon
yangboz asked this question in Q&A
Feb 6, 2024 · 3 comments · 6 replies


You just have to provide an Ollama model name as llm_model="ollama/llama2" and the Ollama endpoint URL as llm_api_base="http://localhost:11434" when constructing the query_engine.
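For reference, a minimal sketch of what that looks like, assuming the query_engine here is autollm's AutoQueryEngine (the from_defaults call, read_files_as_documents, and the example path/question are illustrative assumptions; only the llm_model and llm_api_base parameters come from the answer above):

```python
from autollm import AutoQueryEngine, read_files_as_documents

# Load source files as llama-index documents
# (the helper and the input_dir path are assumptions for illustration).
documents = read_files_as_documents(input_dir="data")

# Point the engine at a local Ollama server instead of a hosted LLM:
# llm_model follows the LiteLLM "provider/model" naming convention, and
# http://localhost:11434 is Ollama's default endpoint.
query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    llm_model="ollama/llama2",
    llm_api_base="http://localhost:11434",
)

response = query_engine.query("What do these documents cover?")
print(response.response)
```

This assumes an Ollama server is already running locally with the llama2 model pulled (e.g. via `ollama run llama2`) before you issue queries.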

Feel free to open a PR adding this information to the readme :) @yangboz

Answer selected by fcakyon
This discussion was converted from issue #229 on February 06, 2024 05:29.