
How to use locally hosted LLMs? #193

Answered by fcakyon
notV3NOM asked this question in Q&A

You'll need to provide a local embedding model as well.

Example:

import os
from autollm import AutoQueryEngine

# HuggingFace token for the embedding backend (left blank here; fill in your own)
os.environ['HUGGINGFACE_API_KEY'] = ""

AutoQueryEngine.from_defaults(
    ...,
    # LiteLLM-style model string pointing at a HuggingFace embedding model
    embed_model='huggingface/BAAI/bge-large-zh',
)
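
The same pattern extends to the LLM itself: autollm routes model strings through LiteLLM, so a locally served model (for example one running under Ollama) can be wired in alongside the local embedding model. The sketch below is an illustration, not the confirmed API: the llm_model and llm_api_base keyword names, the ollama/llama2 model string, and the read_files_as_documents helper are assumptions based on autollm's README and LiteLLM conventions, so double-check them against the version you have installed.

import os
from autollm import AutoQueryEngine
from autollm.utils.document_reading import read_files_as_documents  # assumed helper

# HuggingFace token for the embedding backend (left blank, as in the answer above)
os.environ['HUGGINGFACE_API_KEY'] = ""

# Load your own documents; swap in any loader you prefer
documents = read_files_as_documents(input_dir="docs/")

query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    # LiteLLM-style string for a local Ollama model (assumed keyword names)
    llm_model="ollama/llama2",
    llm_api_base="http://localhost:11434",  # Ollama's default local endpoint
    # local embedding model, as in the accepted answer
    embed_model="huggingface/BAAI/bge-large-zh",
)

print(query_engine.query("Summarize the indexed documents."))

With both llm_model and embed_model pointing at local backends, no hosted inference API is involved at query time.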


This discussion was converted from issue #190 on January 06, 2024 12:46.