Use with personal context #95

Open
Victordeleusse opened this issue Mar 20, 2024 · 1 comment

@Victordeleusse

Hello,
I'm trying to implement a way to query PDFs locally and get answers based only on the data in the documents. I have already found a way to embed the data into a vector DB (using Chroma) and then retrieve the most relevant chunks for a query with `similarity_search`. I would now like to pass this context to my model so it can generate an answer from it, maybe by using a prompt in the generate call?

```python
# `db` is the Chroma vector store built from the PDF embeddings
from langchain_community.llms import Ollama
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
import ollama

query = "What is the date of the start of the battle?"
docs = db.similarity_search(query)
print(docs[0].page_content)

llm = Ollama(
    model=llm_model_name,
    callbacks=[StreamingStdOutCallbackHandler()],
)
my_retriever = db.as_retriever(search_kwargs={"k": 8})

response = ollama.generate(
    model=llm_model_name,
    # how do I pass the retrieved context into the prompt here?
)
```
Thank you for your help!
@chandansp27

Refer to this blog post; it has a basic code example that addresses your issue. Feel free to reach out if you need further help.

https://ollama.com/blog/embedding-models
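
The blog post shows the same pattern using `ollama.embeddings` and chromadb directly. A minimal sketch adapted to the LangChain setup above (assuming `db` is your Chroma vector store and `llm_model_name` is a model you have already pulled locally) would stuff the retrieved chunks into the prompt passed to `ollama.generate`:

```python
import ollama

query = "What is the date of the start of the battle?"

# Retrieve the most relevant chunks from the vector store (assumes `db` is
# the Chroma store built from the PDF embeddings, as in the question).
docs = db.similarity_search(query, k=8)
context = "\n\n".join(doc.page_content for doc in docs)

# Pass the retrieved context to the model inside the prompt itself.
response = ollama.generate(
    model=llm_model_name,
    prompt=(
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    ),
)
print(response["response"])
```

The key point is that the "personal context" is not a separate parameter: you build the prompt string yourself from the retrieved documents and the question, and the model answers from that.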
