Hello guys, I need a little help :D
I have just started learning Haystack and all its functionality and I really like it, but with RAG I seem to have an issue. I created my vector database with Haystack + Pinecone and it works like a charm, but when I run the retrieval code I get an error around `query_pipeline.connect("prompt_builder", "llm")`: it says that the input and output variables of `prompt_builder` and `llm` do not match. I have tried everything, I even tried switching to `DynamicChatPromptBuilder`, but the issue is still there. Any help would be appreciated :D Here is the full code:
```python
import os

from dotenv import load_dotenv
from haystack import Pipeline
from haystack.utils import Secret
from haystack.components.builders import PromptBuilder
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack_integrations.components.retrievers.pinecone import PineconeEmbeddingRetriever

from utility import pinecone_config

prompt_template = """Answer the following query based on the provided context. If the context does
not include an answer, reply with 'I don't know'.\n
Query: {{query}}
Documents:
{% for doc in documents %}
{{ doc.content }}
{% endfor %}
Answer:
"""


def get_result(query):
    query_pipeline = Pipeline()
    query_pipeline.add_component("text_embedder", SentenceTransformersTextEmbedder())
    query_pipeline.add_component("retriever", PineconeEmbeddingRetriever(document_store=pinecone_config()))
    query_pipeline.add_component("prompt_builder", PromptBuilder(template=prompt_template))
    query_pipeline.add_component("llm", OpenAIChatGenerator(api_key=Secret.from_token("API_HERE"), model="gpt-3.5-turbo"))

    query_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")
    query_pipeline.connect("retriever.documents", "prompt_builder.documents")
    query_pipeline.connect("prompt_builder", "llm")  # this is the line that raises the error

    results = query_pipeline.run(
        {
            "text_embedder": {"text": query},
            "prompt_builder": {"query": query},
        }
    )
    return results['llm']['replies'][0]


if __name__ == '__main__':
    # loading the environment variable
    '''load_dotenv()
    PINECONE_API_KEY = os.getenv("PINECONE_API_KEY")
    os.environ['PINECONE_API_KEY'] = PINECONE_API_KEY
    print("Imported successfully")'''
    result = get_result("what is rag?")
    print(result)
```
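For readers hitting the same error: as far as I understand, `PromptBuilder` emits a single rendered string (its `prompt` output), while `OpenAIChatGenerator` expects a list of `ChatMessage` objects as its `messages` input, so the socket types cannot be connected; switching to the plain-text `OpenAIGenerator` (which accepts a string `prompt`) is one common fix. The toy sketch below only illustrates the type mismatch; the socket dictionaries and the `can_connect` helper are hypothetical stand-ins, not Haystack internals:

```python
from typing import List


class ChatMessage:
    """Stand-in for a chat-message type (illustrative only)."""


# Toy model of pipeline socket matching: each component declares typed
# output/input sockets, and a connection succeeds only when the types agree.

# Output socket of a PromptBuilder-like component: one rendered string.
prompt_builder_outputs = {"prompt": str}

# Input socket of a chat generator: a list of chat messages.
chat_generator_inputs = {"messages": List[ChatMessage]}

# Input socket of a plain-text generator: a string prompt.
text_generator_inputs = {"prompt": str}


def can_connect(outputs, inputs):
    """Return True if any output type exactly matches any input type."""
    return any(o == i for o in outputs.values() for i in inputs.values())


print(can_connect(prompt_builder_outputs, chat_generator_inputs))  # False: str != List[ChatMessage]
print(can_connect(prompt_builder_outputs, text_generator_inputs))  # True: str == str
```

In this toy model the first connection fails for the same structural reason the real pipeline rejects `connect("prompt_builder", "llm")`: no output type on the builder side matches the input type on the chat-generator side.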