
Failed to load index with ID {index_id} #65

Open
DhananjayanOnline opened this issue Oct 26, 2023 · 5 comments
DhananjayanOnline commented Oct 26, 2023

I'm encountering an issue when trying to retrieve information from a document. The error message reads 'Failed to load index with ID {index_id}'.

INFO:     Application startup complete.
INFO:     127.0.0.1:48006 - "POST /api/conversation/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:33524 - "GET /api/conversation/1eecef0b-17af-48e8-9d5f-95fcbabc5f58 HTTP/1.1" 200 OK
INFO:     127.0.0.1:38606 - "GET /api/conversation/1eecef0b-17af-48e8-9d5f-95fcbabc5f58/message?user_message=what%20is%20this%20document%3F HTTP/1.1" 200 OK
Failed to load indices from storage. Creating new indices. If you're running the seed_db script, this is normal and expected.
Error in message publisher
Traceback (most recent call last):
  File "/home/jay/Documents/Pixl/AI/blue_pond/sec-insights-4/sec-insights/backend/app/chat/engine.py", line 151, in build_doc_id_to_index_map
    indices = load_indices_from_storage(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/loading.py", line 71, in load_indices_from_storage
    raise ValueError(f"Failed to load index with ID {index_id}")
ValueError: Failed to load index with ID e06dc2c1-bfa6-41d6-be82-cdbfa64d4d31

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/embeddings/openai.py", line 172, in get_embeddings
    data = openai.Embedding.create(input=list_of_text, model=engine, **kwargs).data
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 83, in __prepare_create_request
    raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jay/Documents/Pixl/AI/blue_pond/sec-insights-4/sec-insights/backend/app/api/endpoints/conversation.py", line 149, in event_publisher
    await task
  File "/usr/lib64/python3.11/asyncio/futures.py", line 290, in __await__
    return self.result()  # May raise too.
           ^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib64/python3.11/asyncio/tasks.py", line 267, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/jay/Documents/Pixl/AI/blue_pond/sec-insights-4/sec-insights/backend/app/chat/messaging.py", line 131, in handle_chat_message
    chat_engine = await get_chat_engine(
                  ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/Documents/Pixl/AI/blue_pond/sec-insights-4/sec-insights/backend/app/chat/engine.py", line 270, in get_chat_engine
    doc_id_to_index = await build_doc_id_to_index_map(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/Documents/Pixl/AI/blue_pond/sec-insights-4/sec-insights/backend/app/chat/engine.py", line 170, in build_doc_id_to_index_map
    index = VectorStoreIndex.from_documents(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/base.py", line 102, in from_documents
    return cls(
           ^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 46, in __init__
    super().__init__(
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/base.py", line 71, in __init__
    index_struct = self.build_index_from_nodes(nodes)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 241, in build_index_from_nodes
    return self._build_index_from_nodes(nodes)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 229, in _build_index_from_nodes
    self._add_nodes_to_index(
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 201, in _add_nodes_to_index
    embedding_results = self._get_node_embedding_results(nodes, show_progress)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 111, in _get_node_embedding_results
    ) = self._service_context.embed_model.get_queued_text_embeddings(show_progress)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/embeddings/base.py", line 218, in get_queued_text_embeddings
    embeddings = self._get_text_embeddings(cur_batch_texts)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/llama_index/embeddings/openai.py", line 323, in _get_text_embeddings
    return get_embeddings(
           ^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-ObpdDQgr-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7f03c9eb0e90 state=finished raised InvalidRequestError>]
@sourabhdesai
Contributor

@DhananjayanOnline This error would come up if the document you selected in your chat hadn't been previously indexed. Did you run the seed script before running the server?

@DhananjayanOnline
Author

> @DhananjayanOnline This error would come up if the document you selected in your chat hadn't been previously indexed. Did you run the seed script before running the server?

@sourabhdesai No, I didn't execute the seed script; instead, I manually uploaded a custom document to Minio and performed an upsert operation to store the document's URL in the database. Is it necessary to run the seed script?

@cipri-tom

I'm hitting the same error. @DhananjayanOnline did you figure out a way around it? I did the same as you, using upsert instead of seed_db.

@cipri-tom

cipri-tom commented Jan 11, 2024

I figured it out. The error is not Failed to load index with ID {index_id}. That one is expected and normal: there is no index yet, so it starts creating one. You have to look in more detail at the stack trace to find the real error.

The failure to find the index triggers calls to embeddings. Those then fail, retry several times, and finally fail completely with tenacity.RetryError. Now, why does it fail to make embeddings in the first place? In my case, it was the wrong API_KEY, as I tried to use Azure instead of OpenAI directly.

In @DhananjayanOnline's case, the error can be seen in the first stack trace after During handling of the above exception, another exception occurred: ... It's openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>. So your deployment_id or engine parameters are not specified, which means it cannot create embeddings.
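For reference, with the legacy openai 0.x SDK shown in this traceback, Azure calls need the API type set and a deployment named explicitly. A minimal config sketch (not from this thread; the resource, key, and deployment names are placeholders, and the call is left commented out since it requires a live Azure endpoint):

```python
import openai

# Legacy openai 0.x SDK, Azure mode (config fragment only).
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "YOUR-AZURE-KEY"

# Azure requires naming the deployment; without an 'engine' or
# 'deployment_id' argument the SDK raises the InvalidRequestError above.
# openai.Embedding.create(input=["some text"],
#                         deployment_id="YOUR-EMBEDDING-DEPLOYMENT")
```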

@castelbu , try to look in more detail at your stack trace.
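The "look past the wrapper error" advice above can also be done programmatically: Python chains exceptions via `__cause__` ("the direct cause of") and `__context__` ("during handling of"). A stdlib-only sketch (the function names are invented stand-ins, not code from this repo):

```python
# The real failure (a ValueError standing in for openai's
# InvalidRequestError) is buried under a wrapper exception,
# just like tenacity.RetryError in the traceback above.

def make_embeddings():
    # Stand-in for the failing openai.Embedding.create call.
    raise ValueError("Must provide an 'engine' or 'deployment_id' parameter")

def retry_wrapper():
    # Stand-in for tenacity: re-raise the last failure as a wrapper error.
    try:
        make_embeddings()
    except ValueError as exc:
        raise RuntimeError("RetryError") from exc  # "direct cause" link

def root_cause(exc):
    """Walk the chain: explicit __cause__ first, implicit __context__ next."""
    while exc.__cause__ is not None or exc.__context__ is not None:
        exc = exc.__cause__ or exc.__context__
    return exc

try:
    retry_wrapper()
except RuntimeError as wrapper:
    print(root_cause(wrapper))  # the underlying ValueError, not the wrapper
```

With real tenacity, the underlying exception is also available directly as `retry_error.last_attempt.exception()`.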

@savanth14

savanth14 commented Feb 16, 2024

Hi @cipri-tom @sourabhdesai, I faced a similar error to the one mentioned in this thread. Can anyone please point out where I am going wrong and how to correct it?
INFO: Application startup complete.
INFO: 127.0.0.1:54916 - "POST /api/conversation/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:54930 - "GET /api/conversation/e5f32045-28f1-432c-9364-d23670542e01 HTTP/1.1" 200 OK
INFO: 127.0.0.1:47042 - "GET /api/conversation/e5f32045-28f1-432c-9364-d23670542e01/message?user_message=what%20is%20this%20document%20about%3F HTTP/1.1" 200 OK
Failed to load indices from storage. Creating new indices. If you're running the seed_db script, this is normal and expected.
Error in message publisher
Traceback (most recent call last):
  File "/workspaces/sec-insights/backend/app/chat/engine.py", line 215, in build_doc_id_to_index_map
    indices = load_indices_from_storage(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/loading.py", line 71, in load_indices_from_storage
    raise ValueError(f"Failed to load index with ID {index_id}")
ValueError: Failed to load index with ID 02ee7043-aae6-40c5-9481-c47bec6d13f4

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspaces/sec-insights/backend/app/api/endpoints/conversation.py", line 149, in event_publisher
    await task
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 290, in __await__
    return self.result()  # May raise too.
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 267, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/workspaces/sec-insights/backend/app/chat/messaging.py", line 134, in handle_chat_message
    chat_engine = await get_chat_engine(
                  ^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/sec-insights/backend/app/chat/engine.py", line 332, in get_chat_engine
    doc_id_to_index = await build_doc_id_to_index_map(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/sec-insights/backend/app/chat/engine.py", line 234, in build_doc_id_to_index_map
    index = VectorStoreIndex.from_documents(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/base.py", line 106, in from_documents
    return cls(
           ^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 49, in __init__
    super().__init__(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/base.py", line 71, in __init__
    index_struct = self.build_index_from_nodes(nodes)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 255, in build_index_from_nodes
    return self._build_index_from_nodes(nodes, **insert_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 236, in _build_index_from_nodes
    self._add_nodes_to_index(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/indices/vector_store/base.py", line 190, in _add_nodes_to_index
    new_ids = self._vector_store.add(nodes, **insert_kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres.py", line 312, in add
    session.commit()
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1972, in commit
    trans.commit(_to_root=True)
  File "<string>", line 2, in commit
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/state_changes.py", line 139, in _go
    ret_value = fn(self, *arg, **kw)
                ^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1257, in commit
    self._prepare_impl()
  File "<string>", line 2, in _prepare_impl
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/state_changes.py", line 139, in _go
    ret_value = fn(self, *arg, **kw)
                ^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1232, in _prepare_impl
    self.session.flush()
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 4296, in flush
    self._flush(objects)
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 4431, in _flush
    with util.safe_reraise():
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/util/langhelpers.py", line 146, in __exit__
    raise exc_value.with_traceback(exc_tb)
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 4392, in _flush
    flush_context.execute()
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/unitofwork.py", line 466, in execute
    rec.execute(self)
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/unitofwork.py", line 642, in execute
    util.preloaded.orm_persistence.save_obj(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
    _emit_insert_statements(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/persistence.py", line 1143, in _emit_insert_statements
    result = connection.execute(
             ^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1408, in execute
    return meth(
           ^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/sql/elements.py", line 513, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1630, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1834, in _execute_context
    return self._exec_insertmany_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2112, in _exec_insertmany_context
    self._handle_dbapi_exception(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2338, in _handle_dbapi_exception
    raise exc_info[1].with_traceback(exc_info[2])
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2104, in _exec_insertmany_context
    dialect.do_execute(
  File "/root/.cache/pypoetry/virtualenvs/llama-app-backend--Qk0ygDj-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 924, in do_execute
    cursor.execute(statement, parameters)
ValueError: A string literal cannot contain NUL (0x00) characters.
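In this trace the index-load failure is again the expected fallback; the real failure is the final ValueError, raised because Postgres text columns reject strings containing NUL (0x00) bytes, which sometimes appear in text extracted from PDFs. A generic workaround (a sketch, not code from this repo) is to strip NUL bytes from document text before it reaches the vector store:

```python
def strip_nul(text: str) -> str:
    """Remove NUL (0x00) characters, which Postgres text columns reject."""
    return text.replace("\x00", "")

# Example: sanitize extracted page text before building the index.
raw = "10-K filing\x00 with stray NUL bytes"
clean = strip_nul(raw)
print(clean)  # -> "10-K filing with stray NUL bytes"
```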
