
[BUG] AttributeError: 'coroutine' object has no attribute 'get' #807

Open
salvogs opened this issue May 6, 2024 · 12 comments
Labels
bug Something isn't working

Comments

@salvogs

salvogs commented May 6, 2024

My config consists of a local-cat setup with Qdrant and Ollama.

After updating from 1.5.1 to 1.6.1 I have this problem:

[screenshot: Schermata del 2024-05-06 13-17-36]

Traceback:

Traceback (most recent call last):
File "/app/cat/looking_glass/stray_cat.py", line 403, in run
cat_message = self.loop.run_until_complete(
File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
File "/app/cat/looking_glass/stray_cat.py", line 350, in call
raise e
File "/app/cat/looking_glass/stray_cat.py", line 340, in call
cat_message = await self.agent_manager.execute_agent(self)
File "/app/cat/looking_glass/agent_manager.py", line 236, in execute_agent
memory_chain_output = await self.execute_memory_chain(agent_input, prompt_prefix, prompt_suffix, stray)
File "/app/cat/looking_glass/agent_manager.py", line 170, in execute_memory_chain
return await memory_chain.ainvoke({**agent_input, "stop":"Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))
File "/usr/local/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
raise e
File "/usr/local/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/usr/local/lib/python3.10/site-packages/langchain/chains/llm.py", line 275, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
File "/usr/local/lib/python3.10/site-packages/langchain/chains/llm.py", line 142, in agenerate
return await self.llm.agenerate_prompt(
File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 643, in agenerate_prompt
return await self.agenerate(
File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 1018, in agenerate
output = await self._agenerate_helper(
File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 882, in _agenerate_helper
raise e
File "/usr/local/lib/python3.10/site-packages/langchain_core/language_models/llms.py", line 866, in _agenerate_helper
await self._agenerate(
File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 444, in _agenerate
final_chunk = await super()._astream_with_aggregation(
File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 343, in _astream_with_aggregation
async for stream_resp in self._acreate_generate_stream(prompt, stop, **kwargs):
File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 174, in _acreate_generate_stream
async for item in self._acreate_stream(
File "/app/cat/factory/ollama_utils.py", line 121, in _acreate_stream_patch
optional_detail = await response.json().get("error")
AttributeError: 'coroutine' object has no attribute 'get'

salvogs added the bug label on May 6, 2024
@bositalia
Contributor

I can confirm the bug with Ollama 1.33 and Qdrant 1.9.1; same problem here.

@valentimarco
Collaborator

#783 will resolve most of the problems with Ollama; wait some time.

@enrichicco

enrichicco commented May 11, 2024

First of all, really great job, cheshire-cat is great.

Then...

Please take my observations with some caution, since I am fairly new to this kind of environment.
Maybe I am writing a lot of rubbish...

It seems that there are two problems here:

  1. The first is in the cheshire-cat core application, in the file:

[prjroot_core]/cat/factory/ollama_utils.py

where, at line 122, the exception handling goes through an async object: because method calls bind tighter than await, .get() is called on the coroutine returned by response.json() before the parsed body is there yet:

            optional_detail = await response.json().get("error")

so the exception (I am inside the Docker environment created through your compose file) is:

cheshire_cat_core           |   File "/app/cat/factory/ollama_utils.py", line 121, in _acreate_stream_patch
cheshire_cat_core           |     optional_detail = await response.json().get("error")
cheshire_cat_core           | AttributeError: 'coroutine' object has no attribute 'get'

My personal fix here, inspired by this Stack Overflow post: 'coroutine' object has no attribute get || pyppeteer (a runnable sketch of the precedence issue follows after point 2):

                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")
  2. Apparently, with my fix, the exception handler does catch the exception, which in the end shows up on the web UI as:

Ollama call failed with status code 500. Details: option "stop" must be of type array

here is an extract of the log inside the container:

cheshire_cat_core           |   File "/app/cat/factory/ollama_utils.py", line 123, in _acreate_stream_patch
cheshire_cat_core           |     raise ValueError(
cheshire_cat_core           | ValueError: Ollama call failed with status code 500. Details: option "stop" must be of type array

which is the Ollama exception.
On this one I am still lost.
I googled a little; searching for the exact exception text turned up nothing.
I'll keep searching. However, still looking at the log, there are these two lines more or less immediately above the handler:

cheshire_cat_core           |   File "/usr/local/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 343, in _astream_with_aggregation
cheshire_cat_core           |     async for stream_resp in self._acreate_generate_stream(prompt, stop, **kwargs):

which might be interesting in order to solve this...
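
For illustration, here is a minimal, self-contained sketch of the precedence issue (FakeResponse is a hypothetical stand-in for the aiohttp response object, not part of the cat codebase):

    import asyncio

    class FakeResponse:
        # Hypothetical stand-in for the aiohttp response in ollama_utils.py
        async def json(self):
            return {"error": 'option "stop" must be of type array'}

    async def main():
        response = FakeResponse()
        try:
            # Buggy: .get() binds tighter than await, so it is called on the
            # coroutine returned by response.json(), not on the parsed dict.
            detail = await response.json().get("error")
        except AttributeError as e:
            print(e)  # 'coroutine' object has no attribute 'get'
        # Fixed: await the coroutine first, then read the field.
        body = await response.json()
        print(body.get("error"))

    asyncio.run(main())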

thank you again for this framework, really...

@enrichicco

ok... I am back... I was able to obtain at least one answer.

It seems that the problem is in this file (inside the Docker backend container):

/app/cat/looking_glass/agent_manager.py

I changed line 170 from

return await memory_chain.ainvoke({**agent_input, "stop": "Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

to

return await memory_chain.ainvoke({**agent_input, "stop":["Human:"]}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

and the call apparently worked... again, take this with caution: lightly tested (just one shot), by a beginner on both sides (the Python API and the Ollama API interface)
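
For context, the Ollama generate endpoint expects stop sequences as an array of strings inside options, which is why the bare string triggers the 500. A minimal sketch (assuming a local Ollama on the default port; "llama3" is just a placeholder model name):

    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",  # placeholder; any pulled model works
            "prompt": "Hello",
            "stream": False,
            # "stop" must be an array; a bare string such as "Human:" makes
            # Ollama reply 500 'option "stop" must be of type array'.
            "options": {"stop": ["Human:"]},
        },
    )
    resp.raise_for_status()
    print(resp.json()["response"])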

@valentimarco
Collaborator

@enrichicco thank you for the effort you are putting in! We already resolved the problem with the merge of #783 and #813 into the develop branch.
To summarize the problem:

  1. Langchain patched all the problems with Ollama (before that, we used ollama_utils.py to patch some issues)
  2. Ollama updated the body of its APIs (this is why we get the error)

With the new changes we also resolved the tool selection issue when a local model tries to select a tool (now even Phi-3 can launch tools).

@TobioDev

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

@valentimarco
Collaborator

Appreciate it, @valentimarco! Any idea when we can expect this to be part of the main branch?

Not soon; it's already in the develop branch but needs some testing. If you want, you can try it and give feedback.

@enrichicco

Hi @valentimarco,

I checked the commits and the develop branch, and it seems to me that the coroutine object error is still there, i.e., this one:

         optional_detail = await response.json().get("error")

In my view, this should be something like this:

                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")

It is not so bad, but in the case of response codes other than 200 and 404, it masks the (manually) raised ValueError (let's say, the real request exception) with the "'coroutine' object has no attribute 'get'" error.
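
To make the masking concrete, here is a hedged, runnable sketch of the corrected error branch (FakeResponse and check are hypothetical; the names are inferred from the traceback, not copied from the repo):

    import asyncio

    class FakeResponse:
        # Mimics the failing Ollama call seen in the logs
        status = 500
        async def json(self):
            return {"error": 'option "stop" must be of type array'}

    async def check(response):
        if response.status != 200:
            body = await response.json()          # await first ...
            optional_detail = body.get("error")   # ... then read the field
            # The real Ollama error now surfaces instead of the AttributeError:
            raise ValueError(
                f"Ollama call failed with status code {response.status}."
                f" Details: {optional_detail}"
            )

    asyncio.run(check(FakeResponse()))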

@valentimarco
Collaborator

Hi @valentimarco,

I checked the commits and the develop branch, and it seems to me that the coroutine object error is still there, i.e., this one:

         optional_detail = await response.json().get("error")

In my view, this should be something like this:

                    asy_optional_detail = await response.json()
                    optional_detail = asy_optional_detail.get("error")

It is not so bad, but in the case of response codes other than 200 and 404, it masks the (manually) raised ValueError (let's say, the real request exception) with the "'coroutine' object has no attribute 'get'" error.

Langchain changed the code in those parts; in fact, I deleted ollama_utils.py, and in my tests with Ollama 1.33 and 1.34 I don't have any problems. If you want, we can look at it on Discord; maybe I am forgetting something.

@mecodj

mecodj commented May 19, 2024

ok... I am back... I was able to obtain at least one answer.

It seems that the problem is in this file (inside the Docker backend container):

/app/cat/looking_glass/agent_manager.py

I changed line 170 from

return await memory_chain.ainvoke({**agent_input, "stop": "Human:"}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

to

return await memory_chain.ainvoke({**agent_input, "stop":["Human:"]}, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))

and the call apparently worked... again, take this with caution: lightly tested (just one shot), by a beginner on both sides (the Python API and the Ollama API interface)

I solved it like this too

@valentimarco
Collaborator

@Pingdred do you know if this fix also needs to be applied to the new chain?

@Pingdred
Member

@Pingdred do you know if this fix also needs to be applied to the new chain?

Not for now, because we no longer use the stop sequence (develop branch); however, we still need to test and see whether it is necessary to keep it.
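
For reference, a hedged sketch of what the memory-chain invocation could look like with the stop sequence dropped (hypothetical; the actual develop-branch code may differ):

    return await memory_chain.ainvoke(agent_input, config=RunnableConfig(callbacks=[NewTokenHandler(stray)]))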
