
本地部署时,执行tool出错 (Error executing a tool in local deployment) #325

Open
hbxmao opened this issue Mar 6, 2024 · 9 comments

hbxmao commented Mar 6, 2024

I set up the environment following this guide: https://github.com/modelscope/modelscope-agent/blob/master/docs/local_deploy.md. How are tools invoked at execution time? For example, when I configure the "语音生成" (speech generation) tool, which calls a remote API, I get the following error.

The error log is as follows:
2024-03-06 17:06:43.901 - modelscope-agent - INFO - | message: using builder model qwen1.5-4b-chat | uuid: local_user | details: {} | step: | error:
2024-03-06 17:06:46,909 - modelscope - WARNING - Model revision not specified, use revision: v1.1.0
2024-03-06 17:06:47,418 - modelscope - INFO - initiate model from /mnt/workspace/.cache/modelscope/damo/nlp_gte_sentence-embedding_chinese-base
2024-03-06 17:06:47,418 - modelscope - INFO - initiate model from location /mnt/workspace/.cache/modelscope/damo/nlp_gte_sentence-embedding_chinese-base.
2024-03-06 17:06:47,421 - modelscope - INFO - initialize model from /mnt/workspace/.cache/modelscope/damo/nlp_gte_sentence-embedding_chinese-base
2024-03-06 17:06:48,974 - modelscope - WARNING - No preprocessor field found in cfg.
2024-03-06 17:06:48,974 - modelscope - WARNING - No val key and type key found in preprocessor domain of configuration.json file.
2024-03-06 17:06:48,974 - modelscope - WARNING - Cannot find available config to build preprocessor at mode inference, current config: {'model_dir': '/mnt/workspace/.cache/modelscope/damo/nlp_gte_sentence-embedding_chinese-base'}. trying to build by task and model information.
2024-03-06 17:06:49,012 - modelscope - WARNING - No preprocessor field found in cfg.
2024-03-06 17:06:49,012 - modelscope - WARNING - No val key and type key found in preprocessor domain of configuration.json file.
2024-03-06 17:06:49,012 - modelscope - WARNING - Cannot find available config to build preprocessor at mode inference, current config: {'model_dir': '/mnt/workspace/.cache/modelscope/damo/nlp_gte_sentence-embedding_chinese-base', 'sequence_length': 128}. trying to build by task and model information.
2024-03-06 17:06:49.122 - modelscope-agent - INFO - | message: using model qwen1.5-4b-chat | uuid: local_user | details: {'model_config': {'type': 'openai', 'model': 'qwen/Qwen1.5-4B-Chat', 'api_base': 'http://localhost:8000/v1', 'is_chat': True, 'is_function_call': False, 'generate_cfg': {'top_p': 0.5, 'stop': 'Observation'}}} | step: | error:
2024-03-06 17:06:49.155 - modelscope-agent - ERROR - | message: | uuid: local_user | details: {'error_traceback': 'Traceback (most recent call last):\n File "/data/work/modelscope-agent/apps/agentfabric/app.py", line 31, in init_user\n user_agent, user_memory = init_user_chatbot_agent(uuid_str)\n File "/data/work/modelscope-agent/apps/agentfabric/user_core.py", line 42, in init_user_chatbot_agent\n agent = RolePlay(\n File "/data/work/modelscope-agent/modelscope_agent/agent.py", line 47, in __init__\n self._register_tool(function)\n File "/data/work/modelscope-agent/modelscope_agent/agent.py", line 109, in _register_tool\n self.function_map[tool_name] = TOOL_REGISTRY[tool_name]\n File "/data/work/modelscope-agent/modelscope_agent/tools/modelscope_tools/pipeline_tool.py", line 29, in __init__\n assert self.api_token is not None, 'api_token is not set'\nAssertionError: api_token is not set\n'} | step: | error: api_token is not set

Also, is there a sample program for calling a local tool?

hbxmao changed the title from 本地部署时 to 本地部署时,执行tool出错 on Mar 6, 2024
mushenL (Collaborator) commented Mar 7, 2024

Hello, judging from your error message, the corresponding api_token has not been set in your environment variables. You can set the MODELSCOPE_API_TOKEN environment variable; the api_token used by each tool is documented in the README for reference. As for calling tools locally: you can follow the local model-deployment approach, deploy the model the tool uses as a service on your machine, and change the url in the config file to your service endpoint. If you need to load it locally and directly instead, modelscope_agent/tools/modelscope_tools/pipeline_tool.py exposes a _local_call interface function, but you will need to adapt the tool's model yourself.
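
For example, a minimal way to provide the token before AgentFabric constructs the agent and its tools (MODELSCOPE_API_TOKEN is the variable named above; the placeholder value is yours to fill in):

import os

# Set the ModelScope API token before the agent and tools are built, so the
# `assert self.api_token is not None` check in pipeline_tool.py passes.
os.environ['MODELSCOPE_API_TOKEN'] = '<your ModelScope SDK token>'

Equivalently, export MODELSCOPE_API_TOKEN in the shell that launches apps/agentfabric/app.py.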

hbxmao (Author) commented Mar 8, 2024


If I don't use a model, but instead a local tool (an exe) or an API function (a dll), how do I integrate it?

mushenL (Collaborator) commented Mar 11, 2024

This is not supported at the moment; you may need to refer to the related call functions in modelscope_agent/tools/modelscope_tools/pipeline_tool.py and adapt it yourself.

zzhangpurdue added the tool (questions with tool) label Mar 12, 2024
zzhangpurdue (Collaborator) commented:


If you can wrap the corresponding exe or dll as a service and deploy it, it can be called that way as well.
Alternatively, create a new tool, implement the subprocess invocation inside the tool, and monitor the subprocess's result.
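
A minimal sketch of the second option, assuming the BaseTool/register_tool interface from modelscope_agent.tools.base (class and method names may differ between versions; the tool name, executable path and parameter here are purely illustrative):

import json
import subprocess

from modelscope_agent.tools.base import BaseTool, register_tool


@register_tool('local_exe_tool')  # illustrative name; must match the key used in tool_config.json
class LocalExeTool(BaseTool):
    description = 'Runs a local executable and returns its stdout.'
    name = 'local_exe_tool'
    parameters = [{
        'name': 'text',
        'description': 'Input passed to the executable',
        'required': True,
        'type': 'string'
    }]

    def call(self, params: str, **kwargs) -> str:
        args = json.loads(params)  # params arrive as a JSON string from the LLM
        # Invoke the local exe as a subprocess and monitor its result.
        proc = subprocess.run(['/path/to/your_tool.exe', args['text']],
                              capture_output=True, text=True, timeout=60)
        if proc.returncode != 0:
            return 'tool failed: ' + proc.stderr
        return proc.stdout

The registered name then also needs an entry in apps/agentfabric/config/tool_config.json so AgentFabric can enable it.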

zzhangpurdue added the enhancement (New feature or request) label Mar 12, 2024
zzhangpurdue self-assigned this Mar 17, 2024
cgwyx commented Mar 19, 2024

The local text-to-image service is up, but even after modifying modelscope-agent\apps\agentfabric\config\tool_config.json as follows:

"image_gen": {
"name": "Wanx Image Generation",
"url": "http://localhost:8000/v1",
"is_active": true,
"use": true,
"is_remote_tool": true
也无法调用本地文生图tool

zzhangpurdue (Collaborator) commented:


Is there any error output? We will follow up on it.

cgwyx commented Mar 21, 2024


It doesn't report an error; it simply never calls the local text-to-image API:
2024-03-21 09:58:27.817 - modelscope-agent - INFO - | message: frame | uuid: local_user | details: {'frame': "{'exec_result': {'result': Config (path: /tmp/agentfabric/config/local_user/builder_config.json): {'name': '文生图创作助手', 'avatar': 'custom_bot_avatar.png', 'description': '一个能将输入文本转化为对应视觉图像的智能创作工具', 'instruction': '根据用户提供的文本描述生成相应风格和主题的图像;支持多种应用场景如艺术创作、商业设计与数据可视化', 'prompt_recommend': ['生成 一张关于夏天海滩的图片', '将这段描述转化为一幅卡通画', '依据这段文字制作一张信息图表'], 'knowledge': [], 'tools': {'image_gen': {'name': 'Wanx Image Generation', 'url': 'http://127.0.0.1:8889/v1/generation/text-to-image', 'is_active': True, 'use': True, 'is_remote_tool': False}, 'code_interpreter': {'name': 'Code Interpreter', 'is_active': True, 'use': False, 'is_remote_tool': False, 'max_output': 2000}, 'web_browser

cgwyx commented Mar 21, 2024

I modified the config under modelscope-agent\apps\agentfabric\config to add a fooocus_image_gen tool and started fooocus-api (which works fine from the command line), but AgentFabric still cannot call it.

"fooocus_image_gen": {
"name": "fooocus Image Generation",
"url": "http://127.0.0.1:8889/v1/generation/text-to-image",
"is_active": true,
"use": true,
"is_remote_tool": true
},

Error log:
2024-03-21 14:49:13.448 - modelscope-agent - INFO - | message: builder_cfg | uuid: local_user | details: {'builder_cfg': "Config (path: /tmp/agentfabric/config/local_user/builder_config.json): {'name': '文生图-人物写实助手', 'avatar': 'custom_bot_avatar.png', 'description': '一款专门针对人物写实风格设计的AI-Agent,能够精准地将用户输入的文字描述转换为逼真的 人物画像,适用于肖像创作、个性化定制和虚拟形象生成等多种场景。', 'instruction': '深度理解并精确捕捉用户输入文本中的人物 特征与细节;运用先进的图像生成技术,确保生成的人物图像具有高度的写实性与艺术感染力', 'prompt_recommend': ['请根据这个人 物描述生成一张写实风格的肖像画', '我想看一幅描绘我朋友特点的写实风格画作', '基于这段小说主角描述创建一张生动的肖像', '能否按照历史人物记载生成其写实画像?'], 'knowledge': [], 'tools': {'image_gen': {'name': 'Wanx Image Generation', 'is_active': True, 'use': False, 'is_remote_tool': True}, 'fooocus_image_gen': {'name': 'fooocus Image Generation', 'url': 'http://127.0.0.1:8889/v1/generation/text-to-image', 'is_active': True, 'use': True, 'is_remote_tool': True}, 'code_interpreter': {'name': 'Code Interpreter', 'is_active': True, 'use': False, 'is_remote_tool': False, 'max_output': 2000}, 'web_browser': {'name': 'Web Browsing', 'is_active': True, 'use': False, 'max_browser_length': 2000}, 'amap_weather': {'name': '高德天气', 'is_active': True, 'use': False}, 'paraformer_asr': {'name': 'Paraformer语音识别', 'is_active': True, 'use': False, 'is_remote_tool': True}, 'sambert_tts': {'name': 'Sambert语音合成', 'is_active': True, 'use': False, 'is_remote_tool': True}, 'wordart_texture_generation': {'name': '艺术字纹理生成', 'is_active': True, 'use': False}, 'web_search': {'name': 'Web Searching', 'is_active': True, 'use': False, 'searcher': 'bing'}, 'qwen_vl': {'name': 'Qwen-VL识图', 'is_active': True, 'use': False}, 'style_repaint': {'name': '人物风格重绘', 'is_active': True, 'use': False}, 'image_enhancement': {'name': '追影-放大镜', 'is_active': True, 'use': False}, 'text-address': {'name': '地址解析', 'url': 'https://api-inference.modelscope.cn/api-inference/v1/models/damo/mgeo_geographic_elements_tagging_chinese_base', 'use': False, 'is_active': True, 'is_remote_tool': True}, 'speech-generation': {'name': '语音生成', 'url': 'https://api-inference.modelscope.cn/api-inference/v1/models/damo/speech_sambert-hifigan_tts_zh-cn_16k', 'use': False, 'is_active': True, 'is_remote_tool': False}, 'video-generation': {'name': '视频生成', 'url': 'https://api-inference.modelscope.cn/api-inference/v1/models/damo/text-to-video-synthesis', 'use': False, 'is_active': True, 'is_remote_tool': True}}, 'model': 'qwen-max', 'language': 'zh'}"} | step: | error:
2024-03-21 14:49:13.517 - modelscope-agent - INFO - | message: using model qwen-max | uuid: local_user | details: {'model_config': {'type': 'dashscope', 'model': 'qwen-max', 'length_constraint': {'knowledge': 4000, 'input': 6000}, 'generate_cfg': {'use_raw_prompt': True, 'top_p': 0.5, 'stop': 'Observation'}}} | step: | error:
2024-03-21 14:49:13.524 - modelscope-agent - ERROR - | message: | uuid: local_user | details: {'error_traceback': 'Traceback (most recent call last):\n File "/agentfabric/modelscope-agent/apps/agentfabric/app.py", line 31, in init_user\n user_agent, user_memory = init_user_chatbot_agent(uuid_str)\n File "/agentfabric/modelscope-agent/apps/agentfabric/user_core.py", line 42, in init_user_chatbot_agent\n agent = RolePlay(\n File "/agentfabric/modelscope-agent/modelscope_agent/agent.py", line 47, in __init__\n self._register_tool(function)\n File "/agentfabric/modelscope-agent/modelscope_agent/agent.py", line 111, in _register_tool\n raise NotImplementedError\nNotImplementedError\n'} | step: | error:
2024-03-21 14:49:13.567 - modelscope-agent - INFO - | message: using builder model qwen-max | uuid: local_user | details: {} | step: | error:
2024-03-21 14:49:14,435 - modelscope - WARNING - Model revision not specified, use revision: v1.1.0
2024-03-21 14:49:14,852 - modelscope - INFO - initiate model from /mdscope/modelscope_cache/damo/nlp_gte_sentence-embedding_chinese-base
2024-03-21 14:49:14,859 - modelscope - INFO - initiate model from location /mdscope/modelscope_cache/damo/nlp_gte_sentence-embedding_chinese-base.
2024-03-21 14:49:14,896 - modelscope - INFO - initialize model from /mdscope/modelscope_cache/damo/nlp_gte_sentence-embedding_chinese-base
2024-03-21 14:49:23,898 - modelscope - WARNING - No preprocessor field found in cfg.
2024-03-21 14:49:23,899 - modelscope - WARNING - No val key and type key found in preprocessor domain of configuration.json file.
2024-03-21 14:49:23,899 - modelscope - WARNING - Cannot find available config to build preprocessor at mode inference, current config: {'model_dir': '/mdscope/modelscope_cache/damo/nlp_gte_sentence-embedding_chinese-base'}. trying to build by task and model information.
2024-03-21 14:49:24,124 - modelscope - WARNING - No preprocessor field found in cfg.
2024-03-21 14:49:24,124 - modelscope - WARNING - No val key and type key found in preprocessor domain of configuration.json file.
2024-03-21 14:49:24,124 - modelscope - WARNING - Cannot find available config to build preprocessor at mode inference, current config: {'model_dir': '/mdscope/modelscope_cache/damo/nlp_gte_sentence-embedding_chinese-base', 'sequence_length': 128}. trying to build by task and model information.
Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/gradio/queueing.py", line 456, in call_prediction
output = await route_utils.call_process_api(
File "/opt/conda/lib/python3.10/site-packages/gradio/route_utils.py", line 232, in call_process_api
output = await app.get_blocks().process_api(
File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1522, in process_api
result = await self.call_function(
File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1156, in call_function
prediction = await utils.async_iteration(iterator)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 515, in async_iteration
return await iterator.__anext__()
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 508, in __anext__
return await anyio.to_thread.run_sync(
File "/opt/conda/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "/opt/conda/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
return await future
File "/opt/conda/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 491, in run_sync_iterator_async
return next(iterator)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 662, in gen_wrapper
yield from f(*args, **kwargs)
File "/agentfabric/modelscope-agent/apps/agentfabric/app.py", line 551, in preview_send_message
user_agent = _state['user_agent']
KeyError: 'user_agent'
Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/gradio/queueing.py", line 456, in call_prediction
output = await route_utils.call_process_api(
File "/opt/conda/lib/python3.10/site-packages/gradio/route_utils.py", line 232, in call_process_api
output = await app.get_blocks().process_api(
File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1522, in process_api
result = await self.call_function(
File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1156, in call_function
prediction = await utils.async_iteration(iterator)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 515, in async_iteration
return await iterator.__anext__()
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 508, in __anext__
return await anyio.to_thread.run_sync(
File "/opt/conda/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "/opt/conda/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
return await future
File "/opt/conda/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 491, in run_sync_iterator_async
return next(iterator)
File "/opt/conda/lib/python3.10/site-packages/gradio/utils.py", line 662, in gen_wrapper
yield from f(*args, **kwargs)
File "/agentfabric/modelscope-agent/apps/agentfabric/app.py", line 551, in preview_send_message
user_agent = _state['user_agent']
KeyError: 'user_agent'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/conda/lib/python3.10/site-packages/gradio/queueing.py", line 501, in process_events
response = await self.call_prediction(awake_events, batch)
File "/opt/conda/lib/python3.10/site-packages/gradio/queueing.py", line 465, in call_prediction
raise Exception(str(error) if show_error else None) from error
Exception: 'user_agent'
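
Note: judging from the traceback above, _register_tool raises NotImplementedError when a tool name listed in tool_config.json has no matching implementation in the tool registry, so adding a fooocus_image_gen entry to the config alone is not enough; a corresponding tool class would also need to be registered, roughly like the subprocess sketch earlier. A compact sketch (the register_tool/BaseTool names are assumed from modelscope_agent.tools.base, and the fooocus-api request body is only illustrative):

import json
import requests

from modelscope_agent.tools.base import BaseTool, register_tool


@register_tool('fooocus_image_gen')  # must match the key added to tool_config.json
class FooocusImageGen(BaseTool):
    description = 'Text-to-image generation through a local fooocus-api service.'
    name = 'fooocus_image_gen'
    parameters = [{
        'name': 'prompt',
        'description': 'Text prompt for the image',
        'required': True,
        'type': 'string'
    }]

    def call(self, params: str, **kwargs) -> str:
        args = json.loads(params)
        # Illustrative request body; consult fooocus-api's docs for the full schema.
        resp = requests.post('http://127.0.0.1:8889/v1/generation/text-to-image',
                             json={'prompt': args['prompt']}, timeout=300)
        resp.raise_for_status()
        return resp.text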

RookieDay commented:

For mac: delete the agentfabric folder under /tmp and run it again.
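
A quick way to do that from Python (the /tmp/agentfabric path matches the builder_config.json path shown in the logs above; adjust if your deployment stores its config elsewhere):

import shutil

# Remove AgentFabric's cached per-user config so it is rebuilt cleanly on the next start.
shutil.rmtree('/tmp/agentfabric', ignore_errors=True)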
