
LLMFactory error (LLMFactory错误) #324

Open
jchzhao opened this issue Mar 5, 2024 · 6 comments
Assignees
Labels
bug (Something isn't working), llm (issues about llm usage)

Comments

jchzhao commented Mar 5, 2024

ImportError: cannot import name 'LLMFactory' from 'modelscope_agent.llm' (/mnt/workspace/modelscope-agent/modelscope_agent/llm/__init__.py)
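One way to confirm which names a given release actually exports is a small diagnostic (a generic sketch; `has_name` is a hypothetical helper, not part of modelscope-agent):

```python
import importlib

def has_name(module_path: str, name: str) -> bool:
    """Return True if `name` can be imported from `module_path`."""
    try:
        module = importlib.import_module(module_path)
    except ImportError:
        return False
    return hasattr(module, name)

# On a modelscope-agent install that hits the error above, this would
# return False, reproducing the ImportError without a full traceback:
# has_name('modelscope_agent.llm', 'LLMFactory')
```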

zzhangpurdue (Collaborator) commented:

Which branch are you using?

xl4533 commented Mar 11, 2024

I have the same problem.

xl4533 commented Mar 11, 2024

master branch

zzhangpurdue added the llm (issues about llm usage) and bug (Something isn't working) labels on Mar 12, 2024
zzhangpurdue (Collaborator) commented:

Could you please provide your code so we can reproduce the error? Thanks.

> master branch

zzhangpurdue self-assigned this on Mar 17, 2024
xl4533 commented Mar 18, 2024

A start.py file was created in the root directory, but an error occurred while importing the package. This code ran normally in previous versions.

Local LLM configuration:

import os
from modelscope.utils.config import Config
from modelscope_agent.llm import LLMFactory
from modelscope_agent.agent import AgentExecutor
from modelscope_agent.tools import SearchKnowledgeTool
import json
import requests
import torch

model_name = 'meetingroom-7b'
model_cfg = {
    'meetingroom-7b': {
        'type': 'modelscope',
        'model_id': 'meetingroom-7b',
        'model_revision': 'v1.0.0',
        'use_raw_generation_config': True,
        'custom_chat': True
    }
}

tool_cfg_file = os.getenv('TOOL_CONFIG_FILE', 'config/cfg_tool_template.json')
tool_cfg = Config.from_file(tool_cfg_file)
llm = LLMFactory.build_llm(model_name, model_cfg)
agent = AgentExecutor(llm, tool_cfg, tool_retrieval=False)

# agent.set_available_tools(available_tool_list)

Single-step tool-use

first = agent.run("预定一下明天下午三点的大型会议室", remote=False)
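Editorial note: since the snippet above targets the 0.2.x API, one possible workaround (an assumption, not suggested by the maintainers in this thread) is to pin the last 0.2.x release before running it:

```shell
# Pin a pre-0.3 release so LLMFactory and AgentExecutor are still exported
pip install "modelscope-agent<0.3"
python start.py
```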

zzhangpurdue (Collaborator) commented Mar 18, 2024

I am very sorry: we made a backward-incompatible upgrade from 0.2.x to 0.3.x, so some of the APIs have changed. Your code is a 0.2.x example and will not run on 0.3.x.
Could you please run the demo at https://github.com/modelscope/modelscope-agent/blob/master/demo/demo_modelscopegpt_agent.ipynb and try again?

We will avoid such problems in future upgrades.
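For readers migrating, here is a minimal sketch of the 0.3.x-style entry point. The `RolePlay` class and the `llm` config keys below are taken from the project's README at the time; the `'qwen-max'` / `'dashscope'` values are assumptions and would need to be swapped for a local model setup:

```python
# Sketch of 0.3.x usage: LLMFactory/AgentExecutor were replaced by agent
# classes that take an llm config dict directly (assumed from the README).
llm_config = {
    'model': 'qwen-max',         # assumption: swap in your own model id
    'model_server': 'dashscope'  # assumption: backend serving the model
}

try:
    from modelscope_agent.agents import RolePlay
    bot = RolePlay(function_list=[], llm=llm_config)
    print(bot.run('预定一下明天下午三点的大型会议室'))
except ImportError:
    print('modelscope-agent >= 0.3 is not installed')
```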
