
[BUG] Running the plugin evaluation via eval/evaluate_plugin.py: one agent fails to download a repo from huggingface_hub #1239

Closed
2 tasks done
Plutowithcharon opened this issue May 8, 2024 · 1 comment

Comments

@Plutowithcharon

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in FAQ?

  • I have searched FAQ

Current Behavior

Trying to reproduce by running eval/evaluate_plugin.py:

````python
from transformers import Agent, AutoModelForCausalLM, AutoTokenizer, GenerationConfig


class QWenAgent(Agent):
    """
    Agent that uses QWen model and tokenizer to generate code.

    Example:

    ```py
    agent = QWenAgent()
    agent.run("Draw me a picture of rivers and lakes.")
    ```
    """

    def __init__(
        self,
        chat_prompt_template=None,
        run_prompt_template=None,
        additional_tools=None,
        tokenizer=None,
        model=None,
    ):
        if tokenizer and model:
            self.tokenizer = tokenizer
            self.model = model
        else:
            checkpoint = "Qwen/Qwen-7B-Chat"
            self.tokenizer = AutoTokenizer.from_pretrained(
                checkpoint, trust_remote_code=True
            )
            self.model = (
                AutoModelForCausalLM.from_pretrained(
                    checkpoint, device_map="auto", trust_remote_code=True
                )
                .cuda()
                .eval()
            )
            self.model.generation_config = GenerationConfig.from_pretrained(
                checkpoint, trust_remote_code=True
            )  # generation length, top_p and other hyperparameters can be set here
            self.model.generation_config.do_sample = False  # greedy decoding

        super().__init__(
            chat_prompt_template=chat_prompt_template,
            run_prompt_template=run_prompt_template,
            additional_tools=additional_tools,
        )
````

The error is raised at the `super().__init__()` call:
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1283, in hf_hub_download
    raise FileMetadataError(
huggingface_hub.utils._errors.FileMetadataError: Distant resource does not seem to be on huggingface.co. It is possible that a configuration issue prevents you from downloading resources from https://huggingface.co. Please check your firewall and proxy settings and make sure your SSL certificates are updated.
```

During initialization the agent needs to download repo_id `huggingface-tools/text-download` from huggingface_hub, but TOOL_CONFIG_FILE (`tool_config.json`) cannot be downloaded.
https://huggingface.co/huggingface-tools no longer seems to host it.
Does the author have a previously downloaded copy, or how else can this be resolved?
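To check whether the file is reachable independently of transformers, the direct download URL can be built by hand. This is a sketch that assumes the Hub's standard `resolve/main` URL layout; the repo and file names are the ones from the error above:

```python
# Build the direct download URL for the tool config file named in the
# traceback. This only constructs the string; no network access happens here.
repo_id = "huggingface-tools/text-download"
filename = "tool_config.json"  # TOOL_CONFIG_FILE
url = f"https://huggingface.co/{repo_id}/resolve/main/{filename}"
print(url)
```

If a plain `curl -I` on that URL fails from the same machine, the problem is network/proxy reachability rather than the eval script itself; if it returns 404, the repo has indeed been removed from the Hub.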

Expected Behavior

The download succeeds and the script runs normally.

Steps To Reproduce

```shell
python evaluate_plugin.py --eval-react-positive --eval-react-negative --eval-hfagent
```
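When huggingface.co itself is unreachable from the host, one commonly used workaround (a hedged suggestion, not something confirmed in this issue) is to point huggingface_hub at a mirror via its `HF_ENDPOINT` environment variable before launching the script:

```shell
# Workaround sketch: redirect Hub requests to a mirror endpoint.
# hf-mirror.com is a third-party mirror; substitute any endpoint
# reachable from your network.
export HF_ENDPOINT=https://hf-mirror.com
echo "Hub endpoint now: $HF_ENDPOINT"
```

and then re-running the `evaluate_plugin.py` command above in the same shell. This only helps if the mirror carries the repo; it cannot recover a repo that has been removed from the Hub entirely.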

Environment

- OS: centos
- Python:
- Transformers: 4.37.2
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

@jklj077
Contributor

jklj077 commented May 8, 2024

@jklj077 jklj077 closed this as completed May 22, 2024