Routine checks
Your version
Problem description, log screenshots
For the configuration items in the config.json file shown in the screenshot, how should they be filled in correctly for different models?
For example, with Meta-Llama-3-8B-Instruct, how do its parameters map to the entries in config.json?
I ask because when chatting with Meta-Llama-3-8B-Instruct, I see the behavior shown in the screenshot:
Steps to reproduce
Expected result
Related screenshots
@c121914yu @nongmo677
Isn't this a model issue?
What I mean is the correspondence between and @c121914yu @nongmo677
Not sure what version this is ~ I only know that the official llama3 has an 8k max context. Temperature is generally 0-1.
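Based on that comment (llama3's official 8k context window and a 0-1 temperature range), a model entry in config.json might look roughly like the sketch below. The field names here are assumptions modeled on a typical `llmModels` entry and may differ across versions of the project; check the project's own configuration documentation before copying.

```json
{
  "llmModels": [
    {
      "model": "Meta-Llama-3-8B-Instruct",
      "name": "Meta-Llama-3-8B-Instruct",
      "maxContext": 8192,
      "maxResponse": 4096,
      "maxTemperature": 1.0,
      "vision": false,
      "toolChoice": false,
      "functionCall": false
    }
  ]
}
```

The key point from the thread: `maxContext` should not exceed what the model actually supports (8192 tokens for the official Meta-Llama-3-8B-Instruct), and the temperature ceiling should stay within the 0-1 range the model expects.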