
Responses differ under concurrent requests #1571

Open
ghntd opened this issue May 9, 2024 · 0 comments

ghntd commented May 9, 2024

I want to benchmark my fine-tuned model under the lmdeploy framework, so I need the lmdeploy inference engine to generate stable, reproducible output. I pinned the random_seed parameter that api_server.py uses for random sampling and set the temperature to 0.01. With these settings the output is stable when requests are sent serially, but when I run asynchronous concurrent tests the generated results vary slightly. How can I get deterministic output?
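Not an answer from the thread, but a common explanation for this kind of drift: under dynamic batching, the floating-point reduction order inside the GPU kernels depends on how requests happen to be batched together, and floating-point addition is not associative, so logits can differ by a few ULPs between runs. When two candidate tokens have near-tied logits, that tiny difference can flip the selected token even with a fixed seed and temperature near 0. A minimal sketch of the non-associativity itself (assumes only NumPy; the specific values are illustrative, not taken from lmdeploy):

```python
import numpy as np

# float32 has a 24-bit mantissa, so near 1e8 the spacing between
# representable values is 8.0 -- adding 1.0 there is simply lost.
a = np.float32(1e8)
b = np.float32(1.0)

left = (a + b) - a   # 1e8 + 1 rounds back to 1e8, so this is 0.0
right = (a - a) + b  # reassociated: exactly 1.0

print(left, right)   # 0.0 1.0
```

The same effect, spread across thousands of summands in a matmul or softmax reduction, is why identical requests can decode slightly differently depending on the batch they land in. It suggests the variation is numerical rather than a sampling-seed problem.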
