A few questions and suggestions #9

Open
shuaidaming opened this issue Dec 11, 2023 · 3 comments

Comments

@shuaidaming

1. Use C++ for acceleration, as llama.cpp does. You could build a mixtral.cpp that supports mixtral-8x7b and mixtral-7b, with flexible switching between precisions such as f32 and f16.
2. Provide code for full-parameter training and prompt-learning fine-tuning, along with the corresponding JSON data format.

@tonysy
Contributor

tonysy commented Dec 11, 2023

Good suggestions.
For 2, our team already supports fine-tuning. See https://github.com/InternLM/xtuner/tree/main/xtuner/configs/mixtral for more information.
For 1, stay tuned.

Thanks.

@shuaidaming
Author

What about training from scratch? I'd like to modify dim, hidden_dim, vocab_size, and so on. Could you provide a train.py?
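
For reference, this is not an official train.py from this repo, only a minimal sketch of how a from-scratch Mixtral with custom dim, hidden_dim, and vocab_size could be instantiated with Hugging Face transformers (where dim corresponds to hidden_size and hidden_dim to intermediate_size in MixtralConfig). All the sizes below are illustrative assumptions, not recommended settings.

```python
# Minimal sketch (not an official train.py): build a smaller Mixtral from
# scratch so dim / hidden_dim / vocab_size can be changed freely.
from transformers import MixtralConfig, MixtralForCausalLM

config = MixtralConfig(
    vocab_size=32000,          # replace with your own tokenizer's vocab size
    hidden_size=1024,          # "dim" in the comment above
    intermediate_size=3584,    # "hidden_dim" (per-expert FFN width)
    num_hidden_layers=12,
    num_attention_heads=16,
    num_key_value_heads=4,
    num_local_experts=8,
    num_experts_per_tok=2,
)

model = MixtralForCausalLM(config)   # randomly initialized, ready for pre-training
print(f"parameters: {sum(p.numel() for p in model.parameters()) / 1e6:.1f}M")

# From here, a standard causal-LM training loop (e.g. transformers.Trainer
# or an xtuner config) would handle the actual pre-training.
```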

@shuaidaming
Author

Incidentally, regarding the examples in the data JSON files, the formats that should probably be supported are: supervised (QA pairs and multi-turn dialogue) and unsupervised (long documents).
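
To make the suggestion concrete, here is a purely hypothetical illustration of the three record shapes described above (supervised QA pair, supervised multi-turn dialogue, unsupervised long document). The field names are assumptions for illustration only, not an official format of this repo or xtuner.

```python
import json

# Hypothetical record shapes; the field names are NOT an official format.
supervised_qa = {                      # supervised: single QA pair
    "conversation": [
        {"input": "What is a Mixture-of-Experts layer?",
         "output": "A layer that routes each token to a small subset of expert FFNs."}
    ]
}

multi_turn = {                         # supervised: multi-turn dialogue
    "conversation": [
        {"input": "Hello", "output": "Hi, how can I help?"},
        {"input": "How many experts does Mixtral-8x7B use per token?",
         "output": "Two of its eight experts per token."}
    ]
}

unsupervised_doc = {                   # unsupervised: long document for pre-training
    "text": "A long plain-text document used directly for language-model pre-training ..."
}

with open("train.json", "w", encoding="utf-8") as f:
    json.dump([supervised_qa, multi_turn, unsupervised_doc], f, ensure_ascii=False, indent=2)
```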
