
MixLoRA (LoRA + MoE) and related improvements are available at mikecovlee/mlora #186

Open
mikecovlee opened this issue Mar 16, 2024 · 1 comment
@mikecovlee (Member)

The authors of m-LoRA maintain an actively developed fork of the official m-LoRA repository, focused on LoRA + MoE and related improvements.
URL: https://github.com/mikecovlee/mlora
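For readers unfamiliar with the combination, here is a rough sketch of the LoRA + MoE idea behind MixLoRA: a frozen base weight is augmented with several low-rank adapter "experts", and a router mixes the top-k experts per input. All names, shapes, and routing details below are illustrative assumptions, not the fork's actual implementation.

```python
import numpy as np

# Minimal sketch of a mixture of LoRA experts (MixLoRA-style), assuming:
# a frozen base weight W, n_experts low-rank adapters (B_i @ A_i), and a
# linear router that picks the top-k experts per input vector.

rng = np.random.default_rng(0)
d, r, n_experts, top_k = 8, 2, 4, 2

W = rng.standard_normal((d, d))                    # frozen base weight
A = rng.standard_normal((n_experts, r, d)) * 0.01  # LoRA down-projections
B = np.zeros((n_experts, d, r))                    # LoRA up-projections (zero-init, as in LoRA)
W_router = rng.standard_normal((n_experts, d))     # router weights (hypothetical)

def mixlora_forward(x):
    # Base path: the frozen pretrained projection.
    y = W @ x
    # Router: score all experts, keep the top-k, softmax their scores.
    logits = W_router @ x
    idx = np.argsort(logits)[-top_k:]
    probs = np.exp(logits[idx] - logits[idx].max())
    probs /= probs.sum()
    # Add the weighted low-rank updates of the selected experts.
    for w, i in zip(probs, idx):
        y = y + w * (B[i] @ (A[i] @ x))
    return y

x = rng.standard_normal(d)
y = mixlora_forward(x)
```

Because the up-projections `B` are zero-initialized, the mixture starts out exactly equal to the frozen base path, which is the standard LoRA trick for a stable warm start.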

@mikecovlee mikecovlee added the enhancement New feature or request label Mar 16, 2024
@mikecovlee mikecovlee self-assigned this Mar 16, 2024
@mikecovlee mikecovlee pinned this issue Mar 16, 2024
@mikecovlee (Member, Author)

We also provide the following features in this fork:

  • Automatic evaluation across popular NLP tasks, QA tasks, etc.
  • Training and evaluation of models for sequence classification tasks.
  • Half-precision training and inference.
  • Built-in multi-LoRA inference.

These features will be added to this repository once they are stable and deemed to be of sufficient quality.
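To illustrate what built-in multi-LoRA inference in half precision can look like, here is a minimal sketch in which each request in a batch selects its own adapter over one shared base weight. Everything here (names, shapes, the per-row adapter loop) is a hypothetical illustration, not the fork's actual API.

```python
import numpy as np

# Hypothetical sketch: serve several LoRA adapters over one shared, frozen
# base weight, with each request in the batch choosing its own adapter,
# with all tensors stored in half precision (float16).

rng = np.random.default_rng(1)
d, r, n_adapters, batch = 6, 2, 3, 4

W = rng.standard_normal((d, d)).astype(np.float16)                        # shared base weight
A = (rng.standard_normal((n_adapters, r, d)) * 0.01).astype(np.float16)   # per-adapter down-proj
B = (rng.standard_normal((n_adapters, d, r)) * 0.01).astype(np.float16)   # per-adapter up-proj

def multi_lora_forward(X, adapter_ids):
    """X: (batch, d) activations; adapter_ids: which adapter each row uses."""
    Y = X @ W.T                            # base path, shared by all requests
    for i, a in enumerate(adapter_ids):    # per-row low-rank adapter path
        Y[i] += B[a] @ (A[a] @ X[i])
    return Y

X = rng.standard_normal((batch, d)).astype(np.float16)
Y = multi_lora_forward(X, [0, 2, 1, 0])
```

The per-row loop is only for clarity; a real serving path would gather the adapter weights and apply them as one batched matmul.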
