Describe the feature
We are excited to announce support for the qwen2 model in the ColossalAI framework. The qwen2 model requires version 4.39.3 of the transformers library, as transformers 4.36 does not support qwen2.
The qwen2 model inherits most of the optimization capabilities of ColossalAI and integrates seamlessly with the transformers library. Users can now use qwen2 for tasks such as pretraining and fine-tuning, just as they would other models like llama2.
With the integration of the qwen2 model, ColossalAI continues to empower users with state-of-the-art models and extensive optimization functionalities.
Environment:
transformers = 4.39.3