For AutoModelForSequenceClassification, simple padding is sufficient.
There are also options for Permutation Language Modelling and Whole Word Masking. Kindly suggest, @andreyvelich @johnugeorge.
Thank you for your interest @live2awesome!
It would be nice if you could let us know what changes we need to make to our HF LLM Trainer to support Data Collators for other Transformers.
Also, we should discuss whether to add a Data Collator by default for all supported Transformers.
More context: #2031 (comment).
Currently, we apply a HuggingFace Data Collator only for the
AutoModelForCausalLM
Transformer in the HF LLM Trainer. We need to investigate whether we should apply it to other Transformers for language modelling as well.
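One way to frame the discussion: a per-model-class table of default collators. The sketch below is a hypothetical dispatch, not the trainer's actual code; the collator class names on the right are real `transformers` collators (`DataCollatorForLanguageModeling`, `DataCollatorWithPadding`, `DataCollatorForWholeWordMask`, `DataCollatorForPermutationLanguageModeling`), but the mapping itself and the helper name `default_collator_for` are assumptions for discussion.

```python
# Hypothetical sketch: which transformers Data Collator the HF LLM Trainer
# could pick by default for each Auto* model class. Names are stored as
# strings here so the sketch stays self-contained; in the trainer they
# would be imported from `transformers` and instantiated with a tokenizer.

DEFAULT_COLLATORS = {
    # What the trainer applies today: LM collator with mlm=False (causal LM).
    "AutoModelForCausalLM": ("DataCollatorForLanguageModeling", {"mlm": False}),
    # Simple padding is sufficient for sequence classification.
    "AutoModelForSequenceClassification": ("DataCollatorWithPadding", {}),
    # Masked LM: same collator with mlm=True; DataCollatorForWholeWordMask
    # is an alternative for whole-word masking, and
    # DataCollatorForPermutationLanguageModeling targets XLNet-style
    # permutation language modelling.
    "AutoModelForMaskedLM": ("DataCollatorForLanguageModeling", {"mlm": True}),
}


def default_collator_for(model_class: str):
    """Return (collator class name, kwargs), or None when there is no
    sensible default and the user should supply a collator explicitly."""
    return DEFAULT_COLLATORS.get(model_class)
```

In the trainer itself this would presumably resolve the string to the actual class, e.g. `DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)`; whether unknown model classes should fall back to `None` or to plain padding is part of what needs deciding.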