Code for paper "UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning", ACL 2022
Code for the Findings of NAACL 2022 (Long Paper): AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
[arXiv] Cross-Modal Adapter for Text-Video Retrieval
KR3: Korean Restaurant Review with Ratings / Experiments on Parameter-efficient Tuning and Task-adaptive Pre-training
Applied Deep Learning (深度學習之應用) course by Vivian Chen at NTU CSIE
Code for EACL'23 paper "Udapter: Efficient Domain Adaptation Using Adapters"
[CVPR 2023] VoP: Text-Video Co-operative Prompt Tuning for Cross-Modal Retrieval
PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation
The code for generating natural distribution shifts on image and text datasets.
The code for the paper "Instance-aware Dynamic Prompt Tuning for Pre-trained Point Cloud Models" (ICCV'23).
[ICCV 2023] Binary Adapters, [AAAI 2023] FacT, [Tech report] Convpass
Official implementation of AAAI 2023 paper "Parameter-efficient Model Adaptation for Vision Transformers"
Official implementation for CVPR'23 paper "BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning"
Evaluate robustness of adaptation methods on large vision-language models
ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
Research Trends in LLM-guided Multimodal Learning.
[Preprint] AdaVAE: Exploring Adaptive GPT-2s in VAEs for Language Modeling PyTorch Implementation
AlpaGasus2-QLoRA: a LLaMA2 model fine-tuned with the AlpaGasus data-filtering mechanism using QLoRA
[NeurIPS 2023] Parameter-efficient Tuning of Large-scale Multimodal Foundation Model
A curated list of prompt-based paper in computer vision and vision-language learning.
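Several of the repositories above (e.g. the QLoRA-based ones) rely on low-rank adaptation: the pretrained weight matrix is frozen and only a small low-rank update is trained. A minimal NumPy sketch of that idea, with hypothetical sizes chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, r = 8, 8, 2                        # hypothetical dims; rank r << d
W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-init

def forward(x, W, A, B, scale=1.0):
    # Effective weight is W + scale * (B @ A); only A and B receive gradients.
    return x @ (W + scale * (B @ A)).T

x = rng.standard_normal((1, k))
# With B zero-initialized, the adapted layer matches the frozen layer exactly.
assert np.allclose(forward(x, W, A, B), x @ W.T)

# Trainable-parameter count: r*(d+k) for the adapter vs d*k for full tuning.
print(r * (d + k), d * k)  # 32 64
```

At realistic transformer dimensions (d = k = 4096, r = 8) the same ratio means training well under 1% of the layer's parameters, which is what makes these methods "parameter-efficient".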