[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)
Code for EACL'23 paper "Udapter: Efficient Domain Adaptation Using Adapters"
This repository contains the source code for the paper "Grouped Pointwise Convolutions Reduce Parameters in Convolutional Neural Networks".
[arXiv] Cross-Modal Adapter for Text-Video Retrieval
Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"
INTERSPEECH 23 - Repurposing Whisper to recognize new tasks with adapters!
This repository surveys papers on Prompting and Adapters for Speech Processing.
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning
CodeUp: A Multilingual Code Generation Llama2 Model with Parameter-Efficient Instruction-Tuning on a Single RTX 3090
On Transferability of Prompt Tuning for Natural Language Processing
Research Trends in LLM-guided Multimodal Learning.
A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.
Live Training for Open-source Big Models
A plug-and-play library for parameter-efficient-tuning (Delta Tuning)
K-CAI NEURAL API - a Keras-based neural network API for building parameter-efficient, memory-efficient, FLOPs-efficient multipath models with new layer types. Includes plenty of examples and documentation.
A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
A Unified Library for Parameter-Efficient and Modular Transfer Learning
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
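A common thread across these libraries is low-rank adaptation: rather than updating a full pretrained weight matrix, only a small pair of low-rank factors is trained. The sketch below illustrates that idea in plain NumPy; it is a hypothetical, simplified illustration of the technique, not the API of any library listed above, and all names in it (`forward`, `d_out`, `d_in`, `r`) are made up for the example.

```python
import numpy as np

# Low-rank adaptation sketch: keep the pretrained weight W frozen and train
# only two small factors B (d_out x r) and A (r x d_in), so the effective
# weight is W + B @ A. With B zero-initialized, training starts from the
# pretrained behavior exactly.
rng = np.random.default_rng(0)
d_out, d_in, r = 768, 768, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                    # trainable, zero-initialized

def forward(x):
    # Only A and B would receive gradient updates during fine-tuning.
    return x @ (W + B @ A).T

full_params = W.size                        # parameters in full fine-tuning
lora_params = A.size + B.size               # parameters actually trained
print(f"full: {full_params}, low-rank: {lora_params}, "
      f"ratio: {lora_params / full_params:.3%}")
```

For a 768x768 layer with rank 8, the trainable parameter count drops from 589,824 to 12,288 — about 2% of the original, which is why rank is the key knob in this family of methods.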