AdapterBERT

AdapterBERT fine-tunes the BERT model on downstream tasks using adapter layers. It is an implementation of the paper Parameter-Efficient Transfer Learning for NLP (Houlsby et al., Google, ICML 2019): small bottleneck modules are inserted into each Transformer layer, and only those modules, the layer normalization parameters, and the task head are trained while the pretrained BERT weights remain frozen.
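The core building block is a small bottleneck module with a residual connection. Below is a minimal sketch of such an adapter, assuming PyTorch; the class name, sizes, and activation are illustrative and are not the repository's actual code.

    import torch
    from torch import nn

    class Adapter(nn.Module):
        """Bottleneck adapter (Houlsby et al., 2019): down-project, nonlinearity, up-project."""

        def __init__(self, hidden_size: int, bottleneck_size: int = 64):
            super().__init__()
            self.down_project = nn.Linear(hidden_size, bottleneck_size)
            self.activation = nn.GELU()
            self.up_project = nn.Linear(bottleneck_size, hidden_size)

        def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
            # The residual connection keeps the adapter near-identity at initialization,
            # so the pretrained BERT behaviour is preserved before fine-tuning.
            return hidden_states + self.up_project(self.activation(self.down_project(hidden_states)))

Because only the adapter and head parameters receive gradients, the number of trained parameters per task is a small fraction of full fine-tuning.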

Setup

To set up the project, follow these steps:

  1. Clone the repository:

     git clone https://github.com/nikhil-chigali/AdapterBERT.git

  2. Navigate to the project directory:

     cd AdapterBERT

  3. Install Poetry:

     pip install poetry

  4. Set up the Poetry environment (installs the dependencies without installing the project itself as a package):

     poetry install --no-root

Training

Update the training hyperparameters and settings as needed in the constants.py file.
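As an illustration of the kind of settings involved, constants.py might expose values along these lines; the names and defaults below are hypothetical, not the file's actual contents.

    # Hypothetical example only -- the real constants.py may use different names and values.
    MODEL_NAME = "bert-base-uncased"   # pretrained backbone
    ADAPTER_SIZE = 64                  # bottleneck dimension of each adapter
    LEARNING_RATE = 1e-4
    BATCH_SIZE = 32
    MAX_EPOCHS = 3
    CKPT_PATH = "checkpoints/adapter_bert.ckpt"  # where trained weights are saved to / loaded from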

To train the model, run the train.py script. Here's an example command:

python train.py

Prediction

Before running inference, update the model checkpoint path CKPT_PATH in constants.py.

To make predictions on a custom input prompt with the trained model, run the predict.py script:

python predict.py
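Programmatically, the prediction step roughly amounts to tokenizing the prompt and running a forward pass. The snippet below is a sketch using the Hugging Face transformers API; the checkpoint-loading comment only indicates what predict.py might do, not its actual code.

    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    CKPT_PATH = "checkpoints/adapter_bert.ckpt"  # in the actual project this comes from constants.py

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
    # In the actual project, the fine-tuned adapter weights would be restored from CKPT_PATH, e.g.:
    # model.load_state_dict(torch.load(CKPT_PATH)["state_dict"], strict=False)
    model.eval()

    inputs = tokenizer("This movie was surprisingly good.", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    print("Predicted class:", logits.argmax(dim=-1).item())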

Acknowledgements

I would like to acknowledge the repositories and papers used as reference for this project.
