# 350-million-parameters

Here is 1 public repository matching this topic...

This project provides code and a Colaboratory notebook that facilitate fine-tuning of a 350M-parameter Alpaca model, originally developed at Stanford University. The model was adapted with LoRA, using Hugging Face's PEFT library, so it can be fine-tuned with fewer computational resources and far fewer trainable parameters (a minimal sketch of the approach follows the repository details below).

  • Updated Jul 9, 2023
  • Jupyter Notebook
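
As a rough illustration of the LoRA-via-PEFT approach the description mentions, the sketch below wraps a 350M-parameter causal language model with a LoRA adapter. The base checkpoint (facebook/opt-350m) and the LoRA hyperparameters here are placeholder assumptions for illustration, not the repository's actual settings.

```python
# Minimal sketch: attach a LoRA adapter to a 350M-parameter causal LM
# with Hugging Face's PEFT library. Checkpoint and hyperparameters are
# assumptions; the repository's actual configuration may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_model_name = "facebook/opt-350m"  # placeholder 350M-parameter base model

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

# LoRA injects small low-rank update matrices into selected projection
# layers, so only a tiny fraction of the weights is trained.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                # rank of the low-rank update matrices
    lora_alpha=16,      # scaling factor applied to the LoRA updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in OPT
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the reduced trainable count
```

The wrapped model can then be passed to a standard `transformers` `Trainer` as usual; only the adapter weights receive gradient updates, which is what makes fine-tuning feasible on modest hardware such as a free Colab GPU.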
