
Generative AI with Large Language Models (LLMs) Course Projects

Overview

This repository documents my projects from the "Generative AI with Large Language Models" course. The course covered the LLM lifecycle from data gathering to deployment, with a strong emphasis on transformer architectures. The projects fine-tune FLAN-T5 using full instruction fine-tuning, LoRA-based PEFT, and RLHF with PPO.

Key Learnings

  • Fundamentals of Generative AI: Gained insights into how generative AI and LLMs function.
  • Transformer Architecture: Studied the transformer architecture in detail, including training processes and fine-tuning applications.
  • Empirical Scaling Laws: Applied scaling laws to balance model size, dataset size, and compute budget (a back-of-the-envelope sketch follows this list).
  • State-of-the-Art Techniques: Employed advanced training, tuning, and deployment methods to improve model performance.
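
As a reference point for the scaling-laws material, the Chinchilla result (Hoffmann et al., 2022) suggests a compute-optimal budget of roughly 20 training tokens per parameter, with training compute commonly approximated as C ≈ 6·N·D FLOPs. The sketch below is a back-of-the-envelope calculation under exactly those two assumptions, not code from the course:

```python
def compute_optimal_budget(n_params: float, tokens_per_param: float = 20.0):
    """Back-of-the-envelope Chinchilla-style sizing.

    Uses two common approximations (assumptions, not course code):
      * compute-optimal data: D ~ 20 tokens per parameter
      * training compute:     C ~ 6 * N * D FLOPs
    """
    d_tokens = tokens_per_param * n_params  # compute-optimal token count
    c_flops = 6.0 * n_params * d_tokens     # approximate training FLOPs
    return d_tokens, c_flops

# Example: a 3B-parameter model is compute-optimal at ~60B tokens.
tokens, flops = compute_optimal_budget(3e9)
print(f"{tokens:.2e} tokens, {flops:.2e} FLOPs")  # 6.00e+10 tokens, 1.08e+21 FLOPs
```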

Projects

  1. Summarize Dialogue: Used an LLM to summarize dialogues via prompt engineering (a zero-shot inference sketch follows this list).
  2. Dialogue Summarization Fine-Tuning: Fine-tuned a generative AI model for dialogue summarization using full instruction fine-tuning and LoRA-based PEFT (a LoRA configuration sketch also follows this list).
  3. FLAN-T5 Fine-Tuning: Further fine-tuned FLAN-T5 with RLHF using PPO to generate more positive summaries (a hedged PPO sketch appears under Repository Structure).
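
To make Project 1 concrete, here is a minimal zero-shot inference sketch using Hugging Face transformers with google/flan-t5-base. The checkpoint, prompt wording, dialogue, and generation settings are illustrative assumptions, not the repository's exact code:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed checkpoint; the course works with the FLAN-T5 family.
model_name = "google/flan-t5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

dialogue = (
    "#Person1#: I'd like to book a table for two tonight.\n"
    "#Person2#: Certainly, what time works for you?\n"
    "#Person1#: Around 7 pm, please."
)

# Instruction-style prompt; the exact wording is an illustrative choice.
prompt = f"Summarize the following conversation.\n\n{dialogue}\n\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because FLAN-T5 is instruction-tuned, a plain natural-language instruction like this often produces a usable summary without any task-specific fine-tuning.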
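For the PEFT variant of Project 2, the peft library wraps the base model with low-rank adapters so that only a small fraction of the weights is trained. Below is a minimal configuration sketch; the rank, alpha, dropout, and target modules are typical values for T5-style models and are assumptions, not the repository's settings:

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSeq2SeqLM

base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# LoRA injects trainable rank-r update matrices into the attention
# projections; the hyperparameters here are common choices, not the repo's.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=32,                       # rank of the low-rank update
    lora_alpha=32,              # scaling factor for the update
    lora_dropout=0.05,
    target_modules=["q", "v"],  # T5 query/value projection layers
)

peft_model = get_peft_model(base_model, lora_config)
peft_model.print_trainable_parameters()  # typically around 1% of all weights
```

Training then proceeds with a standard Trainer/Seq2SeqTrainer loop over the frozen base plus adapters; at inference the adapter weights can be merged into the base model or loaded on top of it.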

Tools and Technologies

  • Python
  • PyTorch
  • Transformer Models

Repository Structure

  • Summarize Dialogue: Contains the implementation of the dialogue summarization project.
  • Fine-Tuning Techniques: Demonstrates various fine-tuning methods applied to LLMs.
  • FLAN-T5 Reinforcement Learning: Showcases the process and outcomes of fine-tuning FLAN-T5 using reinforcement learning techniques; a hedged PPO training sketch follows this list.
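
To illustrate the reinforcement-learning stage, here is a minimal RLHF-with-PPO sketch built on the trl library. Everything in it is an assumption for illustration: the checkpoint, the stand-in sentiment reward model (lvwerra/distilbert-imdb), the hyperparameters, and the single-sample data plumbing. Note also that trl's PPOTrainer interface has changed across releases; this follows the older (pre-1.0) API:

```python
import torch
from transformers import AutoTokenizer, pipeline
from trl import PPOConfig, PPOTrainer, AutoModelForSeq2SeqLMWithValueHead

# Policy: the fine-tuned summarizer, wrapped with a value head for PPO.
model_name = "google/flan-t5-base"  # assumed checkpoint
model = AutoModelForSeq2SeqLMWithValueHead.from_pretrained(model_name)
ref_model = AutoModelForSeq2SeqLMWithValueHead.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Reward model: a sentiment classifier as an illustrative stand-in for
# the course's reward model, scoring how positive each summary reads.
reward_pipe = pipeline("sentiment-analysis", model="lvwerra/distilbert-imdb")

ppo_trainer = PPOTrainer(
    config=PPOConfig(batch_size=1, mini_batch_size=1),
    model=model,
    ref_model=ref_model,  # frozen reference model for the KL penalty
    tokenizer=tokenizer,
)

# One PPO step: generate, score, update (real dataset plumbing omitted).
queries = [tokenizer("Summarize: ...", return_tensors="pt").input_ids[0]]
responses = [ppo_trainer.generate(q, max_new_tokens=40).squeeze(0) for q in queries]
texts = [tokenizer.decode(r, skip_special_tokens=True) for r in responses]
rewards = [
    torch.tensor(out["score"] if out["label"] == "POSITIVE" else -out["score"])
    for out in reward_pipe(texts)
]
ppo_trainer.step(queries, responses, rewards)
```

The KL penalty against the frozen reference model keeps the policy from drifting into reward hacking, trading reward maximization against staying close to the original summarizer.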

Acknowledgements

Special thanks to the course creators and instructors for providing comprehensive and practical insights into Generative AI and LLMs.


Note: This repository is part of my continuous learning in AI and represents my hands-on experience from the course.
