Generative AI with Large Language Models Implementation

This repository contains my implementation for the "Generative AI with Large Language Models" course offered by Coursera. The course provides a comprehensive understanding of generative AI and explores how large language models (LLMs) can be used to create value in various real-world applications.

Course Overview

The course is divided into three weekly modules, each covering a different aspect of generative AI and LLMs. Here's a brief overview of each:

Week 1: Generative AI Use Cases and Model Pre-training

In this week, I learned about the project lifecycle of generative AI and LLMs. I gained insights into the transformer architecture that powers LLMs, their training process, and the concept of fine-tuning. Additionally, I explored topics such as generative configuration, scaling laws, and the computational challenges of training LLMs.
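
As a rough illustration of generative configuration, here is a minimal sketch using the Hugging Face transformers library. The checkpoint, prompt, and parameter values are illustrative choices, not the course's exact settings:

```python
# Minimal sketch of generative configuration (temperature, top-k, top-p)
# with the Hugging Face transformers library. Model name and prompt are
# illustrative; any seq2seq LLM would work the same way.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/flan-t5-base"  # assumed checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

prompt = "Summarize the following conversation.\n\nA: Hi!\nB: Hello!\n\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling-based decoding: higher temperature -> more diverse output.
outputs = model.generate(
    inputs["input_ids"],
    max_new_tokens=50,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.7,   # softens the next-token distribution
    top_k=50,          # sample only from the 50 most likely tokens
    top_p=0.9,         # nucleus sampling: keep tokens covering 90% probability mass
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```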

Week 2: Fine-tuning and Evaluating Large Language Models

Week 2 focused on fine-tuning and evaluating LLMs. I learned about techniques for fine-tuning LLMs on specific tasks and explored model evaluation methods and benchmarks. The module also covered parameter-efficient fine-tuning (PEFT) and its associated techniques, such as LoRA.
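
As a sketch of one such technique, the following applies LoRA adapters with the Hugging Face peft library. The hyperparameters and target modules are illustrative choices, assuming a T5-style base model:

```python
# Minimal sketch of parameter-efficient fine-tuning with LoRA via the
# Hugging Face peft library. Hyperparameters (r, lora_alpha, dropout)
# are illustrative, not the course's exact values.
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, get_peft_model, TaskType

base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,  # seq2seq task (e.g., summarization)
    r=32,                # rank of the low-rank update matrices
    lora_alpha=32,       # scaling factor for the LoRA updates
    lora_dropout=0.05,
    target_modules=["q", "v"],  # attention projections to adapt (T5 naming)
)

peft_model = get_peft_model(base_model, lora_config)
# Only the small LoRA matrices are trainable; the base weights stay frozen,
# which is what makes this "parameter efficient".
peft_model.print_trainable_parameters()
```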

Week 3: Reinforcement Learning and LLM-powered Applications

In the final week, I delved into reinforcement learning and its applications in LLM-powered systems. I gained an understanding of aligning models with human values, reinforcement learning from human feedback (RLHF), and model optimizations for deployment. The module also covered topics like program-aided language models, reasoning and action (ReAct), and responsible AI.
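
As a conceptual sketch of the RLHF objective, the snippet below combines a reward-model score with a KL penalty against a frozen reference model, which discourages the policy from drifting too far during fine-tuning. All tensors and the coefficient are illustrative stand-ins:

```python
# Conceptual sketch of the RLHF reward: the reward-model score is
# penalized by an estimate of the KL divergence between the updated
# policy and the frozen reference model. Values below are stand-ins
# for per-token log-probabilities and a scalar reward.
import torch

def rlhf_reward(reward_model_score: torch.Tensor,
                policy_logprobs: torch.Tensor,
                ref_logprobs: torch.Tensor,
                kl_coef: float = 0.2) -> torch.Tensor:
    """Combine the reward-model score with a KL penalty (illustrative)."""
    # Per-token KL estimate between policy and reference model.
    kl = policy_logprobs - ref_logprobs
    # Total reward: human-preference score minus the scaled KL penalty.
    return reward_model_score - kl_coef * kl.sum()

# Illustrative values for a 5-token completion.
score = torch.tensor(1.8)                               # reward model output
pi_lp = torch.tensor([-1.2, -0.8, -2.1, -0.5, -1.0])    # policy log-probs
ref_lp = torch.tensor([-1.3, -0.9, -1.9, -0.6, -1.1])   # reference log-probs
print(rlhf_reward(score, pi_lp, ref_lp))
```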

Prerequisites

To get the most out of this implementation, the following prerequisites are recommended:

  • Intermediate-level programming skills in Python.
  • Familiarity with the basics of machine learning, including supervised and unsupervised learning, loss functions, and data splitting.
  • Prior knowledge of the Machine Learning Specialization or Deep Learning Specialization offered by DeepLearning.AI is beneficial.

Repository Structure

The repository is organized as follows:

  • Week1: This directory contains the implementation and code related to Week1 of the course.
  • Week2: Here, you can find the code and implementation for Week2.
  • Week3: This directory includes the code and implementation for Week3.
  • README.md: The main readme file providing an overview of the repository and its contents.

Feel free to explore the code and implementations in each module directory to gain a deeper understanding of generative AI with large language models.

Acknowledgments

I would like to express my gratitude to the instructors of the "Generative AI with Large Language Models" course from DeepLearning.AI for their comprehensive and insightful teaching. Their expertise and guidance have been invaluable in enhancing my understanding of generative AI and LLMs.

Note

Please note that this implementation is for educational purposes only and serves as a showcase of my understanding of the course material.
