gpt2
Here are 323 public repositories matching this topic...
Generates comprehensible one-liners or summaries for articles.
Updated Oct 20, 2021 - Jupyter Notebook
Pretraining a GPT-2 model on the Basque language.
Updated Aug 1, 2023 - Python
Training transformer models (e.g. RoBERTa, GPT2 and GPT-J) from scratch.
Updated Dec 27, 2023 - Python
tfDlg is a Python library for transformer-based language models and dialog models with TensorFlow.
Updated Apr 9, 2021 - Python
A novel technique for creating crossovers using NER Swap + GPT2/LSTM finetuning.
Updated Jul 14, 2022 - Python
An Empirical Study of Multitask Learning to Improve Open Domain Dialogue Systems, NoDaLiDa 2023
Updated May 22, 2023 - Shell
Sentiment analysis of 50,000 IMDB movie reviews, classifying each as positive, negative, or neutral. Uses a range of NLP approaches, including lexicon-based methods, machine learning models, pretrained language models (PLMs), and hybrid models, and assesses the performance of each model type.
Updated Feb 26, 2024 - Jupyter Notebook
Auto-generates tweets using a pre-trained GPT-2 based large language model (LLM) that runs offline.
Updated Apr 30, 2024 - Python
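Several entries above generate text with a pre-trained GPT-2 checkpoint. As a minimal sketch of that pattern (not the code of any particular repository), the Hugging Face `transformers` pipeline can load GPT-2 once and then generate offline from the local cache:

```python
from transformers import pipeline, set_seed

# Load the pretrained "gpt2" checkpoint; after the first download,
# generation runs fully offline from the local model cache.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make sampling reproducible

# Sample a short continuation of a prompt.
out = generator(
    "The best part of machine learning is",
    max_new_tokens=20,
    do_sample=True,
    num_return_sequences=1,
)
print(out[0]["generated_text"])
```

The pipeline returns a list of dicts, each with a `generated_text` key containing the prompt plus the sampled continuation.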
A personalized autocomplete (next-word prediction) project using three architectures written from scratch: stacked LSTMs, Seq2Seq with attention and LSTMs, and GPT-2.
Updated Oct 3, 2023 - Jupyter Notebook
This repository contains NLP Transfer learning projects with deployment and integration with UI.
Updated Jan 13, 2023 - Python
Updated Jun 20, 2020 - Python
Machine learning project. See the presentation: https://github.com/gongl1/projectdemo3/blob/main/Pattern%20Patent_ML.pptx (python, transformers, gpt2, nlp, sk-learn).
Updated Sep 8, 2021 - Jupyter Notebook
Team project: generates recipes based on the ingredients available in the fridge.
Updated Dec 18, 2023 - TypeScript
GPC (Generative Pre-trained Coder, based on GPT-2) is a code-assist tool that uses the GPT-2 natural language processing model to generate code. It helps developers by suggesting completions for code snippets based on the surrounding context and syntax, saving time while coding.
Updated Apr 11, 2023 - Jupyter Notebook
[NLPCC'23] ZeroGen: Zero-shot Multimodal Controllable Text Generation with Multiple Oracles PyTorch Implementation
Updated Oct 7, 2023 - Python