This project was used in my dissertation for detecting sarcasm in a Twitter dataset using a state-of-the-art Transformer
Updated Oct 27, 2019 - Jupyter Notebook
Training sentence-level sentiment classification with BERT
Revisiting Reddit troll hunting with some new NLP techniques (DistilBERT, Multi-Sample Dropout, OOF, Group K-Fold on subreddit, and some fancy pre-processing)
NLP service to perform text classification. This is the first part of Project Jarvis; this service integrates with the chat-bot service
Language Detection using DistilBERT
Pytorch-Named-Entity-Recognition-with-transformers
Transformers
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus using a combination of two tasks: masked language modeling and next-sentence prediction.
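As an illustration of the masked-language-modeling objective mentioned above, here is a minimal sketch of BERT-style input corruption in plain Python. The toy vocabulary, function name, and the 15% / 80-10-10 split (80% `[MASK]`, 10% random token, 10% unchanged) follow the published BERT recipe, but everything else here is a simplified assumption rather than any particular library's API:

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "sat", "on", "mat"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: select ~15% of positions; of those,
    80% become [MASK], 10% a random vocabulary token, 10% stay as-is.
    Returns (corrupted tokens, labels), where labels hold the original
    token at selected positions and None elsewhere (no loss computed)."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must recover the original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))
            else:
                corrupted.append(tok)
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels
```

During pre-training, the model only receives the corrupted sequence and is trained to predict the original token at each position where the label is not `None`; the random-token and keep-as-is cases prevent the model from relying on seeing `[MASK]` at inference time.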
NLP deep learning model for multilingual toxicity detection in text 📚
Web Interface for Question Answering System
Kaggle's Tweet Sentiment Extraction challenge: the model had to extract the phrase in a tweet that best supports a given sentiment.
Fake job posting prediction task
NLP Workshop - ML India
Compares the DistilBERT and MobileBERT architectures for mobile deployments.
FoodBERT: Food Extraction with DistilBERT
Notebook (demonstration) for training DistilBERT on GLUE and uploading the model to Hugging Face.
Usage of BERT models for text clustering techniques using sentence embeddings
In this repository, I have collected different sources, visualizations, and code examples of BERT
Model training and prediction algorithm classifying texts by topic and sentiment, using DistilBERT and PyTorch. Uses GitHub Actions to continuously train and deploy the model to Algorithmia.