Revisiting Reddit troll hunting with some new NLP techniques (DistilBERT, multi-sample dropout, out-of-fold (OOF) predictions, group K-fold on subreddit, and some fancy pre-processing)
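A minimal sketch of the validation scheme this points at, under assumed column names (`text`, `subreddit`, `label`) and with a TF-IDF + logistic-regression stand-in for the DistilBERT fine-tuning: scikit-learn's GroupKFold keeps every subreddit inside a single fold, and predictions are collected out-of-fold.

```python
# Hypothetical sketch: group K-fold on subreddit with out-of-fold (OOF) predictions.
# File/column names and the TF-IDF + logistic-regression model are assumptions;
# the actual project fine-tunes DistilBERT inside each fold.
import numpy as np
import pandas as pd
from sklearn.model_selection import GroupKFold
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("reddit_comments.csv")   # placeholder; expects text, subreddit, label columns
oof = np.zeros(len(df))                   # one out-of-fold prediction per row

gkf = GroupKFold(n_splits=5)
for fold, (train_idx, valid_idx) in enumerate(gkf.split(df, groups=df["subreddit"])):
    model = make_pipeline(TfidfVectorizer(max_features=50_000),
                          LogisticRegression(max_iter=1000))
    model.fit(df["text"].iloc[train_idx], df["label"].iloc[train_idx])
    # Score only the held-out fold, so no row is ever predicted by a model that saw it.
    oof[valid_idx] = model.predict_proba(df["text"].iloc[valid_idx])[:, 1]
    print(f"fold {fold}: {len(valid_idx)} OOF predictions")
```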
🏥 Dr.Jarvis is a medical transcript classifier that helps patients get their symptoms diagnosed in real time on a Streamlit-powered web app. Trained with scikit-learn's SVM, KNN, and Random Forest models.
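A rough sketch of that classical-ML setup, assuming TF-IDF features over the transcripts; the file and column names are placeholders, not the repo's actual data layout.

```python
# Hypothetical sketch: comparing the three scikit-learn models mentioned above on
# TF-IDF features of medical transcripts. File/column names are placeholders.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("transcripts.csv")   # placeholder; expects transcript, specialty columns
X = TfidfVectorizer(stop_words="english", max_features=20_000).fit_transform(df["transcript"])
X_tr, X_te, y_tr, y_te = train_test_split(X, df["specialty"], test_size=0.2, random_state=42)

for name, clf in [("SVM", SVC()),
                  ("KNN", KNeighborsClassifier()),
                  ("Random Forest", RandomForestClassifier())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```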
Fine-tuning a 🤗 Transformers model for a soft-skill NER task
Fine-tuning a Transformer (DistilBERT, but generic to other models) for multi-class text classification (sentiment analysis on IMDb)
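A condensed sketch of such a fine-tuning loop using the 🤗 Trainer API and the IMDb dataset from 🤗 Datasets; the hyperparameters and subset sizes are illustrative, not the repo's exact settings.

```python
# Hypothetical sketch: fine-tuning DistilBERT for text classification on IMDb with
# the 🤗 Trainer API. Hyperparameters and subset sizes are illustrative only.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased",
                                                           num_labels=2)

ds = load_dataset("imdb")
ds = ds.map(lambda b: tokenizer(b["text"], truncation=True, max_length=256), batched=True)

args = TrainingArguments(output_dir="distilbert-imdb",
                         per_device_train_batch_size=16,
                         num_train_epochs=1,
                         learning_rate=2e-5)

Trainer(model=model, args=args,
        train_dataset=ds["train"].shuffle(seed=42).select(range(2000)),  # small subset for speed
        eval_dataset=ds["test"].select(range(500)),
        tokenizer=tokenizer).train()
```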
Deploying a pre-trained DistilBERT model with SageMaker
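One possible shape of such a deployment, sketched with the SageMaker Python SDK's Hugging Face integration; the IAM role, checkpoint, framework versions, and instance type are all placeholders that must match what is available in your account and region.

```python
# Hypothetical sketch: deploying a pre-trained DistilBERT model to a SageMaker endpoint
# via the SageMaker Python SDK's Hugging Face integration. Role, checkpoint, versions,
# and instance type are placeholders.
from sagemaker.huggingface import HuggingFaceModel

role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder IAM role

# Pull the model straight from the Hugging Face Hub via environment variables.
hub_env = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",
    "HF_TASK": "text-classification",
}

model = HuggingFaceModel(
    env=hub_env,
    role=role,
    transformers_version="4.26",   # framework versions must match an available container
    pytorch_version="1.13",
    py_version="py39",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "SageMaker deployment looks straightforward."}))
predictor.delete_endpoint()   # clean up the endpoint when done
```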
Classify international patents into one of eight categories based on the text of their titles and abstracts, using DistilBERT and ONNX Runtime
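One way to realize that combination, sketched here with a plain `torch.onnx` export followed by ONNX Runtime inference; the checkpoint name is a placeholder, not the patent classifier itself.

```python
# Hypothetical sketch: exporting a fine-tuned DistilBERT classifier to ONNX and running
# it with ONNX Runtime. The checkpoint is a placeholder, not the patent classifier.
import numpy as np
import onnxruntime as ort
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "distilbert-base-uncased-finetuned-sst-2-english"   # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).eval()
model.config.return_dict = False   # return a plain tuple so the ONNX tracer sees a simple output

dummy = tokenizer("patent title and abstract", return_tensors="pt")
torch.onnx.export(model, (dummy["input_ids"], dummy["attention_mask"]), "distilbert.onnx",
                  input_names=["input_ids", "attention_mask"], output_names=["logits"],
                  dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                                "attention_mask": {0: "batch", 1: "seq"}},
                  opset_version=14)

session = ort.InferenceSession("distilbert.onnx", providers=["CPUExecutionProvider"])
enc = tokenizer("A method for encoding video streams over constrained networks.",
                return_tensors="np")
logits = session.run(["logits"], {"input_ids": enc["input_ids"],
                                  "attention_mask": enc["attention_mask"]})[0]
print("predicted class:", int(np.argmax(logits, axis=-1)[0]))
```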
Predict which tweets are about real disasters and which ones are not, using Microsoft's DeBERTa
Emotion Analysis with Transformers
🤗 Dockerized BERT multi-label classification inference service 🤗
A study on encoding English sentences into TensorFlow tensors using a pre-trained BERT model from the Hugging Face library.
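A small sketch of that encoding step, assuming the TensorFlow variant of BERT from 🤗 Transformers and a simple mean-pooling step to turn token tensors into sentence vectors.

```python
# Hypothetical sketch: turning English sentences into TensorFlow tensors with a
# pre-trained BERT model from the Hugging Face library.
import tensorflow as tf
from transformers import AutoTokenizer, TFBertModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = TFBertModel.from_pretrained("bert-base-uncased")

sentences = ["The cat sat on the mat.", "Transformers encode text into dense vectors."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="tf")

outputs = bert(inputs)
print(outputs.last_hidden_state.shape)   # (batch, seq_len, 768) token-level tensors

# One simple sentence vector: mean-pool the token embeddings over the sequence axis.
sentence_vectors = tf.reduce_mean(outputs.last_hidden_state, axis=1)
print(sentence_vectors.shape)            # (batch, 768)
```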
Welcome to our Smart Content Accumulator website! We have developed a powerful tool that streamlines the process of obtaining summarized content from any article. With just a URL and a click of a button, our website generates concise and meaningful summaries, saving you valuable time and effort.
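Under the hood, a URL-to-summary flow like the one described could look roughly like this, assuming requests + BeautifulSoup for article extraction and a 🤗 summarization pipeline; the URL and model are placeholders, not the site's actual stack.

```python
# Hypothetical sketch of the summarization flow: fetch an article, keep its paragraph
# text, and summarize it with a 🤗 pipeline. URL and model name are placeholders.
import requests
from bs4 import BeautifulSoup
from transformers import pipeline

url = "https://example.com/some-article"   # placeholder URL
html = requests.get(url, timeout=10).text
article = " ".join(p.get_text(" ", strip=True)
                   for p in BeautifulSoup(html, "html.parser").find_all("p"))

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
summary = summarizer(article[:3000], max_length=120, min_length=40, do_sample=False)
print(summary[0]["summary_text"])
```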
Using state-of-the-art Transformers for text classification and deep CNNs for image classification
Applying zero-shot learning to a classification task.
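Zero-shot text classification needs no task-specific training data; a 🤗 pipeline scores the input against arbitrary candidate labels. The model and labels below are illustrative, not the repo's actual choices.

```python
# Hypothetical sketch: zero-shot classification with a 🤗 pipeline.
# The NLI backbone and candidate labels are illustrative assumptions.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier("I can't log in to my account after the latest update.",
                    candidate_labels=["billing", "technical issue", "feature request"])
print(result["labels"][0], result["scores"][0])   # best label and its score
```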
Deep learning for Natural Language Processing
Advanced RAG pipeline using Re-Ranking after initial retrieval
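A sketch of the two-stage retrieval idea, assuming sentence-transformers models: a cheap bi-encoder retrieves candidates, then a cross-encoder re-scores each (query, passage) pair and re-orders them. The corpus and model names are placeholders.

```python
# Hypothetical sketch of re-ranking after initial retrieval: a fast bi-encoder retrieves
# candidates, then a cross-encoder re-scores them against the query.
from sentence_transformers import SentenceTransformer, CrossEncoder, util

docs = ["FAISS is a library for similarity search.",
        "DistilBERT is a distilled version of BERT.",
        "Re-ranking re-orders retrieved passages with a stronger model."]
query = "How does re-ranking improve retrieval?"

# Stage 1: cheap bi-encoder retrieval over the corpus.
bi_encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_emb = bi_encoder.encode(docs, convert_to_tensor=True)
hits = util.semantic_search(bi_encoder.encode(query, convert_to_tensor=True),
                            doc_emb, top_k=3)[0]

# Stage 2: cross-encoder re-scores (query, passage) pairs and re-orders them.
reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
scores = reranker.predict([(query, docs[h["corpus_id"]]) for h in hits])
for score, h in sorted(zip(scores, hits), key=lambda x: -x[0]):
    print(round(float(score), 3), docs[h["corpus_id"]])
```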
Q&A system using BERT and a Faiss vector database
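A minimal sketch of the retrieval core, assuming a sentence-transformers encoder for the BERT-style embeddings and an in-memory Faiss index; the passages and model name are placeholders.

```python
# Hypothetical sketch: indexing passage embeddings in Faiss and retrieving the best
# match for a question. Passages and the encoder checkpoint are placeholders.
import faiss
from sentence_transformers import SentenceTransformer

passages = ["The Eiffel Tower is in Paris.",
            "Faiss performs fast nearest-neighbour search over dense vectors.",
            "DistilBERT retains most of BERT's accuracy at a fraction of the size."]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
emb = encoder.encode(passages, normalize_embeddings=True).astype("float32")

index = faiss.IndexFlatIP(emb.shape[1])   # inner product == cosine on normalized vectors
index.add(emb)

q = encoder.encode(["Where is the Eiffel Tower?"], normalize_embeddings=True).astype("float32")
scores, ids = index.search(q, k=2)
print(passages[ids[0][0]], scores[0][0])
```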
HLE-UPC at SemEval-2021 Task 5: Toxic Spans Detection
A sentiment-analysis web application for customer reviews. Positive, negative, and neutral opinions are highlighted.