Code used in "An Empirical Study on Pre-trained Embeddings and Language Models for Bot Detection".
"Unsupervised Paraphrase Generation using Pre-trained Language Model."
A pre-trained AWD-LSTM language model trained on a Filipino text corpus using fastai v2. Instructions included.
The source code used for paper "Empower Entity Set Expansion via Language Model Probing", published in ACL 2020.
word2vec, sentence2vec, machine reading comprehension, dialog system, text classification, pretrained language model (i.e., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, information extraction (i.e., entity, relation and event extraction), knowledge graph, text generation, network embedding
Introductory workshop series on NLP and Pretrained Language Models
Implementation of ICLR 21 paper: Probing BERT in Hyperbolic Spaces
Worth-reading papers and related resources on the attention mechanism, the Transformer, and pretrained language models (PLMs) such as BERT.
Examples for pre-training retrieval-extraction based language model
A curated list of pretrained sentence and word embedding models
Implementation of the research paper "PubMed 200k RCT: a Dataset for Sequential Sentence Classification in Medical Abstracts", using hybrid embeddings (char + token + positional).
NLP Pretrained Language Models Implementation Study
An ELECTRA-based Korean conversational language model.
SynPL: a zero-shot prompt language model to process multiple-choice questions on synonyms
Code associated with the Don't Stop Pretraining ACL 2020 paper
Source code for the #5 entry on the `ReClor` logical reasoning reading comprehension leaderboard.
Implementation of paper "Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval"
CoBERTa: pre-trained language models for Vietnamese comment/social media datasets.
[WWW 2022] Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations
A pretrained BERT model for longer reviews
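Many of the repositories above build on pretrained word, sentence, or contextual embeddings, which represent text as dense vectors that can be compared numerically. As a minimal, self-contained illustration (toy 3-dimensional vectors, not an actual pretrained model such as word2vec or BERT), cosine similarity between embedding vectors can be computed like this:

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two equal-length vectors:
    # dot(u, v) / (|u| * |v|), in [-1, 1] for real-valued embeddings.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "embeddings" for illustration only; real pretrained embeddings
# typically have hundreds of dimensions.
embeddings = {
    "bot":   [0.9, 0.1, 0.2],
    "robot": [0.8, 0.2, 0.3],
    "tree":  [0.1, 0.9, 0.0],
}

print(cosine_similarity(embeddings["bot"], embeddings["robot"]))  # close to 1.0
print(cosine_similarity(embeddings["bot"], embeddings["tree"]))   # much lower
```

With a real pretrained model, the vectors would come from the model's embedding layer or hidden states rather than being hand-written, but the similarity computation used for tasks like retrieval or clustering is the same.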