Neural-based Abstractive Text Summarization

This system implements the abstractive text summarization models from the paper Abstractive Text Summarization based on Language Model Conditioning and Locality Modeling.

The system supports two neural models:

  • BERT-Transformer (bert) - a Transformer whose encoder and decoder are conditioned on pre-trained BERT.
  • Transformer with Convolutional Self-Attention (conv) - a Transformer whose self-attention is replaced with convolutional self-attention to better capture local dependencies (see the sketch after this list).
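
The paper defines convolutional self-attention precisely; the snippet below is only a minimal sketch of the general idea, in which keys and values are convolved over a small local window before standard scaled dot-product attention, biasing the model towards nearby tokens. The class name, single-head layout and kernel size are illustrative assumptions, not the repository's implementation.

# Illustrative sketch of convolutional self-attention (not the paper's exact architecture):
# keys and values pass through a 1D convolution over the sequence dimension
# before the usual scaled dot-product attention.
import torch
import torch.nn as nn

class ConvSelfAttention(nn.Module):
    def __init__(self, d_model, kernel_size=3):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # same-length 1D convolutions over the sequence dimension
        self.conv_k = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        self.conv_v = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        self.scale = d_model ** 0.5

    def forward(self, x):  # x: (batch, seq_len, d_model)
        q = self.q(x)
        # Conv1d expects (batch, channels, length), hence the transposes
        k = self.conv_k(self.k(x).transpose(1, 2)).transpose(1, 2)
        v = self.conv_v(self.v(x).transpose(1, 2)).transpose(1, 2)
        attn = torch.softmax(q @ k.transpose(1, 2) / self.scale, dim=-1)
        return attn @ v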

For long texts, the TF-IDF extractive summarizer can be applied before the abstractive models to shorten the input, as sketched below.
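
As a rough illustration of how such a TF-IDF pre-selection can work, the sketch below scores sentences by their mean TF-IDF weight and keeps only the top-scoring ones in their original order. The function name, the keep parameter and the use of scikit-learn are assumptions, not the repository's actual implementation.

# Illustrative TF-IDF extractive pre-summarization (assumed helper, not the repo's code)
from sklearn.feature_extraction.text import TfidfVectorizer

def tfidf_extract(sentences, keep=5):
    # Score each sentence by its mean TF-IDF term weight
    tfidf = TfidfVectorizer().fit_transform(sentences)  # (n_sentences, n_terms)
    scores = tfidf.mean(axis=1).A1
    # Keep the top-scoring sentences, restored to document order
    top = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:keep])
    return " ".join(sentences[i] for i in top)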

Usage

First, download the models from here and extract them into the models/ folder.

Then run the system, specifying the language of the text (English or German), the summarization method, and whether the extractive summarizer should be applied before the abstractive one.

Example usage:

from summarizer import AbstractiveSummarizer

# Read one text per line from the sample file
with open("data/sample_en.txt") as f:
	texts = [text for text in f]

# 'conv' selects the Transformer with convolutional self-attention;
# extract=True runs the TF-IDF extractive summarizer before the abstractive model
model = AbstractiveSummarizer(language='en', method='conv', extract=True)
for summ in model.summarize(texts):
	print(summ)
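
Assuming the same constructor interface, with German selected via language='de' (an assumed code, mirroring 'en' above) and the BERT-based model via method='bert', a run without the extractive step might look like:

# Hypothetical German run; 'de' is an assumed language code
model = AbstractiveSummarizer(language='de', method='bert', extract=False)
summaries = list(model.summarize(texts))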
