A TensorFlow 2 (Keras) implementation of DA-RNN (A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction, arXiv:1704.02971)
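For context, here is a minimal sketch of the paper's stage-1 "input attention", which scores each driving series against the previous encoder LSTM state, assuming TensorFlow 2.x; the class name and shapes are illustrative, not this repo's actual API:

```python
import tensorflow as tf

class InputAttention(tf.keras.layers.Layer):
    """Stage-1 attention of DA-RNN (sketch): weights the n driving series
    using the previous encoder LSTM hidden and cell states."""
    def __init__(self, T, **kwargs):  # T = length of the lookback window
        super().__init__(**kwargs)
        self.W = tf.keras.layers.Dense(T)  # projects [h; s] to window length T
        self.U = tf.keras.layers.Dense(T)  # projects each series' T-step window
        self.v = tf.keras.layers.Dense(1)  # scoring vector

    def call(self, x_series, h_prev, s_prev):
        # x_series: [batch, n, T]; h_prev, s_prev: [batch, m]
        hs = tf.expand_dims(tf.concat([h_prev, s_prev], axis=-1), 1)  # [batch, 1, 2m]
        e = self.v(tf.tanh(self.W(hs) + self.U(x_series)))            # [batch, n, 1]
        return tf.nn.softmax(tf.squeeze(e, -1), axis=-1)              # [batch, n]
```

The returned weights rescale x_t before each encoder LSTM step; a second, temporal attention over the encoder hidden states then drives the decoder.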
SERVER: Multi-modal Speech Emotion Recognition using Transformer-based and Vision-based Embeddings
An open-source implementation of grouped-query attention from the paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints"
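A hedged sketch of the core mechanism, assuming TensorFlow 2.x: each key/value head is shared by a group of query heads. The function name and shapes are illustrative, and the paper's checkpoint-conversion step (mean-pooling a group's KV heads) is omitted:

```python
import tensorflow as tf

def grouped_query_attention(q, k, v):
    # q: [batch, n_q_heads, seq, d]; k, v: [batch, n_kv_heads, seq, d],
    # where n_kv_heads divides n_q_heads.
    n_q, n_kv = q.shape[1], k.shape[1]
    k = tf.repeat(k, n_q // n_kv, axis=1)  # each KV head serves one query group
    v = tf.repeat(v, n_q // n_kv, axis=1)
    scale = tf.math.sqrt(tf.cast(q.shape[-1], q.dtype))
    scores = tf.matmul(q, k, transpose_b=True) / scale
    return tf.matmul(tf.nn.softmax(scores, axis=-1), v)

# e.g. 8 query heads sharing 2 KV heads:
# out = grouped_query_attention(tf.random.normal([2, 8, 16, 64]),
#                               tf.random.normal([2, 2, 16, 64]),
#                               tf.random.normal([2, 2, 16, 64]))
```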
Keyword spotting using an RNN with attention layers
Can we use explanations to improve hate speech models? Our paper, accepted at AAAI 2021, explores that question.
Experiments with Deep Learning for generating music
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Sequence-to-Sequence with Attention Mechanisms in TensorFlow 2
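A minimal Bahdanau-style (additive) attention layer of the kind such a model uses, assuming TensorFlow 2.x; `units` and the layer names are placeholders, not this repo's code:

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects encoder outputs
        self.W2 = tf.keras.layers.Dense(units)  # projects decoder state
        self.V = tf.keras.layers.Dense(1)       # scoring vector

    def call(self, query, values):
        # query: [batch, hidden] decoder state; values: [batch, T, hidden]
        query = tf.expand_dims(query, 1)                            # [batch, 1, hidden]
        scores = self.V(tf.tanh(self.W1(values) + self.W2(query)))  # [batch, T, 1]
        weights = tf.nn.softmax(scores, axis=1)                     # over time steps
        context = tf.reduce_sum(weights * values, axis=1)           # [batch, hidden]
        return context, weights
```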
This repository contains PyTorch implementations of 4 different models for classifying emotions in speech.
TensorFlow 1.11 implementation of the Neural Sign Language Translation paper (CVPR 2018)
An implementation that integrates a simple but efficient attention block into a CNN + bidirectional LSTM for video classification.
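A sketch of that pipeline under stated assumptions (TF2/Keras, per-frame CNN features already extracted; layer sizes are illustrative, not this repo's configuration):

```python
import tensorflow as tf

class AttnBiLSTMClassifier(tf.keras.Model):
    """CNN features per frame -> BiLSTM -> temporal attention pooling -> logits."""
    def __init__(self, num_classes):
        super().__init__()
        self.bilstm = tf.keras.layers.Bidirectional(
            tf.keras.layers.LSTM(128, return_sequences=True))
        self.score = tf.keras.layers.Dense(1, activation="tanh")  # per-step score
        self.out = tf.keras.layers.Dense(num_classes, activation="softmax")

    def call(self, frame_feats):
        # frame_feats: [batch, T, feat_dim] features from a frame-level CNN
        h = self.bilstm(frame_feats)              # [batch, T, 256]
        w = tf.nn.softmax(self.score(h), axis=1)  # attention over the T frames
        pooled = tf.reduce_sum(w * h, axis=1)     # weighted sum of BiLSTM states
        return self.out(pooled)
```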
📃 | Deep Text Recognition Implementation using PyTorch
This repository contains various attention mechanisms (Bahdanau, soft attention, additive attention, hierarchical attention, etc.) in PyTorch, TensorFlow, and Keras
Prepare summaries of text reviews
Deep representation of visual and textual descriptions using StackGAN
Forex price movement forecast
An encoder-decoder architecture with LSTMs for an English->German translation network
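A bare-bones sketch of such an encoder-decoder setup for training with teacher forcing, assuming TF2/Keras; vocabulary sizes and dimensions are placeholders:

```python
import tensorflow as tf

SRC_VOCAB, TGT_VOCAB, EMB, UNITS = 10000, 10000, 256, 512

# Encoder: embed the English sentence, keep only the final LSTM states.
enc_in = tf.keras.Input(shape=(None,))
enc_emb = tf.keras.layers.Embedding(SRC_VOCAB, EMB)(enc_in)
_, h, c = tf.keras.layers.LSTM(UNITS, return_state=True)(enc_emb)

# Decoder: predict German tokens, initialized from the encoder states.
dec_in = tf.keras.Input(shape=(None,))
dec_emb = tf.keras.layers.Embedding(TGT_VOCAB, EMB)(dec_in)
dec_seq = tf.keras.layers.LSTM(UNITS, return_sequences=True)(
    dec_emb, initial_state=[h, c])
probs = tf.keras.layers.Dense(TGT_VOCAB, activation="softmax")(dec_seq)

model = tf.keras.Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```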
An NMT model using LSTMs to translate a sentence from the source language (Spanish) to the target language (English)
Deep neural network for sequential data