👻 The PyTorch implementation for the IEEE Access paper: "PAC-MAN: Multi-Relation Network in Social Community for Personalized Hashtag Recommendation".
Updated Oct 18, 2023 · Python
✨ The TensorFlow implementation for the IEEE Access paper: "ARERec: Attentive Local Interaction Model for Sequential Recommendation".
An open-source implementation of grouped-query attention from the paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints"
Deep learning methods for sentiment classification of COVID-19 vaccination tweets
Transformers such as T5 and MarianMT enable effective understanding and generation of complex program code, and can consequently help in the data-security field. Let's see how!
Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount of time on any token
Implementation of the LDP module block in PyTorch and Zeta from the paper: "MobileVLM: A Fast, Strong and Open Vision Language Assistant for Mobile Devices"
Implementation of the model "Hedgehog" from the paper: "The Hedgehog & the Porcupine: Expressive Linear Attentions with Softmax Mimicry"
Omni-Modality Processing, Understanding, and Generation
A collection of layers, ops, utilities, and more for Keras, the TensorFlow 2.0 high-level API
A collection of my custom TensorFlow/Keras 2.0+ layers, utilities, and such
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
My implementation of the model KosmosG from "KOSMOS-G: Generating Images in Context with Multimodal Large Language Models"
An active vision system which builds a 3D environment map autonomously using visual attention mechanisms.
Implementation of "PaLM2-VAdapter" from the multimodal model paper: "PaLM2-VAdapter: Progressively Aligned Language Model Makes a Strong Vision-language Adapter"
A PyTorch implementation of the Multi-Mode CNN to reconstruct Chlorophyll-a time series in the global ocean from oceanic and atmospheric physical drivers
Implementation of MambaFormer in PyTorch and Zeta from the paper: "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"
Implementation of an Attention layer where each head can attend to more than just one token, using coordinate descent to pick topk
Implementation of Agent Attention in PyTorch
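Most of the repositories above build on the same core primitive. As a point of reference, here is a minimal scaled dot-product attention sketch in NumPy; the function name and toy shapes are illustrative only and not drawn from any repository listed here.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d)) V for the last two axes."""
    d = q.shape[-1]
    # similarity scores between queries and keys, scaled by sqrt(d)
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)
    # numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # weighted sum of values
    return weights @ v, weights

# toy self-attention example: 3 tokens, dimension 4
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

Each row of `w` is a probability distribution over the three tokens, so the rows sum to 1 and `out` has the same shape as the value matrix.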