Stars: 69
Forks: 4
Language: Python
Last Updated: Oct 01, 2023
Similar Repos
Language | Stars | Description | Updated At |
---|---|---|---|
Python | 438 | Implementing Stand-Alone Self-Attention in Vision Models using Pytorch | Apr 06, 2023 |
Python | 76 | Implementation of Lie Transformer, Equivariant Self-Attention, in Pytorch | Aug 01, 2022 |
Python | 2 | PyTorch Attention Transformer wrapper | Sep 03, 2022 |
Jupyter Notebook | 89 | Official PyTorch implementation of "Rethinking Mobile Block for Efficient Attention-based Models" | May 07, 2023 |
Python | 615 | Implementing Attention Augmented Convolutional Networks using Pytorch | Apr 26, 2023 |
Jupyter Notebook | 3 | Implementing language models using simple NLP techniques to advance Transformer architecture based models in pytorch | Feb 22, 2023 |
Python | 2 | Transformer Language Models using Pytorch Lightning. | Apr 08, 2022 |
Python | 536 | Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (Pytorch and Tensorflow) | May 11, 2023 |
Python | 21 | Revealing example of self-attention, the building block of transformer AI models | Apr 29, 2023 |
Jupyter Notebook | 8 | Classifying questions into class and subclass using Self Attention Transformer model | Nov 19, 2022 |
Jupyter Notebook | 19 | Pytorch implementation of the paper Stand-Alone Self-Attention in Vision Models | Mar 18, 2023 |
Python | 16 | Pytorch implementation of Self-Attention ConvLSTM | May 08, 2023 |
Python | 2 | Implementing Attention Is All You Need paper. Transformer Model | Nov 19, 2021 |
Jupyter Notebook | 41 | Unofficial PyTorch implementation of the paper "cosFormer: Rethinking Softmax In Attention". | Apr 23, 2023 |
Python | 6 | PyTorch implementation of RealFormer: Transformer Likes Residual Attention | Nov 20, 2021 |
Python | 3 | Spatially Separable Attention Transformer Network implemented in Pytorch | Oct 07, 2022 |
Python | 110 | Pytorch code for "Rethinking CNN Models for Audio Classification" | May 04, 2023 |
Python | 5 | Visual Question Answering using Transformer and Bottom-Up attention. Implemented in Pytorch | Mar 10, 2022 |
Python | 2 | PyTorch Implementation of Self-Attention GAN (SAGAN) | Oct 27, 2023 |
Python | 2 | Local self-attention in Transformer for visual question answering | Oct 19, 2023 |
Python | 4 | PyTorch DistributedDataParallel training for Transformer models. | Apr 04, 2023 |
Jupyter Notebook | 13 | Implementations of transformer models in pytorch | Apr 19, 2023 |
Python | 7 | Transformer (Attention Is All You Need) Implementation in Pytorch | Aug 22, 2022 |
Python | 2 | Transformer and Attention Mechanism written in PyTorch Lightning ⚡️ | Jun 10, 2021 |
Python | 3 | A Robust Self-Corrected Retrosynthetic Reaction Predictor using Neural Transformer Models | May 20, 2022 |
Python | 269 | Implementation of Transformer in Transformer, pixel level attention paired with patch level attention for image … | Aug 10, 2022 |
None | 2 | Implementation of Transformer in Transformer, pixel level attention paired with patch level attention for image … | Mar 07, 2021 |
Python | 65 | Self-Attention Generative Adversarial Networks Implementation in PyTorch | Aug 28, 2022 |
Python | 65 | BERT + Self-attention Encoder; Biaffine Decoder; Pytorch Implementation | Sep 01, 2022 |
Python | 14 | Pytorch code for some vision transformer models | Apr 10, 2022 |
Python | 5 | A collection of transformer models, in PyTorch. | Feb 08, 2023 |
Python | 104 | Implementation of Deformable Attention in Pytorch from the paper "Vision Transformer with Deformable Attention" | Aug 11, 2022 |
Python | 99 | PyTorch implementation of some text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | Sep 12, 2022 |
Python | 3 | Simple, self-contained, PyTorch NLP models. | Jan 25, 2023 |
Svelte | 262 | Exploring attention weights in transformer-based models with linguistic knowledge. | Apr 27, 2023 |
None | 2 | Official Pytorch implementations for "SegNeXt: Rethinking Convolutional Attention Design for Semantic Segmentation" (NeurIPS 2022) | Apr 12, 2023 |
Jupyter Notebook | 3 | Robust Wave-Feature Adaptive Heartbeat Classification Based on Self-Attention Mechanism Using a Transformer Model | Dec 17, 2021 |
Python | 2 | NatIR: Image Restoration Using Neighborhood-Attention-Transformer | Mar 20, 2023 |
Python | 5 | A PyTorch implementation for Self-Attention Generative Adversarial Networks | Mar 25, 2023 |
Python | 2353 | Pytorch implementation of Self-Attention Generative Adversarial Networks (SAGAN) | May 14, 2023 |
Jupyter Notebook | 256 | A PyTorch reimplementation of bottom-up-attention models | Apr 16, 2023 |
Jupyter Notebook | 15 | A PyTorch reimplementation of bottom-up-attention models | Mar 24, 2023 |
Python | 136 | A modular PyTorch library for vision transformer models | Aug 24, 2022 |
Python | 853 | An implementation of Performer, a linear attention-based transformer, in Pytorch | Aug 11, 2022 |
Python | 89 | A PyTorch implementation of Transformer in "Attention is All You Need" | May 03, 2023 |
Python | 73 | Official implementation of cosformer-attention in cosFormer: Rethinking Softmax in Attention | Jun 06, 2022 |
Python | 80 | The official implementation of ELSA: Enhanced Local Self-Attention for Vision Transformer | May 24, 2022 |
Python | 2 | ResNet for License Plate Detection & Multi-Head Self Attention Transformer for OCR | Jun 12, 2023 |
Python | 2 | Singularformer: Learning to Decompose Self-Attention to Linearize the Complexity of Transformer | Dec 24, 2023 |
Jupyter Notebook | 10 | PyTorch Implementation of Self-Supervised Learning models | Jan 12, 2023 |