The Wayback Machine - https://web.archive.org/web/20200803215329/https://github.com/topics/attention-is-all-you-need
# attention-is-all-you-need

Here are 61 public repositories matching this topic.
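All of the repositories below implement or build on the Transformer from "Attention Is All You Need" (Vaswani et al., 2017). At the core of every one of them is the same operation, scaled dot-product attention: softmax(QKᵀ/√d_k)V. A minimal numpy sketch (shapes and variable names are illustrative, not taken from any listed repo):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core op of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarities
    # Numerically stable row-wise softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                               # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

The 1/√d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.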
- A PyTorch implementation of the Transformer model in "Attention is All You Need". (Python, updated Jul 8, 2020)
- A TensorFlow implementation of the Transformer: Attention Is All You Need. (Python, updated Feb 14, 2020)
- Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet. (Python, updated Jul 31, 2020)
- Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN. (Python, updated Jan 1, 2019)
- A Keras+TensorFlow implementation of the Transformer: Attention Is All You Need. (Python, updated Feb 28, 2020)
- A benchmark of text classification in PyTorch. (Python, updated Aug 16, 2019)
- Neural Machine Translation with Keras. (Python, updated Jul 3, 2020)
- A PyTorch implementation of Speech Transformer, an end-to-end ASR with Transformer network on Mandarin Chinese. (Python, updated May 7, 2020)
- A PyTorch implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation". (Python, updated Dec 10, 2018)
- A PyTorch implementation of "Attention Is All You Need". (Python, updated Jan 23, 2018)
- A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN) (Jupyter Notebook, updated Aug 20, 2018)
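Several entries above (the LARNN cell, the Weighted Transformer) mention multi-head attention: the model dimension is split into independent heads, each head attends separately, and the results are concatenated. A minimal numpy sketch of the head-splitting idea; the learned per-head projection matrices of the real layer are omitted for brevity, and all names are illustrative:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_attention(Q, K, V, n_heads):
    """Split d_model into n_heads slices, attend per head, concatenate.
    Real implementations also apply learned projections W_Q, W_K, W_V, W_O."""
    d_model = Q.shape[-1]
    assert d_model % n_heads == 0, "d_model must be divisible by n_heads"
    d_k = d_model // n_heads
    head_outputs = []
    for h in range(n_heads):
        sl = slice(h * d_k, (h + 1) * d_k)           # this head's feature slice
        q, k, v = Q[:, sl], K[:, sl], V[:, sl]
        w = softmax(q @ k.T / np.sqrt(d_k))          # per-head attention weights
        head_outputs.append(w @ v)
    return np.concatenate(head_outputs, axis=-1)     # back to (n_q, d_model)

rng = np.random.default_rng(1)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
mh_out = multi_head_attention(Q, K, V, n_heads=2)
print(mh_out.shape)  # (2, 4)
```

Each head can learn to attend to different positions or relations, which is the motivation given in the original paper for using several small heads rather than one large one.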
- List of efficient attention modules. (Python, updated Aug 2, 2020)
- (no description provided) (Python, updated Jun 5, 2019)
- Open-source project for Korean end-to-end (E2E) automatic speech recognition (ASR) in PyTorch, for deep learning researchers. (Python, updated Aug 2, 2020)
- Transformers without Tears: Improving the Normalization of Self-Attention. (Python, updated Apr 25, 2020)
- Multi-head attention for image classification. (Python, updated Apr 19, 2018)
- Neutron: a PyTorch-based implementation of the Transformer and its variants. (Python, updated Jun 17, 2020)
- Abstractive summarization using Transformers. (Jupyter Notebook, updated Jan 1, 2020)
- (no description provided) (Python, updated Feb 21, 2019)
- A PyTorch implementation of the Transformer model from "Attention Is All You Need". (Python, updated Jul 13, 2019)
- (no description provided) (Python, updated Aug 30, 2018)
- Experiments on multilingual NMT. (Python, updated Jun 22, 2020)
- Implementation of the "Attention is All You Need" paper. (Python, updated Jun 21, 2020)
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need". (Python, updated Apr 21, 2019)
- Transformer-based SeqGAN for language generation. (Python, updated Dec 10, 2018)
- A simple TensorFlow implementation of the Transformer. (Python, updated Jan 7, 2019)
- Machine translation using Transformers. (Jupyter Notebook, updated Jan 1, 2020)
- Witwicky: an implementation of the Transformer in PyTorch. (Python, updated Oct 15, 2019)
- Fake news detection by learning convolution filters through contextualized attention. (Python, updated May 14, 2020)
- Attention Is All You Need | a PyTorch tutorial to machine translation. (Python, updated Jun 3, 2020)