attention-mechanism
Here are 599 public repositories matching this topic...


Help is needed with retraining and cross-validation to check whether the ROUGE scores match (or exceed) the numbers reported in the paper.
I only trained for 500k iterations (batch size 8) with pointer generation enabled and coverage loss disabled, then a further 100k iterations (batch size 8) with pointer generation enabled and coverage loss enabled.
It would be great if someone can help re-r
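The coverage loss referred to above (from See et al.'s pointer-generator work) penalizes the decoder for repeatedly attending to the same source tokens. A minimal NumPy sketch, assuming each decoder step produces an attention distribution over the source (the function name and array shapes here are illustrative, not from any particular repo):

```python
import numpy as np

def coverage_loss(attn_steps):
    """Coverage loss: at each decoder step t, penalize overlap between
    the current attention distribution a_t and the coverage vector
    c_t = sum of the attention distributions from steps 1..t-1.

    attn_steps: array of shape (T, src_len); each row is an attention
    distribution over the source tokens at one decoder step.
    Returns the mean per-step loss: mean_t sum_i min(a_t[i], c_t[i]).
    """
    attn_steps = np.asarray(attn_steps, dtype=float)
    coverage = np.zeros(attn_steps.shape[1])  # c_1 = 0: nothing covered yet
    total = 0.0
    for a_t in attn_steps:
        # covloss_t = sum_i min(a_t[i], c_t[i]) -- high when re-attending
        total += np.minimum(a_t, coverage).sum()
        coverage += a_t  # accumulate attention into the coverage vector
    return total / len(attn_steps)
```

Attending to the same token twice incurs loss, while spreading attention across the source does not, which is why enabling this term only for the final training phase (as described above) is the usual recipe.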