natural-language-generation
Here are 221 public repositories matching this topic...
Well, the Gumbel distribution is magical. Given a sequence of K logits, i.e., $\log a_1, \log a_2, \ldots, \log a_K$, and K independent Gumbel(0, 1) random variables $g_1, g_2, \ldots, g_K$, we have

$$\operatorname{argmax}_i \,(\log a_i + g_i) \sim \mathrm{Categorical}\!\left(\frac{a_i}{\sum_j a_j}\right)$$

This gives you a very simple way to sample from a categorical distribution using nothing but argmax and Gumbel noise (the Gumbel-max trick).
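As a concrete illustration, here is a minimal pure-Python sketch of the Gumbel-max trick described above (the function name `gumbel_max_sample` is my own, not from any particular library):

```python
import math
import random

def gumbel_max_sample(logits, rng=random):
    """Draw one categorical sample via the Gumbel-max trick.

    logits[i] = log a_i; returns index i with probability a_i / sum(a).
    """
    noisy = []
    for logit in logits:
        u = max(rng.random(), 1e-300)    # uniform in (0, 1), guarding against log(0)
        g = -math.log(-math.log(u))      # one Gumbel(0, 1) sample
        noisy.append(logit + g)
    # The argmax of (log a_i + g_i) is distributed Categorical(a / sum(a)).
    return max(range(len(noisy)), key=noisy.__getitem__)
```

Sampling repeatedly should reproduce the softmax probabilities of the logits, which is easy to check empirically with a frequency count.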


The architecture
GPT2ForSequenceClassification was added in #7501 in PyTorch. It would be great to have it in TensorFlow (cf. issue #7622), but it would also be great to have it for the other causal models: OpenAI GPT, CTRL, TransfoXL. Currently working on OpenAI GPT: @fmcurti (done). Below is a list of items to follow for the integration of such an architecture.
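One detail worth noting for anyone porting this head to other causal models: because of causal attention, only the last real (non-padding) token has attended to the whole sequence, so the classification head pools that token's hidden state. A minimal sketch of that pooling logic (the helper name `last_token_pool` is mine, and hidden states are represented as plain lists for illustration):

```python
def last_token_pool(hidden_states, attention_mask):
    """Pick the hidden state of the last non-padding token.

    hidden_states: list of per-token feature vectors (one per position).
    attention_mask: list of 0/1 flags, 1 marking real tokens.
    In a causal LM only this final real position has seen the full
    sequence, so it is the natural input to a classification head.
    """
    last = max(i for i, m in enumerate(attention_mask) if m)
    return hidden_states[last]
```

In the actual PyTorch head, a linear layer is then applied to this pooled vector to produce the class logits.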