natural-language-understanding
Here are 430 public repositories matching this topic...
Well, the Gumbel distribution is magical. Given a sequence of K logits, i.e., "\log a_1, \log a_2, ..., \log a_K", and K independent standard Gumbel random variables, i.e., "g_1, g_2, ..., g_K", we have
\argmax_i (\log a_i + g_i) \sim \mathrm{Categorical}(a_i / \sum_j a_j)
This gives you a very simple way to sample from a categorical distribution: add Gumbel noise to the logits and take the argmax.
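The trick above can be sketched in a few lines of NumPy. This is an illustrative check of the identity, not code from any particular repository; it samples standard Gumbel noise, adds it to the logits, and verifies empirically that the argmax frequencies match the categorical probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_max_sample(logits, rng):
    # g_i = -log(-log(U)) with U ~ Uniform(0, 1) is standard Gumbel noise;
    # numpy draws it directly via Generator.gumbel.
    g = rng.gumbel(size=len(logits))
    # argmax over perturbed logits is distributed Categorical(softmax(logits))
    return int(np.argmax(logits + g))

# Empirical check: sample frequencies should approach a / sum(a)
a = np.array([0.5, 0.3, 0.2])
logits = np.log(a)
draws = [gumbel_max_sample(logits, rng) for _ in range(100_000)]
freqs = np.bincount(draws, minlength=3) / len(draws)
print(freqs)  # roughly [0.5, 0.3, 0.2]
```

Note that only the argmax is kept, so any constant shift of the logits (e.g. an unnormalized log-density) leaves the sampled distribution unchanged.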


The architecture
GPT2ForSequenceClassification was added in #7501 in PyTorch. It would be great to have it in TensorFlow (cf. issue #7622), but it would also be great to have it for the other causal models: OpenAI GPT, CTRL, TransfoXL. Currently working on OpenAI GPT: @fmcurti
Below is a list of items to follow to make sure the integration of such an architecture is complete.
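One detail that every causal-model port above shares: unlike BERT, these models have no [CLS] token, so the classification head is applied to the hidden state of the last non-padding token in each sequence. The sketch below shows that pooling step in plain NumPy; the function name and signature are illustrative assumptions, not the transformers API, and it assumes right-padded batches:

```python
import numpy as np

def pool_last_token(hidden_states, input_ids, pad_token_id):
    # hidden_states: (batch, seq_len, hidden) array of final-layer states.
    # For causal models (GPT-2, OpenAI GPT, CTRL, TransfoXL) the sequence
    # summary is the state of the last real token, so find its index per row.
    last_idx = (input_ids != pad_token_id).sum(axis=1) - 1
    return hidden_states[np.arange(hidden_states.shape[0]), last_idx]

# Toy batch: 2 sequences of length 3, hidden size 2, pad id 0.
hidden = np.arange(12, dtype=float).reshape(2, 3, 2)
ids = np.array([[5, 6, 0],
                [7, 0, 0]])
pooled = pool_last_token(hidden, ids, pad_token_id=0)
print(pooled)  # states at token index 1 (row 0) and index 0 (row 1)
```

A classification head is then just a linear layer on `pooled`; getting this indexing right (rather than always taking position -1) is what makes the head work on padded batches.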