# transformers
Here are 769 public repositories matching this topic...
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Updated Aug 21, 2021 - Python
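For reference, a minimal usage sketch of the Vision Transformer implementation above; the constructor arguments follow the lucidrains/vit-pytorch README as of mid-2021 and may differ in other versions:

```python
import torch
from vit_pytorch import ViT

# Patchify a 256x256 image into 32x32 patches and classify with a single
# transformer encoder, as the repository description says.
v = ViT(
    image_size=256,
    patch_size=32,
    num_classes=1000,
    dim=1024,
    depth=6,
    heads=16,
    mlp_dim=2048,
)

img = torch.randn(1, 3, 256, 256)
preds = v(img)  # (1, 1000) class logits
```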
Updated Aug 19, 2021 - Rust
machine-learning reinforcement-learning deep-learning transformers pytorch transformer gan neural-networks deep-learning-tutorial optimizers
Updated Aug 22, 2021 - Jupyter Notebook
Simple command line tool for text-to-image generation using OpenAI's CLIP and Siren (implicit neural representation network). The technique was originally created by https://twitter.com/advadnoun
deep-learning transformers artificial-intelligence siren text-to-image multi-modality implicit-neural-representation
Updated Jul 4, 2021 - Python
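A hedged sketch of how the tool above is typically invoked; the `imagine` CLI and the `Imagine` class follow the deep-daze README as of 2021, and the exact arguments should be treated as assumptions:

```python
# Command line (after `pip install deep-daze`):
#   $ imagine "a cabin in the snowy woods"
# Roughly equivalent Python API:
from deep_daze import Imagine

trainer = Imagine(
    text="a cabin in the snowy woods",
    num_layers=24,  # depth of the SIREN network
)
trainer()  # optimizes the SIREN image against CLIP's text embedding
```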
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Updated Aug 20, 2021 - Python
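A rough sketch of the two-stage setup implied above; the `DiscreteVAE` and `DALLE` argument names are recalled from the 2021 lucidrains/DALLE-pytorch README and should be treated as assumptions:

```python
import torch
from dalle_pytorch import DiscreteVAE, DALLE

# Stage 1: a discrete VAE that tokenizes 256x256 images into a codebook.
vae = DiscreteVAE(image_size=256, num_layers=3, num_tokens=8192,
                  codebook_dim=512, hidden_dim=64)

# Stage 2: an autoregressive transformer over [text tokens, image tokens].
dalle = DALLE(dim=1024, vae=vae, num_text_tokens=10000,
              text_seq_len=256, depth=12, heads=16)

text = torch.randint(0, 10000, (2, 256))
images = torch.randn(2, 3, 256, 256)
loss = dalle(text, images, return_loss=True)
loss.backward()
```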
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
nlp natural-language-processing tutorial sentiment-analysis word-embeddings transformers cnn pytorch recurrent-neural-networks lstm rnn fasttext bert sentiment-classification pytorch-tutorial pytorch-tutorials cnn-text-classification lstm-sentiment-analysis pytorch-nlp torchtext
Updated Jul 15, 2021 - Jupyter Notebook
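The tutorials cover CNN, RNN/LSTM, fastText and BERT approaches; as a flavour of the simplest of these, here is a generic bidirectional-LSTM sentiment classifier in plain PyTorch (a sketch, not code taken from the repository itself):

```python
import torch
import torch.nn as nn

class LSTMSentiment(nn.Module):
    """Embed token ids, run a bidirectional LSTM, classify from the final hidden states."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=256, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(hidden_dim * 2, num_classes)

    def forward(self, token_ids):                    # (batch, seq_len)
        embedded = self.embedding(token_ids)
        _, (hidden, _) = self.lstm(embedded)         # hidden: (2, batch, hidden_dim)
        return self.fc(torch.cat([hidden[0], hidden[1]], dim=-1))

model = LSTMSentiment(vocab_size=25000)
logits = model(torch.randint(0, 25000, (8, 64)))     # (8, 2) sentiment logits
```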
A PyTorch-based Speech Toolkit
audio transformers pytorch voice-recognition speech-recognition speech-to-text language-model speaker-recognition speaker-verification speech-processing audio-processing asr speaker-diarization speechrecognition speech-separation speech-enhancement spoken-language-understanding huggingface speech-toolkit speechbrain
Updated Aug 21, 2021 - Python
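A minimal sketch of loading a pretrained SpeechBrain ASR model; the `EncoderDecoderASR` interface and the model identifier follow the SpeechBrain documentation from 2021 and may have moved in later releases:

```python
from speechbrain.pretrained import EncoderDecoderASR

# Downloads a pretrained CRDNN + RNN-LM LibriSpeech recipe from the HuggingFace Hub.
asr_model = EncoderDecoderASR.from_hparams(
    source="speechbrain/asr-crdnn-rnnlm-librispeech",
    savedir="pretrained_models/asr-crdnn-rnnlm-librispeech",
)
print(asr_model.transcribe_file("example.wav"))
```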
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Updated Jun 10, 2021 - Python
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Updated Aug 18, 2021 - Python
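A hedged sketch of the classification workflow in the library above (Simple Transformers); the `ClassificationModel` API shown follows the project's documentation, while the tiny DataFrame is an invented placeholder:

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Two-column DataFrame: text and integer label (placeholder data).
train_df = pd.DataFrame(
    [["this library is great", 1], ["this broke my pipeline", 0]],
    columns=["text", "labels"],
)

model = ClassificationModel("bert", "bert-base-uncased", num_labels=2, use_cuda=False)
model.train_model(train_df)
predictions, raw_outputs = model.predict(["works out of the box"])
```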
State of the Art Natural Language Processing
nlp natural-language-processing spark sentiment-analysis text-classification tensorflow machine-translation transformers language-detection pyspark named-entity-recognition seq2seq lemmatizer spell-checker albert bert part-of-speech-tagger entity-extraction spark-ml xlnet
Updated Aug 21, 2021 - Scala
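A minimal sketch of running one of Spark NLP's pretrained pipelines from PySpark; `explain_document_dl` is one of the library's stock pipelines, though the output keys may vary by version:

```python
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()  # spins up a SparkSession with Spark NLP on the classpath
pipeline = PretrainedPipeline("explain_document_dl", lang="en")

result = pipeline.annotate("Spark NLP ships hundreds of pretrained pipelines and models.")
print(result["entities"])
print(result["pos"])
```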
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
benchmark tensorflow nlu glue corpus transformers pytorch dataset chinese pretrained-models language-model albert bert roberta chineseglue
Updated Aug 3, 2021 - Python
Super easy library for BERT-based NLP models
Updated Aug 2, 2021 - Python
Reformer, the efficient Transformer, in Pytorch
Updated May 9, 2021 - Python
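A sketch of instantiating the Reformer language model from this package; the argument names follow the lucidrains/reformer-pytorch README from 2021 and are best treated as assumptions:

```python
import torch
from reformer_pytorch import ReformerLM

model = ReformerLM(
    num_tokens=20000,
    dim=512,
    depth=6,
    heads=8,
    max_seq_len=8192,
    causal=True,        # autoregressive LM; LSH attention keeps long sequences tractable
)

x = torch.randint(0, 20000, (1, 8192))
logits = model(x)       # (1, 8192, 20000)
```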
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
nlp machine-learning topic transformers topic-modeling bert topic-models sentence-embeddings topic-modelling ldavis
Updated Aug 11, 2021 - Python
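A minimal sketch of the fit/transform workflow of the topic-modelling library described above (the BERTopic API), using 20 Newsgroups as placeholder data:

```python
from sklearn.datasets import fetch_20newsgroups
from bertopic import BERTopic

docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]

# Embeds documents with sentence-transformers, clusters them, then describes
# each cluster with class-based TF-IDF (c-TF-IDF) keywords.
topic_model = BERTopic()
topics, probs = topic_model.fit_transform(docs)

print(topic_model.get_topic_info().head())  # topic sizes and top keywords
print(topic_model.get_topic(0))             # keywords for topic 0
```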
jiant is an NLP toolkit
Updated Jul 26, 2021 - Python
MLeap: Deploy ML Pipelines to Production
Updated Aug 3, 2021 - Scala
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Updated Aug 21, 2021 - Python
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Updated Aug 20, 2021 - Python
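A minimal sketch of wiring up a decoder-only model with this library; the `TransformerWrapper`/`Decoder` names follow the lucidrains/x-transformers README as of 2021:

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens=20000,
    max_seq_len=1024,
    attn_layers=Decoder(dim=512, depth=6, heads=8),  # experimental features are toggled here
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```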
Research and applications of three core technologies: natural language processing, knowledge graphs, and dialogue systems.
Updated Jan 3, 2021
bradennapier commented Mar 11, 2020
Hey! Thanks for the work on this.
Wondering how we can use this with mocha? tsconfig-paths has its own tsconfig-paths/register to make this work:
https://github.com/dividab/tsconfig-paths#with-mocha-and-ts-node
Basically, with mocha we have to run mocha -r ts-node/register -- but that wouldn't have the compiler flag.
It would be worthwhile to have the ability to do this, which looks like
Generative Adversarial Transformers
transformers attention image-generation gans generative-adversarial-networks compositionality scene-generation
Updated Jul 20, 2021 - Python
Korean BERT pre-trained cased (KoBERT)
Updated Jul 22, 2021 - Jupyter Notebook
An implementation of Performer, a linear attention-based transformer, in Pytorch
Updated Aug 21, 2021 - Python
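A sketch of the Performer language model exposed by this package; the `PerformerLM` arguments follow the 2021 README and should be double-checked against the current version:

```python
import torch
from performer_pytorch import PerformerLM

model = PerformerLM(
    num_tokens=20000,
    max_seq_len=2048,
    dim=512,
    depth=6,
    heads=8,
    causal=True,   # FAVOR+ linear attention in place of full softmax attention
)

x = torch.randint(0, 20000, (1, 2048))
logits = model(x)  # (1, 2048, 20000)
```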
This Word Does Not Exist
machine-learning natural-language-processing transformers natural-language-generation natural-language-understanding gpt-2
Updated Jan 12, 2021 - Python
This repo contains a PyTorch implementation of a pretrained BERT model for multi-label text classification.
nlp text-classification transformers pytorch multi-label-classification albert bert fine-tuning pytorch-implmention xlnet
Updated Jun 2, 2021 - Python
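Not the repository's own code, but a generic sketch of the multi-label setup it describes, using Hugging Face transformers: one sigmoid per label with a binary cross-entropy loss instead of a single softmax:

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

NUM_LABELS = 6  # e.g. toxic-comment categories (placeholder)
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=NUM_LABELS)

batch = tokenizer(["an example comment"], return_tensors="pt", padding=True, truncation=True)
labels = torch.tensor([[1., 0., 0., 1., 0., 0.]])  # several labels can be active at once

logits = model(**batch).logits                       # (1, NUM_LABELS)
loss = torch.nn.BCEWithLogitsLoss()(logits, labels)  # independent binary loss per label
probs = torch.sigmoid(logits)                        # per-label probabilities
```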
nlp natural-language-processing information-retrieval deep-learning transformers pytorch artificial-intelligence question-answering reading-comprehension bert
Updated Apr 30, 2020 - Python
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
machine-learning deep-learning machine-learning-algorithms transformers artificial-intelligence transformer attention attention-mechanism self-attention
Updated Jul 26, 2021 - Python
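To make the idea concrete, here is a generic scaled dot-product self-attention block in plain PyTorch (a from-scratch sketch, not this repository's API), of the kind such vision modules build on:

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Multi-head scaled dot-product self-attention over a token sequence."""
    def __init__(self, dim, heads=8):
        super().__init__()
        self.heads = heads
        self.scale = (dim // heads) ** -0.5
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                                   # x: (batch, tokens, dim)
        b, n, d = x.shape
        q, k, v = (t.view(b, n, self.heads, -1).transpose(1, 2)
                   for t in self.to_qkv(x).chunk(3, dim=-1))
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)

patches = torch.randn(2, 196, 256)      # e.g. 14x14 image patches with dim 256
out = SelfAttention(dim=256)(patches)   # (2, 196, 256)
```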
Implementation of Bottleneck Transformer in Pytorch
Updated Feb 1, 2021 - Python


Problem
Since Java 8 was introduced, there is no need to use Joda, as it has been replaced by the native Date-Time API (java.time).
Solution
Ideally, grepping and replacing the text should work (mostly).
Additional context
Need to check whether de/serialization will still work.