gpt
Here are 296 public repositories matching this topic...
An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Updated Feb 25, 2022 - Python
LightSeq: a high-performance library for sequence processing and generation.
Updated Jan 6, 2023 - C++
An open-source pre-training model framework in PyTorch, with a pre-trained model zoo.
Updated Oct 20, 2022 - Python
Transformer-related optimizations, including BERT and GPT.
Updated Jan 6, 2023 - C++
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...).
Updated Jan 5, 2023 - Rust
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
Updated Jan 6, 2023 - Python
KakaoBrain KoGPT (Korean Generative Pre-trained Transformer).
Updated Jan 3, 2023 - Python
Simple implementations of NLP models, with tutorials written in Chinese at https://mofanpy.com.
Updated Nov 21, 2022 - Python
PatrickStar enables larger, faster, greener pre-trained models for NLP and democratizes AI for everyone.
Updated Dec 21, 2022 - Python
Super UEFIinSecureBoot Disk: boot any OS or .efi file without disabling UEFI Secure Boot.
Updated Jun 20, 2022
An easy-to-use natural language processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.
Updated Nov 30, 2021 - Jupyter Notebook

