The Wayback Machine - https://web.archive.org/web/20200803215646/https://github.com/topics/gpt-2
Here are 140 public repositories matching this topic...
Chinese version of GPT2 training code, using BERT tokenizer.
Updated Apr 8, 2020 · Python
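A minimal sketch of the core idea behind that repo (a GPT2-style causal language model paired with a Chinese BERT tokenizer), using the Hugging Face transformers library; the hyperparameters and example sentence are placeholder assumptions, not the repo's actual training script.

```python
# Sketch only: size a GPT2 causal LM from a Chinese BERT tokenizer's vocabulary.
from transformers import BertTokenizerFast, GPT2Config, GPT2LMHeadModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
config = GPT2Config(vocab_size=tokenizer.vocab_size, n_positions=512,
                    n_embd=768, n_layer=12, n_head=12)
model = GPT2LMHeadModel(config)  # randomly initialized; training would start here

batch = tokenizer("今天天气不错", return_tensors="pt")  # "The weather is nice today"
outputs = model(input_ids=batch["input_ids"], labels=batch["input_ids"])
print(outputs.loss)  # language-modeling loss for one training step
```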
Toolkit for Machine Learning, Natural Language Processing, and Text Generation, in TensorFlow
Updated Jul 29, 2020 · Python
Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT2 language embeddings.
Updated Jul 5, 2020 · Python
🦄 State-of-the-Art Conversational AI with Transfer Learning
Updated Jun 8, 2020 · Python
Large-scale pretraining for dialogue
Updated Jul 29, 2020 · Python
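The DialoGPT checkpoints are also published on the Hugging Face model hub, so a short, hedged sketch of single-turn response generation looks roughly like this (model name and decoding settings are assumptions, not the repo's pretraining code):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# Encode one user turn, terminated by the end-of-sequence token.
inputs = tokenizer.encode("Does money buy happiness?" + tokenizer.eos_token,
                          return_tensors="pt")
reply_ids = model.generate(inputs, max_length=100,
                           pad_token_id=tokenizer.eos_token_id)
# Decode only the newly generated tokens, i.e. the model's reply.
print(tokenizer.decode(reply_ids[0, inputs.shape[-1]:], skip_special_tokens=True))
```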
🛸 spaCy pipelines for pre-trained BERT, XLNet and GPT-2
Updated Jul 29, 2020 · Python
GPT2 for Chinese chitchat (a GPT2 model for Chinese casual conversation, implementing the MMI idea from DialoGPT)
Updated May 16, 2020 · Python
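The MMI idea mentioned above, as described in the DialoGPT paper, is to sample several candidate replies from a forward dialogue model and rerank them with a backward model that scores how well each reply predicts the original context. A hedged sketch of that reranking step (checkpoint names are placeholders; the repo ships its own Chinese dialogue and MMI models):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
forward_model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
backward_model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")  # stand-in for an MMI model

ctx_ids = tok.encode("How was your day?" + tok.eos_token, return_tensors="pt")
candidates = forward_model.generate(ctx_ids, do_sample=True, top_k=50, max_length=64,
                                    num_return_sequences=5,
                                    pad_token_id=tok.eos_token_id)

def backward_loss(reply_ids):
    """Loss of predicting the context given the reply; lower means a less generic reply."""
    seq = torch.cat([reply_ids, ctx_ids], dim=-1)
    labels = seq.clone()
    labels[:, : reply_ids.shape[-1]] = -100  # only context tokens count toward the loss
    return backward_model(seq, labels=labels).loss.item()

replies = [c[ctx_ids.shape[-1]:].unsqueeze(0) for c in candidates]
best = min(replies, key=backward_loss)
print(tok.decode(best[0], skip_special_tokens=True))
```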
GPT2 for Multiple Languages, including pretrained models: multilingual GPT2 support, with a 1.5-billion-parameter Chinese pretrained model.
Updated May 29, 2020 · Python
Integrating the Best of TF into PyTorch, for Machine Learning, Natural Language Processing, and Text Generation
Updated Jul 29, 2020 · Python
Updated May 19, 2020 · Python
Simple text generator built on an OpenAI GPT-2 PyTorch implementation
Updated Jul 8, 2019 · Python
Medical Q&A with Deep Language Models
Updated Nov 13, 2019 · Jupyter Notebook
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝
Updated Jul 18, 2020 · Python
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Updated Jul 26, 2020 · Rust
Text-generation API via GPT-2 for Cloud Run
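A minimal sketch of what such an HTTP endpoint can look like, using Flask and the transformers text-generation pipeline rather than the repo's actual stack; the route name and model checkpoint are assumptions:

```python
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
generator = pipeline("text-generation", model="gpt2")  # placeholder checkpoint

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json().get("prompt", "")
    out = generator(prompt, max_length=100, num_return_sequences=1)
    return jsonify({"text": out[0]["generated_text"]})

if __name__ == "__main__":
    # Cloud Run containers are expected to listen on port 8080 by default.
    app.run(host="0.0.0.0", port=8080)
```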
Your new Telegram buddy trained on Reddit discussions using DialoGPT
Updated May 19, 2020 · Jupyter Notebook
Python script to download public Tweets from a given Twitter account into a format suitable for AI text generation.
Updated May 21, 2020 · Python
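A hedged sketch of the same idea using Tweepy against Twitter's v1.1 API (not necessarily the client this script uses); the credentials and account name are placeholders, and writing one tweet per line between start/end tokens is just one common convention for GPT-2 fine-tuning corpora:

```python
import tweepy

# Placeholder credentials; a real run needs Twitter API keys.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True)

with open("tweets.txt", "w", encoding="utf-8") as f:
    for status in tweepy.Cursor(api.user_timeline, screen_name="example_account",
                                tweet_mode="extended", include_rts=False).items(1000):
        text = status.full_text.replace("\n", " ").strip()
        if text:
            f.write("<|startoftext|>" + text + "<|endoftext|>\n")
```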
A bot that generates realistic replies using a combination of pretrained GPT-2 and BERT models
Updated Feb 19, 2020 · Jupyter Notebook
Transformer language model (GPT-2) with sentencepiece tokenizer
Updated Jul 24, 2020 · Python
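For context, training and using a SentencePiece vocabulary looks roughly like this (file names and sizes are placeholders, not the repo's configuration):

```python
import sentencepiece as spm

# Train a BPE vocabulary on a raw text corpus (one sentence per line).
spm.SentencePieceTrainer.train(input="corpus.txt", model_prefix="gpt2_sp",
                               vocab_size=32000, model_type="bpe")

sp = spm.SentencePieceProcessor(model_file="gpt2_sp.model")
ids = sp.encode("Transformer language models are fun.", out_type=int)
print(ids)
print(sp.decode(ids))
```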
A list of pretrained Transformer models for the Russian language.
Updated Feb 3, 2020 · Jupyter Notebook
OpenAI GPT2 pre-training and sequence prediction implementation in TensorFlow 2.0
Updated May 16, 2020 · Python
Load GPT-2 checkpoint and generate texts
Updated Jul 8, 2020 · Python
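For comparison, the equivalent load-and-generate step using the Hugging Face transformers checkpoints (not this repo's own loader) is roughly:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode("In a shocking finding,", return_tensors="pt")
output = model.generate(input_ids, max_length=60, do_sample=True, top_k=40,
                        top_p=0.95, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```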
Code and UI for running a Magic card text generator API via GPT-2
Updated Jul 28, 2019 · HTML
Byseqlib: A High Performance Inference Library for Sequence Processing and Generation
Fine-tuned pre-trained GPT2 for custom, topic-specific text generation. Such a system can be used for text augmentation.
Updated Mar 1, 2020 · Python
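A minimal, hedged sketch of such topic-specific fine-tuning with the transformers library (tiny placeholder corpus, no evaluation or checkpointing; a real run would use the repo's own data and settings):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Placeholder in-domain corpus; real fine-tuning needs many topic-specific documents.
texts = ["A short sentence about the target topic."] * 8
batch = tokenizer(texts, padding=True, truncation=True, max_length=128,
                  return_tensors="pt")
labels = batch["input_ids"].clone()
labels[batch["attention_mask"] == 0] = -100  # ignore padding positions in the loss

model.train()
for epoch in range(3):
    outputs = model(input_ids=batch["input_ids"],
                    attention_mask=batch["attention_mask"], labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"epoch {epoch}: loss {outputs.loss.item():.3f}")
```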
Generate realistic Instagram captions using transformers 🤗
Updated Jul 1, 2020 · Python
PyTorch library for end-to-end transformer model training, inference, and serving
Updated Jul 29, 2020 · Python
Galois is an auto code completer for code editors (or any text editor) based on OpenAI GPT-2.
Updated Aug 15, 2019 · Python
This repo contains all the notebooks mentioned in the blog.
Updated Jan 26, 2020 · Jupyter Notebook
Improve this page: add a description, image, and links to the gpt-2 topic page so that developers can more easily learn about it.
Add this topic to your repo: to associate your repository with the gpt-2 topic, visit your repo's landing page and select "manage topics."