Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Updated Aug 25, 2023 · Python
Neural question generation using transformers
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
GitHub repo with tutorials for fine-tuning transformers on different NLP tasks
One line of code for thousands of state-of-the-art NLP models in hundreds of languages. The fastest and most accurate way to solve text problems.
⚡ Boost the inference speed of T5 models by 5x and reduce model size by 3x.
Crosslingual Generalization through Multitask Finetuning
simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models.
An NLP system for generating reading comprehension questions
MinT: Minimal Transformer Library and Tutorials
Multilingual/multidomain question generation datasets, models, and python library for question generation.
Code and data for crosstalk text generation tasks, exploring whether large models and pre-trained language models can understand humor.
A wrapper around LLMs that biases their behaviour using prompts and contexts in a manner transparent to end users
An easy-to-use and easy-to-understand multiple-choice question generation algorithm using T5 Transformers.
Chinese AI writing (generating poems or couplets)
Text-to-SQL in the Wild: A Naturally-Occurring Dataset Based on Stack Exchange Data
Abstractive and extractive text summarization using Transformers.