Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
CodeGen is an open-source model for program synthesis, trained on TPU-v4 and competitive with OpenAI Codex.
Training and serving large-scale neural networks
Integrate human supervision into your platform for all training data types: image, video, 3D, text, geo, audio, compound, grid, LLM, GPT, conversational, and more.
Integrate cutting-edge LLM technology quickly and easily into your apps
Community for applying LLMs to robotics and a robot simulator with ChatGPT integration
Adding guardrails to large language models.
A curated list of modern Generative Artificial Intelligence projects and services
Basaran is an open-source alternative to the OpenAI text completion API. It provides a compatible streaming API for your Hugging Face Transformers-based text generation models.
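Because Basaran mirrors the OpenAI text completion API, its streamed responses arrive as OpenAI-style server-sent events ("data: {json}" lines, terminated by "data: [DONE]"). A minimal sketch of consuming such a stream; the payload lines here are invented for illustration, not captured from Basaran itself:

```python
import json

def parse_sse_stream(lines):
    """Join completion text from OpenAI-style server-sent-event lines.

    Each event line looks like 'data: {json}'; the stream ends with a
    'data: [DONE]' sentinel. Blank keep-alive lines are skipped.
    """
    chunks = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # not an event line
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        event = json.loads(payload)
        chunks.append(event["choices"][0]["text"])
    return "".join(chunks)

# Example: two token events followed by the end sentinel
stream = [
    'data: {"choices": [{"text": "Hello"}]}',
    '',
    'data: {"choices": [{"text": ", world"}]}',
    'data: [DONE]',
]
print(parse_sse_stream(stream))  # -> Hello, world
```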
An open collection of implementation tips, tricks and resources for training large language models
A high-performance serving framework for ML models, offering dynamic batching and a multi-stage pipeline to fully utilize your compute hardware.
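The core idea behind dynamic batching is to buffer individual requests and invoke the model once per accumulated batch, amortizing per-call overhead. A toy sketch of that buffering logic (not this framework's actual API; real servers also flush on a timeout, omitted here):

```python
from typing import Callable, List

class DynamicBatcher:
    """Toy dynamic batcher: buffer requests and run the model on the
    whole batch once `max_batch` requests have accumulated, or on an
    explicit flush()."""

    def __init__(self, model_fn: Callable[[List[str]], List[str]],
                 max_batch: int = 4):
        self.model_fn = model_fn      # processes a whole batch in one call
        self.max_batch = max_batch
        self.pending: List[str] = []  # buffered requests
        self.results: List[str] = []  # outputs in submission order

    def submit(self, request: str) -> None:
        self.pending.append(request)
        if len(self.pending) >= self.max_batch:
            self.flush()  # batch is full: run the model now

    def flush(self) -> None:
        if self.pending:
            self.results.extend(self.model_fn(self.pending))
            self.pending = []

# Stub "model" standing in for a batched forward pass
batcher = DynamicBatcher(lambda batch: [s.upper() for s in batch], max_batch=2)
for req in ["a", "b", "c"]:
    batcher.submit(req)
batcher.flush()  # drain the leftover partial batch
print(batcher.results)  # -> ['A', 'B', 'C']
```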
ChatGPT and Bing AI prompt curation
Essential guides and programming tools in my toolbox (with a focus on ML training)
Chain together LLMs for reasoning and orchestrate multiple large models to accomplish complex tasks.
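Chaining in its simplest form is function composition: each model's output becomes the next model's input. A minimal sketch with hypothetical stub models standing in for real LLM calls (no actual API involved):

```python
from typing import Callable, List

Model = Callable[[str], str]  # a model here is just text -> text

def chain(models: List[Model], prompt: str) -> str:
    """Pipe each model's output into the next model in the list."""
    text = prompt
    for model in models:
        text = model(text)
    return text

# Stub "models": one drafts a plan, the next pretends to execute it
planner = lambda task: f"plan for: {task}"
executor = lambda plan: f"done ({plan})"

print(chain([planner, executor], "sort the files"))
# -> done (plan for: sort the files)
```

Real orchestration frameworks add prompt templating, branching, and error handling on top of this pipeline, but the data flow is the same.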