Transformer Lab

The Operating System for AI Research Labs

Designed for ML Researchers. Local, on-prem, or in the cloud. Open source.


⬇️ Install for Individuals · 🏢 Install for Teams · 📖 Documentation · 🎬 Demo · 💬 Discord


Mozilla Builders

Transformer Lab Demo


✨ What is Transformer Lab?

Transformer Lab is an open-source machine learning platform that unifies the fragmented AI tooling landscape into a single, elegant interface. It is available in two editions:

👀 For Individuals

Perfect for researchers and hobbyists working on a single machine.

  • Local Privacy: No data leaves your machine.
  • Full Toolkit: Train, fine-tune, chat, and evaluate models.
  • Cross-Platform: Runs natively on macOS (Apple Silicon), Linux, and Windows (WSL2).
  • No Cloud Costs: Use your own hardware.

🏢 For Teams

Built for research labs scaling across GPU clusters.

  • Unified Orchestration: Submit jobs to Slurm clusters or SkyPilot clouds (AWS, GCP, Azure) from one UI.
  • Collaborative: Centralized experiment tracking, model registry, and artifact management.
  • Interactive Compute: One-click Jupyter, VSCode, and SSH sessions on remote nodes.
  • Resilience: Auto-recovery from checkpoints and spot instance preemption.

πŸ› οΈ Key Capabilities

🧠 Foundation Models & LLMs
  • Universal Support: Download and run Llama 3, DeepSeek, Mistral, Qwen, Phi, and more.
  • Inference Engines: Support for MLX, vLLM, Ollama, and HuggingFace Transformers.
  • Format Conversion: Seamlessly convert between HuggingFace, GGUF, and MLX formats.
  • Chat Interface: Multi-turn chat, batched querying, and function calling support.
🎓 Training & Fine-tuning
  • Unified Interface: Train on local hardware or submit tasks to remote clusters using the same UI.
  • Methods: Full fine-tuning, LoRA/QLoRA, RLHF (DPO, ORPO, SIMPO), and Reward Modeling.
  • Hardware Agnostic: Optimized trainers for Apple Silicon (MLX), NVIDIA (CUDA), and AMD (ROCm).
  • Hyperparameter Sweeps: Define parameter ranges in YAML and automatically schedule grid searches.
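The sweep mechanics described above can be sketched as a plain grid expansion. This is an illustrative example, not Transformer Lab's actual sweep code: the parameter names (learning_rate, lora_rank) and the expand_grid helper are made up for the sketch, standing in for ranges you would declare in YAML.

```python
import itertools

# Hypothetical parameter ranges, in the spirit of the YAML sweep spec
# described above (names are placeholders, not Transformer Lab's schema).
sweep = {
    "learning_rate": [1e-5, 5e-5, 1e-4],
    "lora_rank": [8, 16],
}

def expand_grid(spec):
    """Expand parameter ranges into one config dict per grid point."""
    keys = sorted(spec)
    return [dict(zip(keys, values))
            for values in itertools.product(*(spec[k] for k in keys))]

configs = expand_grid(sweep)
print(len(configs))  # 3 learning rates x 2 ranks = 6 jobs
```

Each resulting config dict corresponds to one training job the scheduler would queue; the platform's actual scheduler adds bookkeeping (job IDs, checkpoints) on top of this expansion.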
🎨 Diffusion & Image Generation
  • Generation: Text-to-Image, Image-to-Image, and Inpainting using Stable Diffusion and Flux.
  • Advanced Control: Full support for ControlNets and IP-Adapters.
  • Training: Train custom LoRA adaptors on your own image datasets.
  • Dataset Management: Auto-caption images using WD14 taggers.
📊 Evaluation & Analytics
  • LLM-as-a-Judge: Use local or remote models to score outputs on bias, toxicity, and faithfulness.
  • Benchmarks: Built-in support for EleutherAI LM Evaluation Harness (MMLU, HellaSwag, GSM8K, etc.).
  • Red Teaming: Automated vulnerability testing for PII leakage, prompt injection, and safety.
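As a rough illustration of the LLM-as-a-judge pattern mentioned above (not Transformer Lab's internal implementation): the judge model receives a rubric prompt and its free-text verdict is parsed back into a numeric score. The prompt wording and "Score: N" convention here are assumptions made for the sketch.

```python
import re

def build_judge_prompt(question, answer, criterion="faithfulness"):
    """Compose a rubric prompt asking a judge model for a 1-5 score."""
    return (
        f"Rate the following answer for {criterion} on a scale of 1-5.\n"
        f"Question: {question}\n"
        f"Answer: {answer}\n"
        "Reply in the form 'Score: N'."
    )

def parse_score(judge_reply):
    """Extract the numeric score from the judge's reply, or None if absent."""
    match = re.search(r"Score:\s*([1-5])", judge_reply)
    return int(match.group(1)) if match else None

print(parse_score("Score: 4 - mostly faithful"))  # 4
```

The same parse step works whether the judge is a local model or a remote API; only the call that produces `judge_reply` changes.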
🔌 Plugins & Extensibility
  • Plugin System: Extend functionality with a robust Python plugin architecture.
  • Lab SDK: Integrate your existing Python training scripts (import lab) to get automatic logging, progress bars, and artifact tracking.
  • CLI: Power-user command line tool for submitting tasks and monitoring jobs without a browser.
πŸ—£οΈ Audio Generation
  • Text-to-Speech: Generate speech using Kokoro, Bark, and other state-of-the-art models.
  • Training: Fine-tune TTS models on custom voice datasets.

📥 Quick Start

1. Install

curl https://lab.cloud/install.sh | bash

2. Run

cd ~/.transformerlab/src
./run.sh

3. Access

Open your browser to http://localhost:8338.

Requirements

Platform   Requirements
macOS      Apple Silicon (M1/M2/M3/M4)
Linux      NVIDIA or AMD GPU
Windows    NVIDIA GPU via WSL2 (setup guide)

🏢 Enterprise & Cluster Setup

Transformer Lab for Teams runs as an overlay on your existing infrastructure. It does not replace your scheduler; it acts as a modern control plane for it.

To configure Transformer Lab to talk to Slurm or SkyPilot:

  1. Follow the Teams Install Guide.
  2. Configure your compute providers in the Team Settings.
  3. Use the CLI (lab) or Web UI to queue tasks across your cluster.
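For the SkyPilot path, a queued task ultimately resolves to a SkyPilot task spec. A minimal sketch in standard SkyPilot YAML is shown below; the accelerator choice and script name are placeholders, and the exact spec Transformer Lab generates may differ:

```yaml
# Minimal SkyPilot-style task: request one GPU, set up deps, run training.
resources:
  accelerators: A100:1   # placeholder; pick what your cloud quota allows

setup: |
  pip install -r requirements.txt

run: |
  python train.py --epochs 3
```

Slurm submissions follow the same idea with a batch script and `sbatch` directives instead of a task spec; in both cases the control plane handles queueing and log collection for you.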

πŸ‘©β€πŸ’» Development

Frontend
# Requires Node.js v22
npm install
npm start
Backend (API)
cd api
./install.sh   # Sets up Conda env + Python deps
./run.sh       # Start the API server
Lab SDK
pip install transformerlab

🤝 Contributing

We are an open-source initiative backed by builders who care about the future of AI research. We welcome contributions! Please check our issues for open tasks.


📄 License

AGPL-3.0 · See LICENSE for details.


📚 Citation

@software{transformerlab,
  author = {Asaria, Ali and Salomone, Tony},
  title = {Transformer Lab: The Operating System for AI Research},
  year = 2023,
  url = {https://github.com/transformerlab/transformerlab-app}
}

💬 Community

Discord · Twitter · GitHub Issues

Built with ❤️ by Transformer Lab in Canada 🇨🇦
