✨ What is Transformer Lab?
Transformer Lab is an open-source machine learning platform that unifies the fragmented AI tooling landscape into a single, elegant interface. It is available in two editions: a local edition for individual machines, and Transformer Lab for Teams for shared cluster infrastructure.
🛠️ Key Capabilities
🧠 Foundation Models & LLMs
- Universal Support: Download and run Llama 3, DeepSeek, Mistral, Qwen, Phi, and more.
- Inference Engines: Support for MLX, vLLM, Ollama, and HuggingFace Transformers.
- Format Conversion: Seamlessly convert between HuggingFace, GGUF, and MLX formats.
- Chat Interface: Multi-turn chat, batched querying, and function calling support.
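As a sketch of what a multi-turn chat request could look like, assuming the server accepts the common OpenAI-style message format (the actual Transformer Lab endpoint and schema are not confirmed here):

```python
import json

# Hypothetical payload builder using the widespread OpenAI-style message
# format; Transformer Lab's real chat API may differ.
def build_chat_payload(model, history, user_message):
    """Append the new user turn to the conversation and wrap it in a payload."""
    messages = list(history) + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages}

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Name a prime number."},
    {"role": "assistant", "content": "7 is prime."},
]
payload = build_chat_payload("llama-3-8b", history, "And the next one?")
print(json.dumps(payload, indent=2))
```

The same payload shape extends naturally to batched querying: build one payload per prompt and submit them together.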
🚀 Training & Fine-tuning
- Unified Interface: Train on local hardware or submit tasks to remote clusters using the same UI.
- Methods: Full fine-tuning, LoRA/QLoRA, RLHF (DPO, ORPO, SIMPO), and Reward Modeling.
- Hardware Agnostic: Optimized trainers for Apple Silicon (MLX), NVIDIA (CUDA), and AMD (ROCm).
- Hyperparameter Sweeps: Define parameter ranges in YAML and automatically schedule grid searches.
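The grid-search scheduling described above can be sketched in plain Python (the parameter names here are illustrative, not Transformer Lab's actual sweep schema):

```python
import itertools

# Illustrative sweep definition; the real YAML keys may differ.
sweep = {
    "learning_rate": [1e-5, 5e-5, 1e-4],
    "lora_rank": [8, 16],
}

def expand_grid(sweep):
    """Expand parameter ranges into the full list of run configurations."""
    keys = sorted(sweep)
    return [dict(zip(keys, values))
            for values in itertools.product(*(sweep[k] for k in keys))]

runs = expand_grid(sweep)
print(len(runs))  # 3 learning rates x 2 ranks = 6 runs
```

Each resulting dict corresponds to one training job that a scheduler can queue independently.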
🎨 Diffusion & Image Generation
- Generation: Text-to-Image, Image-to-Image, and Inpainting using Stable Diffusion and Flux.
- Advanced Control: Full support for ControlNets and IP-Adapters.
- Training: Train custom LoRA adaptors on your own image datasets.
- Dataset Management: Auto-caption images using WD14 taggers.
📊 Evaluation & Analytics
- LLM-as-a-Judge: Use local or remote models to score outputs on bias, toxicity, and faithfulness.
- Benchmarks: Built-in support for EleutherAI LM Evaluation Harness (MMLU, HellaSwag, GSM8K, etc.).
- Red Teaming: Automated vulnerability testing for PII leakage, prompt injection, and safety.
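A minimal sketch of the LLM-as-a-judge pattern: the judge model is prompted to emit a numeric score, which is then parsed out of its free-text reply. The prompt template and 1-5 scale below are assumptions for illustration, not Transformer Lab's actual rubric:

```python
import re

# Hypothetical judge prompt; the real evaluation templates may differ.
JUDGE_PROMPT = (
    "Rate the following answer for faithfulness on a 1-5 scale. "
    "Reply with 'Score: N' and a short justification.\n\nAnswer: {answer}"
)

def parse_score(judge_reply, low=1, high=5):
    """Extract the first 'Score: N' value from a judge model's reply."""
    match = re.search(r"Score:\s*([0-9]+(?:\.[0-9]+)?)", judge_reply)
    if not match:
        return None
    score = float(match.group(1))
    return score if low <= score <= high else None

print(parse_score("Score: 4. The answer cites the source correctly."))  # 4.0
print(parse_score("I cannot rate this."))  # None
```

Rejecting out-of-range or unparseable replies (returning `None`) keeps malformed judge outputs from silently skewing aggregate scores.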
🔌 Plugins & Extensibility
- Plugin System: Extend functionality with a robust Python plugin architecture.
- Lab SDK: Integrate your existing Python training scripts (`import lab`) to get automatic logging, progress bars, and artifact tracking.
- CLI: Power-user command line tool for submitting tasks and monitoring jobs without a browser.
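The integration pattern the Lab SDK implies — wrap an existing training loop so each step is reported automatically — can be sketched as follows. The `track_progress` decorator here is a hypothetical stand-in, not the actual `lab` API:

```python
# Hypothetical stand-in for the automatic progress reporting the Lab SDK
# provides via `import lab`; the real API may differ.
def track_progress(total_steps, report=print):
    """Decorate a per-step training function so progress is reported each step."""
    def decorate(step_fn):
        def run():
            losses = []
            for step in range(1, total_steps + 1):
                loss = step_fn(step)
                losses.append(loss)
                report(f"step {step}/{total_steps} loss={loss:.3f}")
            return losses
        return run
    return decorate

@track_progress(total_steps=3)
def train_step(step):
    # Existing training logic would go here; a dummy decreasing loss for illustration.
    return 1.0 / step

losses = train_step()
```

The point of the pattern is that the original `train_step` body stays untouched; logging and progress are layered on from outside.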
🗣️ Audio Generation
- Text-to-Speech: Generate speech using Kokoro, Bark, and other state-of-the-art models.
- Training: Fine-tune TTS models on custom voice datasets.
🔥 Quick Start
1. Install

```bash
curl https://lab.cloud/install.sh | bash
```

2. Run

```bash
cd ~/.transformerlab/src
./run.sh
```
3. Access
Open your browser to http://localhost:8338.
Requirements
| Platform | Requirements |
|---|---|
| macOS | Apple Silicon (M1/M2/M3/M4) |
| Linux | NVIDIA or AMD GPU |
| Windows | NVIDIA GPU via WSL2 (setup guide) |
🏢 Enterprise & Cluster Setup
Transformer Lab for Teams runs as an overlay on your existing infrastructure. It does not replace your scheduler; it acts as a modern control plane for it.
To configure Transformer Lab to talk to Slurm or SkyPilot:
- Follow the Teams Install Guide.
- Configure your compute providers in the Team Settings.
- Use the CLI (`lab`) or Web UI to queue tasks across your cluster.
👩‍💻 Development
Frontend

```bash
# Requires Node.js v22
npm install
npm start
```

Backend (API)

```bash
cd api
./install.sh  # Sets up Conda env + Python deps
./run.sh      # Start the API server
```
Lab SDK
```bash
pip install transformerlab
```
🤝 Contributing
We are an open-source initiative backed by builders who care about the future of AI research. We welcome contributions! Please check our issues for open tasks.
📄 License
AGPL-3.0 · See LICENSE for details.
📚 Citation
```bibtex
@software{transformerlab,
  author = {Asaria, Ali and Salomone, Tony},
  title  = {Transformer Lab: The Operating System for AI Research},
  year   = 2023,
  url    = {https://github.com/transformerlab/transformerlab-app}
}
```
💬 Community
Built with ❤️ by Transformer Lab in Canada 🇨🇦
