# MinT: an RL training API for researchers and developers
MinT lets you focus on what matters in LLM post-training — your data, your loss functions, and your RL environments — while we handle the heavy lifting of distributed training.
You write a simple loop that runs on your CPU-only machine, including the data or environment and the loss function. We figure out how to run the exact computation you specified efficiently across many GPUs. To change the model you're working with, you only need to change a single string in your code.
MinT gives you full control over the training loop and all the algorithmic details. It's not a magic black box that makes fine-tuning "easy". It's a clean abstraction that shields you from the complexity of distributed training while preserving your control.
Here's how the division of responsibilities works in practice:
| You focus on | You write | We handle |
|---|---|---|
| **Datasets and RL environments**<br>Your custom training data | **Simple Python script**<br>Runs on your CPU | **Efficient distributed training of large models**<br>Qwen3-235B, Qwen3-30B, and more |
| **Training logic**<br>Your loss functions, training loop, and evals | **API calls**<br>`forward_backward()`, `optim_step()`, `sample()`, `save_state()` | **Reliability**<br>Hardware failures handled transparently |
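To make the division concrete, here is a hypothetical sketch of the loop a user might write. The `StubClient` class and `my_loss` function below are illustrative stand-ins invented for this example (real client construction and method signatures may differ); the shape of the loop — run forward/backward per batch, then step the optimizer — is the part you own.

```python
# Hypothetical sketch of a user-written training loop.
# `StubClient` is a local stand-in for a real MinT training client;
# actual construction and signatures may differ.

class StubClient:
    """Pretends to be a remote training client; just counts optimizer steps."""

    def __init__(self):
        self.steps = 0

    def forward_backward(self, batch, loss_fn):
        # Real service: runs the forward/backward pass on remote GPUs
        # and accumulates gradients server-side.
        return {"loss": loss_fn(batch)}

    def optim_step(self):
        # Real service: applies the accumulated gradients to the parameters.
        self.steps += 1


def my_loss(batch):
    # Placeholder loss: in real code you supply your own loss function.
    return sum(batch) / len(batch)


client = StubClient()  # real code would pick a model here, e.g. a "Qwen3-30B" string
dataset = [[1.0, 2.0], [3.0, 4.0]]

for batch in dataset:
    result = client.forward_backward(batch, my_loss)
    client.optim_step()

print(client.steps)  # one optimizer step per batch -> 2
```

The loop itself runs on a CPU-only machine; only the work inside `forward_backward` and `optim_step` is dispatched to GPUs.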
## Features
- Fine-tune open-weight models from 0.6B to 235B+ parameters
- Support for vision-language models like Qwen3-VL
- LoRA fine-tuning for efficient training of large models
- Distributed data collection and rolling training
- Weight management, model publishing, and download
- Online evaluation on standard CPU clusters
## Core API Functions
- `forward_backward`: Compute and accumulate gradients
- `optim_step`: Update model parameters using accumulated gradients
- `sample`: Generate outputs from trained models
- `save_state`/`load_state`: Persist and restore weight and optimizer state
- `save_weights_and_get_sampling_client`: Publish weights for inference
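The save/load pair can be illustrated with a small stub. The dictionary-based state below is purely illustrative — with the real API, state is persisted by the service rather than held in a local dict — but it shows the intended lifecycle: snapshot both weights and optimizer state, then restore from that snapshot.

```python
# Illustrative stub of the save_state/load_state lifecycle.
# Real MinT state lives on the service side, not in a local dict.

class StubClient:
    def __init__(self):
        self.weights = {"w": 0.0}
        self.optimizer = {"momentum": 0.0}

    def save_state(self):
        # Snapshot weights AND optimizer state so training can resume exactly.
        return {"weights": dict(self.weights),
                "optimizer": dict(self.optimizer)}

    def load_state(self, state):
        # Restore both parts of the snapshot.
        self.weights = dict(state["weights"])
        self.optimizer = dict(state["optimizer"])


client = StubClient()
checkpoint = client.save_state()

client.weights["w"] = 1.0      # training mutates the weights...
client.load_state(checkpoint)  # ...load_state rolls back to the snapshot

print(client.weights["w"])  # -> 0.0
```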
## Tinker Compatible
MinT is API-compatible with Thinking Machines' Tinker. If you have existing Tinker code, you can migrate to MinT with minimal changes.
```python
# Option 1: Use MinT directly
import mint

client = mint.ServiceClient()

# Option 2: Use MinT as a Tinker drop-in
import mint as tinker

client = tinker.ServiceClient()
```

For detailed migration instructions and current compatibility status, see Tinker Compatibility.