# MinT CLI
The Mind Lab Toolkit (`mindlab-toolkit`) is MinT's installable client. There is no separate `mint` shell binary today; instead, the toolkit ships the `mint` Python package, which you import from a script. A standalone CLI binary is on the roadmap; for now this page covers the toolkit-as-CLI workflow.
## Concept
`mindlab-toolkit` packages three things:

- `mint`: the Tinker-API-compatible Python client surface. Top-level `import mint` is a drop-in for Tinker.
- `mint.mint` / `mintx`: MinT-only extension APIs that don't exist in Tinker (e.g., OpenPI VLA helpers).
- A pinned, validated Tinker SDK dependency: currently `tinker==0.15.0`. `import mint` checks the installed version at runtime and fails fast if it's wrong.
You install once and then `import mint` from any Python script.
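The fail-fast version check described above can be sketched as follows. This is an illustrative reimplementation, not the toolkit's actual code; only the required version string (`0.15.0`) comes from this page.

```python
from typing import Optional

REQUIRED_TINKER = "0.15.0"  # pin stated on this page

def validate_pin(installed: Optional[str], required: str = REQUIRED_TINKER) -> Optional[str]:
    """Return an actionable error message if the pin is violated, else None.

    A real import-time check would obtain `installed` via
    importlib.metadata.version("tinker") and raise ImportError on mismatch.
    """
    if installed is None:
        return f"tinker is not installed; run: pip install 'tinker=={required}'"
    if installed != required:
        return (
            f"mint requires tinker=={required} but found {installed}; run: "
            f"python -m pip install --force-reinstall 'tinker=={required}'"
        )
    return None
```

Keeping the comparison in a pure function like this makes the policy easy to unit-test separately from the import machinery.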
## Pattern

### Install

```bash
git clone https://github.com/MindLab-Research/mindlab-toolkit.git
cd mindlab-toolkit
pip install -e .
```

If your environment already has a different Tinker version pinned:

```bash
python -m pip install --force-reinstall 'tinker==0.15.0'
```

### First call
```python
import mint

# Reads MINT_API_KEY (or TINKER_API_KEY) and MINT_BASE_URL from env / .env
service_client = mint.ServiceClient()
caps = service_client.get_server_capabilities()
print(f"Connected. {len(caps.supported_models)} supported models.")
```
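The credential resolution that `mint.ServiceClient()` performs (per the comment above: `MINT_API_KEY` with `TINKER_API_KEY` as fallback) can be mimicked in your own scripts, e.g. to fail early with a clearer message. This helper is illustrative and not part of the toolkit:

```python
import os
from typing import Mapping, Optional, Tuple

def resolve_mint_env(env: Mapping[str, str] = os.environ) -> Tuple[str, Optional[str]]:
    """Return (api_key, base_url); MINT_API_KEY wins over TINKER_API_KEY."""
    api_key = env.get("MINT_API_KEY") or env.get("TINKER_API_KEY")
    if not api_key:
        raise RuntimeError("Set MINT_API_KEY (or TINKER_API_KEY) before importing mint")
    return api_key, env.get("MINT_BASE_URL")  # base URL may be unset; see Caveats
```

Taking the environment as a parameter (defaulting to `os.environ`) keeps the precedence rule testable without mutating process state.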
### Migrate from Tinker

If you already have `import tinker` code, alias the module:

```python
import mint as tinker
```

Then switch your endpoint and credentials to MinT. See Human Quickstart → step 1 / Migrating from Tinker for the full walk-through.
## MinT-only APIs
Use the `mintx` namespace for OpenPI VLA and other MinT extensions:
```python
import mint
import mint.mint as mintx

base_model = mintx.OPENPI_FAST_MODEL
```

## API Surface
The toolkit re-exports everything from upstream Tinker plus MinT additions:
| Namespace | Source | Use for |
|---|---|---|
| `mint` (top level) | Tinker-compatible | `ServiceClient`, `forward_backward`, `optim_step`, `sample`, `save_state` |
| `mint.tinker` | mirror of `mint` top level | When code style requires the explicit `tinker` module name |
| `mint.types` | Tinker-compatible | `Datum`, `AdamParams`, `SamplingParams`, `ModelInput` |
| `mint.mint` / `mintx` | MinT-only | OpenPI VLA helpers (`OPENPI_FAST_MODEL`, embodied training entry points) |
## Caveats & Pitfalls
- **Don't pin `tinker==0.6.3`.** Older Tinker versions are not compatible with the MinT compatibility patches. If `import mint` fails the version check, run `python -m pip install --force-reinstall 'tinker==0.15.0'`.
- **Don't call `zero_grad_async()`.** Gradient zeroing is handled automatically by the MinT server; calling it manually can produce stale-gradient errors.
- **Standalone CLI binary not yet shipped.** Some users expect a `mint train --config ...`-style command. That doesn't exist today. Track the toolkit repo issues for progress.
- **Two endpoints, one credential.** The same `MINT_API_KEY` works against both `mint.macaron.xin` (overseas) and `mint-cn.macaron.xin` (mainland China). Choose by network reachability, not by account.
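Choosing an endpoint by reachability, as the last caveat advises, can be automated with a simple probe. The hostnames come from this page; the helper itself, and the assumption that the endpoints answer on TCP port 443, are illustrative:

```python
import socket
from typing import Callable, Iterable, Optional

ENDPOINTS = ["mint.macaron.xin", "mint-cn.macaron.xin"]  # overseas, mainland China

def first_reachable(hosts: Iterable[str], probe: Callable[[str], bool]) -> Optional[str]:
    """Return the first host the probe can reach, else None."""
    for host in hosts:
        if probe(host):
            return host
    return None

def tcp_probe(host: str, port: int = 443, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (performs real network I/O):
# base_host = first_reachable(ENDPOINTS, tcp_probe)
```

Injecting the probe as a parameter keeps the selection logic testable without touching the network.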
**Source of truth.** The toolkit's installation, environment setup, and version policy live in the public mindlab-toolkit README. When this page and the README disagree, the README wins.