Mind Lab Toolkit (MinT)
Get started

MinT CLI

The Mind Lab Toolkit (mindlab-toolkit) is MinT's installable client. There is no separate mint shell binary today — the toolkit ships the mint Python package that you import from a script. A standalone CLI binary is on the roadmap; for now this page covers the toolkit-as-CLI workflow.

Concept

mindlab-toolkit packages three things:

  1. mint — the Tinker-API-compatible Python client surface. The top-level import mint is a drop-in replacement for import tinker.
  2. mint.mint / mintx — MinT-only extension APIs that don't exist in Tinker (e.g., OpenPI VLA helpers).
  3. A pinned, validated Tinker SDK dependency — currently tinker==0.15.0. import mint checks the installed version at runtime and fails fast if it's wrong.

You install once and use mint from any Python script.
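The fail-fast version check can be sketched roughly as follows. This is an illustration of the behavior, not the toolkit's actual internals; the names pin_matches and check_tinker_pin are hypothetical.

```python
from importlib import metadata

# Illustrative pin matching the toolkit's current tinker==0.15.0 requirement.
REQUIRED = "0.15.0"

def pin_matches(installed: str, required: str = REQUIRED) -> bool:
    """Exact-match check: MinT validates against one pinned Tinker version."""
    return installed == required

def check_tinker_pin(required: str = REQUIRED) -> None:
    """Fail fast at import time if the installed Tinker SDK is wrong."""
    try:
        installed = metadata.version("tinker")
    except metadata.PackageNotFoundError as exc:
        raise ImportError(
            f"tinker not installed; run: pip install 'tinker=={required}'"
        ) from exc
    if not pin_matches(installed, required):
        raise ImportError(
            f"mint requires tinker=={required}, found {installed}; "
            f"run: python -m pip install --force-reinstall 'tinker=={required}'"
        )
```

The check is an exact match rather than a minimum-version bound, which is why both older and newer Tinker releases are rejected.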

Pattern

Install

git clone https://github.com/MindLab-Research/mindlab-toolkit.git
cd mindlab-toolkit
pip install -e .

If your environment already has a different Tinker version pinned:

python -m pip install --force-reinstall 'tinker==0.15.0'

First call

import mint

# Reads MINT_API_KEY (or TINKER_API_KEY) and MINT_BASE_URL from env / .env
service_client = mint.ServiceClient()
caps = service_client.get_server_capabilities()
print(f"Connected. {len(caps.supported_models)} supported models.")
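As the comment above notes, credentials can also come from a .env file in the working directory. A minimal example with placeholder values (substitute your own key; the host shown is the overseas endpoint listed under Caveats & Pitfalls):

```
# .env
MINT_API_KEY=your-api-key-here
MINT_BASE_URL=https://mint.macaron.xin
```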

Migrate from Tinker

If you already have import tinker code:

import mint as tinker

Then switch your endpoint and credentials to MinT. See Human Quickstart → step 1 / Migrating from Tinker for the full walk-through.
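If a script needs to run against either stack, the alias can be wrapped in an import fallback. This is a sketch of the pattern, not a toolkit recommendation:

```python
# Prefer MinT's drop-in surface; fall back to the upstream Tinker SDK.
try:
    import mint as tinker  # MinT-compatible client
except ImportError:
    try:
        import tinker  # plain upstream SDK
    except ImportError:
        tinker = None  # neither installed in this environment

HAVE_CLIENT = tinker is not None
```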

MinT-only APIs

Use the mintx namespace for OpenPI VLA and other MinT extensions:

import mint
import mint.mint as mintx

base_model = mintx.OPENPI_FAST_MODEL

API Surface

The toolkit re-exports everything from upstream Tinker plus MinT additions:

| Namespace | Source | Use for |
| --- | --- | --- |
| mint (top level) | Tinker-compatible | ServiceClient, forward_backward, optim_step, sample, save_state |
| mint.tinker | Mirror of mint top level | When code style requires the explicit tinker module name |
| mint.types | Tinker-compatible | Datum, AdamParams, SamplingParams, ModelInput |
| mint.mint / mintx | MinT-only | OpenPI VLA helpers (OPENPI_FAST_MODEL, embodied training entry points) |

Caveats & Pitfalls

  • Don't pin tinker==0.6.3. Older Tinker versions are not compatible with the MinT compatibility patches. If import mint fails the version check, run python -m pip install --force-reinstall 'tinker==0.15.0'.
  • Don't call zero_grad_async(). Gradient zeroing is handled automatically by the MinT server. Calling it manually can produce stale-gradient errors.
  • Standalone CLI binary not yet shipped. Some users expect a mint train --config ... style command. That doesn't exist today. Track the toolkit repo issues for progress.
  • Two endpoints, one credential. The same MINT_API_KEY works against both mint.macaron.xin (overseas) and mint-cn.macaron.xin (mainland China). Choose by network reachability, not by account.
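The endpoint choice in the last bullet can be captured in a small helper. This is a sketch, not a toolkit API; choose_base_url and the region labels are hypothetical, and the https scheme is assumed:

```python
# Hypothetical helper: both hosts accept the same MINT_API_KEY;
# pick by network reachability, not by account.
ENDPOINTS = {
    "global": "https://mint.macaron.xin",   # overseas
    "cn": "https://mint-cn.macaron.xin",    # mainland China
}

def choose_base_url(region: str = "global") -> str:
    """Return the MINT_BASE_URL for the network you can actually reach."""
    try:
        return ENDPOINTS[region]
    except KeyError:
        raise ValueError(f"unknown region {region!r}; expected one of {sorted(ENDPOINTS)}")
```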

Source of truth. The toolkit's installation, environment setup, and version policy live in the public mindlab-toolkit README. When this page and the README disagree, the README wins.
