# Installation

## Install from Source

Clone the repository and install in development mode:

```shell
git clone https://github.com/MindLab-Research/mindlab-toolkit.git
cd mindlab-toolkit
pip install -e .
```

## What's Included
The MinT Python SDK provides low-level training operations, including `forward_backward`, `sample`, `optim_step`, and `save_state`.
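To show how these low-level operations typically compose, here is a toy training loop against a stand-in client. The client class, its constructor, and its return values are assumptions for illustration; only the four operation names come from the SDK description above.

```python
# A toy stand-in for a MinT training client. The constructor and return
# values are assumptions; only the operation names (forward_backward,
# sample, optim_step, save_state) come from the SDK docs.
class FakeTrainingClient:
    def __init__(self):
        self.calls = []  # record of operations, in order

    def forward_backward(self, batch):
        # Would compute the loss and accumulate gradients server-side.
        self.calls.append("forward_backward")
        return {"loss": 0.0}

    def optim_step(self):
        # Would apply accumulated gradients; per the docs, gradient
        # zeroing is handled automatically by the server.
        self.calls.append("optim_step")

    def sample(self, prompt):
        # Would generate a completion from the current policy.
        self.calls.append("sample")
        return prompt + " ..."

    def save_state(self, path):
        # Would checkpoint training state.
        self.calls.append("save_state")


def train(client, batches, checkpoint_path):
    """Minimal SFT-style loop over the low-level operations."""
    for batch in batches:
        client.forward_backward(batch)
        client.optim_step()
    client.save_state(checkpoint_path)
```

The real client's method signatures may differ; the point is the call shape: one `forward_backward` plus one `optim_step` per batch, with a checkpoint at the end.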
## Authentication

API access requires setting up credentials:

- Obtain an API key from the Mind Lab team
- Set the `MINT_API_KEY` environment variable:

```shell
export MINT_API_KEY=sk-your-key-here
```

To request access, use Schedule a Demo at https://macaron.im/mindlab, or email contact@mindlab.ltd.
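In Python code, it can help to read the variable defensively and fail fast with a clear message. The helper below is not part of the SDK; it is a small sketch of that pattern.

```python
import os


def get_mint_api_key():
    # Hypothetical helper (not part of the SDK): read the key set via
    # `export MINT_API_KEY=...` and fail early with a clear message
    # instead of a confusing authentication error later.
    key = os.environ.get("MINT_API_KEY", "")
    if not key:
        raise RuntimeError("MINT_API_KEY is not set; see the Authentication section")
    if not key.startswith("sk-"):
        raise RuntimeError("MINT_API_KEY should start with 'sk-' for MinT")
    return key
```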
For common first-run questions (SFT vs. RL, domain selection, API key access), see the FAQ.
## Tinker Compatibility

MinT targets API compatibility with Tinker through the mindlab-toolkit compatibility layer. The current toolkit depends on `tinker>=0.15.0` and applies MinT compatibility patches when you `import mint`.
If you already have code written against the Tinker client surface, the lowest-friction MinT migration is:

```python
import mint as tinker
```

Then keep your Tinker-style client code and set these environment variables:

```shell
export TINKER_BASE_URL=<your-region-endpoint>
export TINKER_API_KEY=<your-mint-api-key>
```

Use the MinT endpoint that matches your region:

- Mainland China: `https://mint-cn.macaron.xin/`
- Outside Mainland China: `https://mint.macaron.xin/`
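If you deploy to both regions, a small lookup keeps the endpoint choice in one place. This helper is not part of the SDK; only the two URLs come from the list above.

```python
# Hypothetical helper (not part of the SDK) mapping a region flag to
# the documented MinT endpoints.
MINT_ENDPOINTS = {
    "cn": "https://mint-cn.macaron.xin/",  # Mainland China
    "intl": "https://mint.macaron.xin/",   # outside Mainland China
}


def mint_base_url(region):
    # Resolve the TINKER_BASE_URL value for a deployment region.
    try:
        return MINT_ENDPOINTS[region]
    except KeyError:
        raise ValueError(f"unknown region {region!r}; expected 'cn' or 'intl'")
```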
Why this is the recommended path:

- Raw upstream `import tinker` still validates API keys against the `tml-` prefix
- MinT API keys start with `sk-`
- `import mint as tinker` keeps the familiar Tinker code shape while enabling the MinT compatibility patches

If you must keep the exact `import tinker` statement, import `mint` earlier in the same process, before constructing any Tinker clients. For known gaps and planned updates, see Tinker Compatibility.
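The prefix mismatch can be illustrated with a toy validator. This is not the actual upstream or patched code, just a model of the behavior the bullets above describe.

```python
def upstream_tinker_accepts(key):
    # Toy model of raw upstream Tinker's check: only 'tml-' keys pass,
    # so a MinT 'sk-' key is rejected before any request is made.
    return key.startswith("tml-")


def mint_patched_accepts(key):
    # Toy model of the behavior after `import mint` applies its
    # compatibility patch: MinT 'sk-' keys are accepted as well.
    return key.startswith("sk-") or key.startswith("tml-")
```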
## SDK Version Requirement

The current mindlab-toolkit depends on `tinker>=0.15.0`. Do not pin `tinker==0.6.3` for current MinT usage:

```shell
pip install -e .
```

The MinT compatibility layer patches the newer Tinker key validation so that MinT API keys (`sk-*`) continue to work after `import mint`.
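One reason the old pin is easy to get wrong: compared as strings, "0.6.3" sorts after "0.15.0". The sketch below compares versions numerically; it is an illustration, not how pip or the toolkit actually resolves versions.

```python
def parse_version(v):
    # Compare versions numerically, not lexicographically: as strings,
    # "0.6.3" > "0.15.0", which is the wrong answer.
    return tuple(int(p) for p in v.split(".")[:3])


def meets_floor(installed, floor="0.15.0"):
    # True when the installed version satisfies tinker>=0.15.0.
    return parse_version(installed) >= parse_version(floor)
```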
Note: Do not call `zero_grad_async()` in MinT training loops; gradient zeroing is handled automatically by the server.
## Quickstart
For a hands-on tutorial, see the mint-quickstart repository, which demonstrates SFT and RL training workflows.