Using the API
Saving and Loading
This page covers the low-level MinT checkpoint primitives. For end-to-end operational workflows, see the dedicated Advanced Checkpoint pages.
Save for inference
```python
sampler_path = training_client.save_weights_for_sampler(name="0000").result().path
```

Use this when you want a weights-only checkpoint for sampling or export.
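The save call returns a future; `result()` blocks until the checkpoint is written and yields an object carrying its path. A minimal sketch of that pattern with a stand-in future (the class and the `checkpoints/0000` path are illustrative, not part of the MinT API):

```python
from types import SimpleNamespace

class _SaveFuture:
    """Stand-in for the future returned by save_weights_for_sampler;
    the real call runs server-side and result() blocks until done."""
    def __init__(self, path):
        self._path = path

    def result(self):
        # result() yields an object whose .path is the checkpoint location.
        return SimpleNamespace(path=self._path)

# Real call: training_client.save_weights_for_sampler(name="0000")
future = _SaveFuture("checkpoints/0000")
sampler_path = future.result().path
```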
Save for training resume
```python
resume_path = training_client.save_state(name="0010").result().path
```

Use this when you want to preserve both weights and optimizer state.
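The names "0000" and "0010" above suggest zero-padded step numbers. A hypothetical helper for generating such names (the four-digit width is an assumption, not a MinT requirement):

```python
def checkpoint_name(step: int, width: int = 4) -> str:
    # Zero-padded step numbers ("0000", "0010", ...) sort
    # lexicographically in training order, which makes the latest
    # checkpoint easy to spot in a directory listing.
    return f"{step:0{width}d}"
```

With this, `training_client.save_state(name=checkpoint_name(10))` would save under the name "0010".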
Load for full training resume
For optimizer-preserving resume, create a fresh LoRA training client with the same model/rank/options, then call load_state_with_optimizer(...):
```python
training_client = service_client.create_lora_training_client(
    base_model=model,
    rank=rank,
    train_mlp=True,
    train_attn=True,
    train_unembed=True,
)
training_client.load_state_with_optimizer(resume_path).result()
```

Load weights only
```python
training_client.load_state(resume_path).result()
```

load_state(...) restores weights only. It does not restore optimizer state, so do not use it as the mechanism for a full training resume.