Typed. Async-first. Built for production.
One typed Python client for every product on the platform. Pydantic v2 throughout, idempotency keys handled for you, evidence verification in a single call. A matching CLI for the moments when code is overkill.
- Type coverage: 100%
- Python: 3.11+
- Notebooks: 6
- Wheel: signed
```python
from radmah_sdk import Client

client = Client(api_key="rm_live_…")

# Synthesize from a CSV upload
ds = await client.datasets.upload("./customers.csv")
job = await client.synthesize.run(
    dataset_id=ds.id,
    model="synthesize",
    rows=10_000,
    seed=42,
)
await job.wait()

bundle = await client.evidence.fetch(job.id)
assert bundle.verify()  # BLAKE3, offline
print(job.metrics.qa_score)
```

```
▶ uploading customers.csv ………… 4.7 MB
  dataset ds_4a7c81 · 20 cols · 482 931 rows
▶ POST /v1/synthesize/jobs 202
  job_9b3df1 · estimate 38 s · 14 credits
▶ training the engine on 17 numeric / 3 cat …
  epoch 12 / 20  loss 0.083
  epoch 20 / 20  loss 0.041
▶ generating 10 000 rows ………… 1.6 s
✓ K-S gate 0.018 · χ² gate ok · constraints ok
▶ fetching evidence bundle …
✓ chain verified  root a4f2…d801
>>> 0.9569
```
Six things every SDK call guarantees.
The same guarantees whether you’re uploading a CSV or driving the autonomous agent. No surprises, no hidden retries, no untyped JSON blobs.
100 % typed
Pydantic v2 models for every request and response. mypy-strict on the client. Your IDE knows every field.
Async-first
Every method is awaitable. Sync wrappers exist where you need them, but async is the primary surface.
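The async surface is plain asyncio; a sync call site is just a coroutine driven to completion on its own event loop. The names below are illustrative stand-ins, not SDK API, but they show what a sync wrapper amounts to:

```python
import asyncio

async def fetch_score(job_id: str) -> float:
    # Stand-in for an awaitable SDK call; returns a fake QA score.
    await asyncio.sleep(0)
    return 0.9569

def fetch_score_sync(job_id: str) -> float:
    # What a sync wrapper boils down to: run the coroutine
    # to completion on a private event loop.
    return asyncio.run(fetch_score(job_id))

print(fetch_score_sync("job_9b3df1"))  # 0.9569
```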
Idempotency built in
Idempotent retries on POSTs out of the box; the SDK manages your Idempotency-Key for you.
Streaming evidence
First-class helpers to subscribe to evidence-chain events and verify BLAKE3 chains offline.
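The offline check is a plain hash-chain walk: each event's hash covers the previous hash plus its payload, and the final value must equal the bundle's root. A sketch of the idea, with `hashlib.blake2b` standing in for BLAKE3 (which requires the third-party `blake3` package); the event payloads are made up:

```python
import hashlib

def chain_root(events: list[bytes]) -> str:
    # Walk the chain: h_i = H(h_{i-1} || payload_i), starting from zeros.
    h = b"\x00" * 32
    for payload in events:
        h = hashlib.blake2b(h + payload, digest_size=32).digest()
    return h.hex()

events = [b"upload ds_4a7c81", b"job_9b3df1 queued", b"10000 rows generated"]
root = chain_root(events)

# Verification is just recomputing and comparing the root.
assert chain_root(events) == root
# Tamper with any event and the root changes.
assert chain_root([b"upload ds_TAMPER"] + events[1:]) != root
```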
Contract-test runner
Drop the contract runner into your CI to lock the SDK↔API contract, so drift is caught upstream, not in prod.
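A contract test in miniature: assert that a live (or recorded) response still matches the shape the client was built against. This stdlib-only sketch is a hypothetical illustration of the idea, not the actual runner:

```python
def check_contract(response: dict, schema: dict[str, type]) -> list[str]:
    # Report every field that is missing or whose type has drifted.
    problems = []
    for field, typ in schema.items():
        if field not in response:
            problems.append(f"missing: {field}")
        elif not isinstance(response[field], typ):
            problems.append(f"type drift: {field}")
    return problems

schema = {"job_id": str, "status": str, "estimate": dict}
ok = {"job_id": "job_9b3df1", "status": "queued", "estimate": {"credits": 14}}
drifted = {"job_id": "job_9b3df1", "status": 202, "estimate": {}}

assert check_contract(ok, schema) == []
assert check_contract(drifted, schema) == ["type drift: status"]
```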
Matching CLI
Every SDK call has a `rady` CLI sibling for shell pipelines, one-off ops, and the chatty interactive REPL. See /developer/cli.
From pip install to a signed bundle in 30 seconds.
pip install radmah-sdk radmah-cli
rady auth login # OS-keyring credential, one-time
# 30-second mock dataset via the SDK
python3 -c "
from radmah_sdk import RadMahClient
c = RadMahClient(api_key='rm_live_...')
job = c.submit_job_with_budget(
    kind='mock', engine='mock_fast',
    prompt='200 SaaS accounts MRR 50-5000',
    rows=200, seed=42, max_credits=1,
)
print(job.wait().to_dataframe().head())
"
# Or one-shot via the CLI
rady mock -p "200 SaaS accounts MRR 50-5000" -n 200 -o bundle.csv
# Verify any evidence bundle offline
rady verify bundle.tar.zst

X-API-Key: slt_live_AbCdEfGh…
X-Idempotency-Key: 8e0c…b21
Content-Type: application/json
{
  "dataset_id": "ds_4a7c81",
  "model": "synthesize",
  "rows": 10000,
  "seed": 42
}

X-Job-Id: job_9b3df1
X-Rate-Limit-Remaining: 47
Retry-After: —
{
  "job_id": "job_9b3df1",
  "status": "queued",
  "estimate": { "credits": 14, "seconds": 38 },
  "evidence": { "expected_root_pending": true },
  "_links": {
    "self": "/v1/synthesize/jobs/job_9b3df1",
    "stream": "/v1/synthesize/jobs/job_9b3df1/events",
    "cancel": "/v1/synthesize/jobs/job_9b3df1:cancel"
  }
}

Six reference notebooks. One per product surface.
01-mock.ipynb
Generate a sealed mock dataset from a prompt.
02-synthesize.ipynb
Train the synthesizer on a CSV upload, then fetch the evidence bundle.
03-agent.ipynb
Drive the autonomous data scientist end-to-end.
04-scada.ipynb
Spin up a Virtual SCADA run, capture telemetry.
05-evidence.ipynb
Verify any bundle, walk the BLAKE3 chain offline.
06-webhooks.ipynb
Receive signed events, validate signatures.
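Signed-event validation is standard HMAC verification over the raw request body. The header value, secret format, and SHA-256 scheme below are illustrative assumptions, not the documented wire format:

```python
import hmac
import hashlib

def valid_signature(secret: bytes, body: bytes, header_sig: str) -> bool:
    # Recompute the HMAC over the raw body and compare in constant time.
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header_sig)

secret = b"whsec_example"
body = b'{"event":"job.completed","job_id":"job_9b3df1"}'
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()

assert valid_signature(secret, body, sig)
assert not valid_signature(secret, body + b" ", sig)  # tampered body fails
```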
Language support, version policy.
| Surface | Versions | Status |
|---|---|---|
| Python | 3.11 – 3.13 | stable |
| Async | asyncio, anyio | stable |
| Pydantic | v2.x | required |
| HTTP | httpx 0.27+ | vendored |
| TypeScript SDK | Q3 | in design |
| Go SDK | Q4 | planned |
pip install radmah-sdk
One install. Every product surface. Bring an OpenAPI test suite if you have one, drop the contract runner into your CI, and you can ship in a day.