14 encrypted source adapters

Bring your data from where it already lives.

Warehouses, relational databases, object storage, and the HuggingFace hub — browse, sample, and import in four steps, with credentials Fernet-vaulted by construction. The same four calls work for every adapter, so the engineer who wires up Snowflake on Monday can wire up Azure Blob on Wednesday with nothing new to learn.

Adapters: 14
Families: 4
Vault: Fernet
Plain secrets: 0
api.radmah.ai/v1/synthesize/jobs · OpenAPI 3.1
POST /v1/synthesize/jobs
X-API-Key:          slt_live_AbCdEfGh…
X-Idempotency-Key:  8e0c…b21
Content-Type:       application/json

{
  "dataset_id": "ds_4a7c81",
  "model":      "synthesize",
  "rows":       10000,
  "seed":       42
}
202 Accepted
X-Job-Id:                job_9b3df1
X-Rate-Limit-Remaining:  47
Retry-After:             —

{
  "job_id":     "job_9b3df1",
  "status":     "queued",
  "estimate":   { "credits": 14, "seconds": 38 },
  "evidence":   { "expected_root_pending": true },
  "_links":     {
    "self":     "/v1/synthesize/jobs/job_9b3df1",
    "stream":   "/v1/synthesize/jobs/job_9b3df1/events",
    "cancel":   "/v1/synthesize/jobs/job_9b3df1:cancel"
  }
}
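
A client consuming this endpoint might build the headers and read the 202 body like so. A minimal sketch: the header names and JSON shape come from the example above, while `make_headers` and `parse_job` are hypothetical helper names, not part of the API.

```python
import json
import uuid

def make_headers(api_key: str) -> dict:
    """Build the request headers shown above; a fresh UUID keeps retries idempotent."""
    return {
        "X-API-Key": api_key,
        "X-Idempotency-Key": uuid.uuid4().hex,
        "Content-Type": "application/json",
    }

def parse_job(body: str) -> tuple[str, str]:
    """Pull the job id and the self-link to poll from a 202 response body."""
    doc = json.loads(body)
    return doc["job_id"], doc["_links"]["self"]

# The 202 body from the example above, trimmed to the fields we use:
body = """{
  "job_id": "job_9b3df1",
  "status": "queued",
  "_links": {"self": "/v1/synthesize/jobs/job_9b3df1"}
}"""
job_id, poll_url = parse_job(body)
```

Polling `poll_url` (or streaming the `events` link) is then ordinary HTTP; the idempotency key means a retried POST never queues a second job.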

Fourteen adapters across four families.

Same browse / test / import surface across every adapter — no per-source quirks to learn.

Warehouse

Snowflake
BigQuery
Databricks
Redshift
Hive

Relational

Postgres
MySQL
MariaDB
MSSQL
Oracle

Object storage

Amazon S3
Google GCS
Azure Blob

Dataset hub

HuggingFace
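
The headline numbers fall straight out of the list, which a client can also use for local validation. The grouping mirrors the families above; the dict keys are descriptive labels, not confirmed API values.

```python
# Fourteen adapters across four families, as listed above.
ADAPTERS = {
    "warehouse": ["Snowflake", "BigQuery", "Databricks", "Redshift", "Hive"],
    "relational": ["Postgres", "MySQL", "MariaDB", "MSSQL", "Oracle"],
    "object storage": ["Amazon S3", "Google GCS", "Azure Blob"],
    "dataset hub": ["HuggingFace"],
}

assert len(ADAPTERS) == 4                                  # four families
assert sum(len(v) for v in ADAPTERS.values()) == 14        # fourteen adapters
```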

Four steps from creds to dataset.

Step 1

Create connector

Enter creds in the UI or POST the config — inline passwords are auto-hoisted to the encrypted vault.

Step 2

Test connection

We open a short-lived test session against the source and surface the exact error if it fails.

Step 3

Browse

Per-connector dispatcher returns up to 500 tables / objects, paged and filtered.

Step 4

Import to dataset

Run an import job — lands in your tenant artefact prefix, ready for Synthesize / ADS / Mock.

Same four calls — every adapter.

# 1. Create a Postgres connector — passwords auto-hoist
POST /v1/client/connectors
{
  "name": "warehouse-prod",
  "type": "postgres",
  "config": {
    "host": "warehouse.acme.internal",
    "database": "analytics",
    "user": "radmah_ro",
    "password": "p@ssw0rd-will-be-vaulted"
  }
}
→ 201 Created   secret_ref=cs_…   no plain secret in run-state

# 2. Test it
POST /v1/client/connectors/{id}/test     → 200 OK

# 3. Browse tables
POST /v1/client/connectors/{id}/browse
{ "schema": "public", "limit": 500 }
→ [
   {"name": "customers", "rows_estimate": 482931},
   {"name": "orders",     "rows_estimate": 1923810},
   …
]

# 4. Import as a dataset
POST /v1/client/connectors/{id}/import
{ "source": "public.customers", "row_limit": 200000,
  "target_dataset_name": "customers-snapshot-2026-04" }
→ 202 Accepted    job_id=imp_…
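
Because the four calls are identical for every adapter, a client can treat them as a fixed plan. A sketch: the paths follow the transcript above, and `plan_calls` is a hypothetical helper, not a published SDK function.

```python
def plan_calls(connector_id: str) -> list[tuple[str, str]]:
    """The same four (method, path) pairs, whatever the adapter type."""
    base = f"/v1/client/connectors/{connector_id}"
    return [
        ("POST", "/v1/client/connectors"),   # 1. create (no id yet)
        ("POST", f"{base}/test"),            # 2. test
        ("POST", f"{base}/browse"),          # 3. browse
        ("POST", f"{base}/import"),          # 4. import
    ]
```

Swapping `"type": "postgres"` for any other adapter changes the config payload, never the call sequence.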

The Fernet vault, in plain English.

Encrypting connector secrets is not optional. They are encrypted by construction — the auto-hoist path enforces it even if the caller forgets.

Fernet at rest

Per-tenant key, AES-128-CBC + HMAC-SHA256. No password is ever stored in the connector row.

Auto-hoist on POST

If you accidentally POST a plain password, it is moved to the vault before the row is committed.

Scoped use

The vault entry is read only by the connector worker, only when running an explicit test/browse/import.

Audit log

Every secret resolve recorded with caller, connector ID, and reason — never the value.
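
The auto-hoist behaviour can be pictured as follows. This is a toy in-memory stand-in, not the real vault: the actual service encrypts with a per-tenant Fernet key, whereas this sketch just swaps the password for an opaque `cs_`-prefixed reference before the row exists.

```python
import secrets

VAULT: dict[str, str] = {}  # stand-in for the per-tenant Fernet vault

def hoist_secrets(config: dict) -> dict:
    """Replace any plaintext password with a vault reference before commit."""
    row = dict(config)
    if "password" in row:
        ref = "cs_" + secrets.token_hex(8)
        VAULT[ref] = row.pop("password")   # real service: Fernet-encrypt here
        row["secret_ref"] = ref
    return row

row = hoist_secrets({"host": "db.internal", "password": "p@ssw0rd"})
# The committed row carries only the reference, never the plaintext.
```

The point of the shape: by the time anything persists, the plaintext exists only behind the reference, which matches the "no plain secret in run-state" line in the transcript above.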

Other protections built in.

TLS required by default

PG/MySQL/MSSQL/Oracle drivers refuse plain-text connections.

Read-only test mode

Test session uses a single SELECT 1 — never writes to the source.

Tenant artefact prefix

Imported rows land under the calling tenant's artefact path. No cross-tenant write possible.

Sample-first imports

Sample mode (1k rows) lets you eyeball the schema before kicking off a 200k row import.
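
Sample-first importing amounts to capping the row limit on the first pass. A sketch: `import_request` is a hypothetical helper, and the 1k-row cap comes from the text above.

```python
def import_request(source: str, row_limit: int, sample: bool = True) -> dict:
    """Build an import payload; sample mode caps rows at 1k for a schema check."""
    return {
        "source": source,
        "row_limit": min(row_limit, 1000) if sample else row_limit,
    }

preview = import_request("public.customers", 200_000)           # eyeball schema first
full = import_request("public.customers", 200_000, sample=False)  # then commit
```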

Send us a hostname. We’ll prove it round-trips.

Adapter not on the list? We add new connectors on a 2-week SLA when there’s a named customer behind the request.