Batch Inference

POST /api/models/{model_id}/infer-batch
Example request:

curl --request POST \
  --url https://api.example.com/api/models/{model_id}/infer-batch \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "model_version_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "dataset_version_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "dataset_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "output_type": "parquet"
}
'

Example response:

{
  "job_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "status": "<string>",
  "model_version_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a"
}
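The request above can be assembled programmatically before sending. A minimal sketch, using only the URL, headers, and body fields shown in the curl example (the helper name and the base-URL constant are assumptions, not part of the API); the returned tuple is what you would pass to an HTTP client such as `requests.post(url, headers=headers, data=body)`:

```python
import json

API_BASE = "https://api.example.com"  # assumed base URL, taken from the curl example

def build_batch_inference_request(model_id, token,
                                  model_version_id=None,
                                  dataset_version_id=None,
                                  dataset_id=None,
                                  output_type="parquet"):
    """Assemble URL, headers, and JSON body for POST /api/models/{model_id}/infer-batch.

    All body fields are nullable per the schema, so None values are sent as JSON null.
    """
    if output_type not in ("json", "csv", "parquet"):
        raise ValueError(f"unsupported output_type: {output_type}")
    url = f"{API_BASE}/api/models/{model_id}/infer-batch"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {
        "model_version_id": model_version_id,
        "dataset_version_id": dataset_version_id,
        "dataset_id": dataset_id,
        "output_type": output_type,
    }
    return url, headers, json.dumps(body)
```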

Authorizations

Authorization (string, header, required)
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.

Headers

X-API-Key (string | null, optional)

Path Parameters

model_id
string<uuid>
required

Body (application/json)

Request body for asynchronous batch inference.

model_version_id (string<uuid> | null)
dataset_version_id (string<uuid> | null)
dataset_id (string<uuid> | null)
output_type (enum<string>, default: parquet)
Supported inference output formats. Available options: json, csv, parquet.

Response

Successful Response: batch inference submission response.

job_id (string<uuid>, required)
status (string, required)
model_version_id (string<uuid>, required)
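The response fields above map onto a small container for downstream code. A sketch (the class name is an assumption; status values beyond the placeholder in the example are not documented here, so status stays a plain string):

```python
from dataclasses import dataclass

@dataclass
class BatchInferenceJob:
    """Submission response: all three fields are required per the schema above."""
    job_id: str
    status: str
    model_version_id: str

    @classmethod
    def from_json(cls, payload: dict) -> "BatchInferenceJob":
        # Indexing (rather than .get) fails fast with KeyError
        # if a required field is missing from the response.
        return cls(
            job_id=payload["job_id"],
            status=payload["status"],
            model_version_id=payload["model_version_id"],
        )
```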