Supply Chain & Retail Solutions API guide

Last updated May 8, 2026

API Guide

Definitions

These descriptions are designed to support correct usage of the API. For further assistance or questions, please submit a support ticket.

Name | Description | Required
solutionName | A unique identifier for the rollout or solution. Example: B2C_OOTB. | Yes
prefix | A unique prefix applied to all generated object names to support the naming convention. | Yes
suffix | A unique suffix appended to all generated object names for differentiation. | Yes
appName | The application for which the objects are being deployed. | Yes
appVersion | The semantic version of the application's standard schema being saved, rolled out, or upgraded. | Yes
objectName | The base object name. Exact object-name matching is currently enforced during schema validation. | Yes
targetSchemaName | The schema within the data warehouse where the tables and data will be stored. Example: STAGE. | Yes
operationType | The type of ingest operation the API should perform. Supported values: UPSERT (insert or update based on primary key) or APPEND (insert only; existing records are not modified). | Yes
dryRun | When true, the request is fully validated and the response is returned, but no rows are written to the warehouse. Use it to test a payload without persisting anything. Default: false. | No
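
To show how these fields fit together, here is a minimal request sketch in Python. The endpoint path (/api/v2/ingest) and the payload envelope, including the "rows" key, are illustrative assumptions rather than the documented contract; confirm both against the Swagger reference.

import requests

BASE_URL = "https://ingestion.peak.ai"

# Hypothetical request envelope: the field names come from the Definitions
# table above, but the endpoint path and payload nesting (including the
# "rows" key) are assumptions -- verify them against the Swagger reference.
payload = {
    "solutionName": "B2C_OOTB",       # unique rollout/solution identifier
    "prefix": "b2c_",                 # prefix applied to generated object names
    "suffix": "_v1",                  # suffix appended for differentiation
    "appName": "retail-app",          # application being deployed (example value)
    "appVersion": "1.0.0",            # semantic version of the app schema
    "objectName": "product",          # base object name (exact match enforced)
    "targetSchemaName": "STAGE",      # warehouse schema for tables and data
    "operationType": "UPSERT",        # UPSERT or APPEND
    "dryRun": False,                  # set True to validate without writing
    "rows": [
        {"product_id": "SKU-001", "name": "Widget", "price": 9.99},
    ],
}

response = requests.post(f"{BASE_URL}/api/v2/ingest", json=payload, timeout=30)
print(response.status_code, response.json())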

API host and reference

All v2 endpoints share a single host. Use this base URL for every request in this guide:

  • Production: https://ingestion.peak.ai

An interactive OpenAPI / Swagger reference is served at the same host.

The Swagger page lists every endpoint, request schema, and response schema and lets you try requests inline. Use it as the source of truth when generating client code or validating a payload structure. For non-production environments (beta or sandbox), submit a support ticket to request the corresponding URLs.

Operation types

The API supports two operation types for data ingestion:

UPSERT

  • Behavior: Insert new records or update existing records based on primary key match
  • Use case: Maintaining up-to-date records where data may change over time
  • Example: Updating product information, customer details, or pricing data

APPEND

  • Behavior: Insert new records only, does not update existing records
  • Use case: Appending new data without modifying historical records
  • Example: Transaction logs, event data, or time-series data where records should not be modified

The operationType you submit is recorded as "upsert" or "append" on each failed row in the <table_name>_failed_rows companion table — see Audit columns added at rollout for the full list of columns the API populates automatically.
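
As a local illustration (not API code), the difference between the two modes looks like this: UPSERT maintains one current record per primary key, while APPEND only ever adds rows.

# Local illustration only (not API code). UPSERT keeps one current record
# per primary key; APPEND accumulates rows without touching history.
products = {}  # UPSERT target: keyed by primary key
events = []    # APPEND target: ever-growing log

def upsert(row, pk="product_id"):
    products[row[pk]] = row  # insert new, or overwrite the existing record

def append(row):
    events.append(row)       # insert only; earlier rows are never modified

upsert({"product_id": "SKU-001", "price": 9.99})
upsert({"product_id": "SKU-001", "price": 11.49})  # same PK: record updated
append({"event": "page_view", "ts": "2026-05-08T10:00:00Z"})
append({"event": "page_view", "ts": "2026-05-08T10:05:00Z"})
print(len(products), len(events))  # 1 product record, 2 immutable events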

Validation behavior

Every row is validated inline before the API responds, regardless of which mode the tenant is in. Failed rows are returned in the response's failed array. The mode determines what happens to the successful rows after validation.

Asynchronous (default)

Successful rows are accepted and written to the warehouse asynchronously — typically within a few seconds. A 200 OK response means the rows passed validation and were accepted for ingestion, not that they're queryable yet.

Failed rows are returned in the response and also recorded to the matching <table_name>_failed_rows tables that were created during rollout, where they are visible in the Data Quality Dashboard for triage.

This is the recommended mode for everyday ingestion.
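
Because validation always runs inline, the dryRun flag from the Definitions table is a convenient way to exercise validation without persisting anything. A minimal sketch, again assuming the illustrative /api/v2/ingest path and envelope:

import requests

# dryRun=true returns the same validation outcome (including the failed
# array) but persists nothing. Path and envelope are illustrative assumptions.
payload = {
    "solutionName": "B2C_OOTB",
    "targetSchemaName": "STAGE",
    "operationType": "UPSERT",
    "dryRun": True,
    "rows": [{"product_id": "SKU-001", "price": "not-a-number"}],
}
resp = requests.post("https://ingestion.peak.ai/api/v2/ingest", json=payload, timeout=30)
print(resp.status_code)
print(resp.json().get("failed", []))  # structured errors for invalid rows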

Synchronous (on request)

A synchronous fallback is available for use cases where you need successful rows to be queryable in the warehouse the moment a request returns 200 OK — for example, when bulk-loading a large historical dataset and you want each batch confirmed before sending the next, or when an interactive caller needs to act on landed data right away.

In sync mode, successful rows are written to the warehouse before the response returns; failed rows are returned in the response only (they are not recorded to the <table_name>_failed_rows tables).

To enable sync mode for your tenant, submit a support ticket.

Scheduled ingestion

When scheduled ingestion is enabled for a table, individual ingest requests do not write to the warehouse immediately — they are batched and flushed at the configured cron time. See Scheduled ingestion for details on enablement, cron format, and the observable behavior between scheduled runs.

API limits

The API supports ingestion of up to 500 rows per request. When ingesting larger datasets, split your data into appropriately sized batches. The rate limit is 50 requests per second.

For deletion, the same 500-row maximum applies — each delete request can target between 1 and 500 primary key value sets.
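
Given the 500-row cap, clients typically chunk their dataset before sending. A minimal batching sketch in Python (the endpoint path and envelope shape are illustrative assumptions):

import requests

MAX_ROWS = 500  # documented per-request limit

def chunked(rows, size=MAX_ROWS):
    """Yield successive batches of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def ingest_all(rows, base_payload, url="https://ingestion.peak.ai/api/v2/ingest"):
    """Send rows in batches; sequential sends stay well under 50 req/s."""
    for batch in chunked(rows):
        resp = requests.post(url, json={**base_payload, "rows": batch}, timeout=30)
        resp.raise_for_status()  # raises on 400; 200 and 207 pass through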

Response status codes

The ingest endpoint uses three status codes to communicate the outcome of a batch:

Status | Meaning
200 OK | Every row in the request passed validation and was accepted.
207 Multi-Status | Some rows passed and some failed. Successful rows are accepted; failed rows are returned in the response's failed array with structured error details.
400 Bad Request | Every row in the request failed validation, or the request payload itself is malformed.

Other endpoints return standard HTTP semantics: 201 Created for successful schema saves and rollouts, 404 Not Found when a referenced solution or schema does not exist, 409 Conflict when saving a duplicate appName + appVersion, and 500 Internal Server Error for unexpected failures.
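
A caller can branch on these three codes directly. A sketch of the pattern, where the failed field name follows the failed array described above and the rest of the payload is abbreviated:

import requests

payload = {  # abbreviated; see the field definitions above
    "solutionName": "B2C_OOTB",
    "operationType": "UPSERT",
    "rows": [{"product_id": "SKU-001", "price": 9.99}],
}
resp = requests.post("https://ingestion.peak.ai/api/v2/ingest", json=payload, timeout=30)

if resp.status_code == 200:
    print("all rows accepted")
elif resp.status_code == 207:
    for failure in resp.json().get("failed", []):  # partial success: triage
        print(failure)  # expected to carry an error code, category, and message
elif resp.status_code == 400:
    print("all rows failed validation, or the payload is malformed:", resp.text)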

Error codes

Each validation failure returns a structured error response containing an error code, category, and message. Codes follow the pattern DI_E_XXXXX (errors) or DI_W_XXXXX (warnings, non-fatal).
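
For illustration only, a single entry in the failed array might look like the following; the exact field names are an assumption, so confirm the real response schema in the Swagger reference.

# Illustrative only -- the real response schema lives in the Swagger
# reference. The three documented fields are code, category, and message.
example_failure = {
    "errorCode": "DI_E_22023",                       # assumed field name
    "category": "BUSINESS_VALIDATION",
    "message": "Value is not one of the allowed enum values",
}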

Resolution guidance by category

Category | What It Means | Consumer Action
BUSINESS_VALIDATION | Data value violates a schema-defined business rule | Fix the data value (correct format, valid enum, within range, etc.)
SYSTEM_DATA | Data integrity issue (missing PK, duplicate, null PK, type mismatch) | Fix the data (provide PK, remove duplicates, fill required keys, send the right JSON type)
SYSTEM_SCHEMA | Data structure does not match the schema definition | Fix the schema/data alignment (column not in schema, PK column missing, precision/scale mismatch)

Business-level validation errors — Category: BUSINESS_VALIDATION

These come from schema-defined validation rules configured per attribute (required, range, enum, length, date/timestamp format).

# | Error Code | When Triggered | Standard Message
1 | DI_E_23N01 | required validator: field absent from payload | Required field is absent
2 | DI_E_23502 | nonNull validator: value is null | Value cannot be null
3 | DI_E_23E01 | nonNull validator: string is blank/empty | String value cannot be empty
4 | DI_E_22026 | minLength validator: string too short | String length is below minimum
5 | DI_E_22001 | maxLength validator: string too long | String length exceeds maximum
6 | DI_E_22003 | range validator: value above max or below min | Numeric value out of allowed range
7 | DI_E_22P02 | range validator: value not numeric | Value is not a valid number
8 | DI_E_22023 | enum validator: value not in allowed set | Value is not one of the allowed enum values
9 | DI_E_22007 | dateTimeFormat validator: unparseable date | Invalid date format
10 | DI_E_22008 | timestampFormat validator: unparseable timestamp or negative epoch | Invalid timestamp format or epoch

System-level data validation errors — Category: SYSTEM_DATA

These come from data integrity checks (primary key null, duplicate, type mismatch) and unique key checks.

# | Error Code | When Triggered | Standard Message
11 | DI_E_23P01 | PK attribute value is null/empty in a row | Primary key value cannot be null/empty
12 | DI_E_23505 | 2+ rows in same batch share a PK value | Duplicate primary key
13 | DI_E_23U01 | 2+ rows in same batch share a unique-key value | Duplicate unique key
14 | DI_E_22I01 | Value not parseable as integer | Value is not a valid integer
15 | DI_E_22N01 | Value not parseable as number/float | Value is not a valid number
16 | DI_E_22B01 | Value not parseable as boolean | Value is not a valid boolean
17 | DI_E_22S02 | Value not a valid string | Value is not a valid string
18 | DI_E_22T01 | Timestamp field is empty string | Timestamp value is empty
19 | DI_E_22P03 | Empty/null value during precision/scale check | Precision/scale field is empty or null
20 | DI_E_22P04 | Precision/scale validation error | Precision/scale validation error

System-level schema validation errors — Category: SYSTEM_SCHEMA

These come from data type mismatches and schema structural checks.

# | Error Code | When Triggered | Standard Message
21 | DI_E_42703 | Attribute in data row not found in schema | Column not found in schema
22 | DI_E_23P02 | PK attribute entirely absent from a row | Primary key column is missing
23 | DI_E_22T02 | Timestamp field is wrong JSON type | Timestamp value has wrong type
24 | DI_E_22P01 | Total digits exceed attribute precision | Numeric value exceeds allowed precision
25 | DI_E_22S01 | Decimal digits exceed attribute scale | Numeric value exceeds allowed scale

Quick reference — all codes sorted

Code | Category | Short Description
DI_E_22001 | BUSINESS_VALIDATION | String too long (max length)
DI_E_22003 | BUSINESS_VALIDATION | Numeric out of range (min/max)
DI_E_22007 | BUSINESS_VALIDATION | Invalid date format
DI_E_22008 | BUSINESS_VALIDATION | Invalid timestamp format / invalid epoch
DI_E_22023 | BUSINESS_VALIDATION | Invalid enum value
DI_E_22026 | BUSINESS_VALIDATION | String too short (min length)
DI_E_22B01 | SYSTEM_DATA | Invalid boolean type
DI_E_22I01 | SYSTEM_DATA | Invalid integer type
DI_E_22N01 | SYSTEM_DATA | Invalid number type
DI_E_22P01 | SYSTEM_SCHEMA | Numeric precision exceeded
DI_E_22P02 | BUSINESS_VALIDATION | Value is not a valid number
DI_E_22P03 | SYSTEM_DATA | Empty value for precision/scale check
DI_E_22P04 | SYSTEM_DATA | Precision/scale validation error
DI_E_22S01 | SYSTEM_SCHEMA | Numeric scale exceeded
DI_E_22S02 | SYSTEM_DATA | Invalid string type
DI_E_22T01 | SYSTEM_DATA | Timestamp empty string
DI_E_22T02 | SYSTEM_SCHEMA | Timestamp wrong JSON type
DI_E_23502 | BUSINESS_VALIDATION | Not-null violation
DI_E_23505 | SYSTEM_DATA | Duplicate primary key in batch
DI_E_23E01 | BUSINESS_VALIDATION | Not-empty violation
DI_E_23N01 | BUSINESS_VALIDATION | Required field missing
DI_E_23P01 | SYSTEM_DATA | Primary key value is null/empty
DI_E_23P02 | SYSTEM_SCHEMA | Primary key column missing from row
DI_E_23U01 | SYSTEM_DATA | Duplicate unique key in batch
DI_E_42703 | SYSTEM_SCHEMA | Column not found in schema

Categorization logic

Use the error code prefix to quickly identify the class of error:

DI_E_22XXX  →  Data exception       (type/format/range/precision)
DI_E_23XXX  →  Integrity constraint (null/unique/pk/required)
DI_E_42XXX  →  Schema mismatch      (undefined columns)
DI_W_XXXXX  →  Warning              (non-fatal, future use)
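
These prefix rules translate directly into a small helper, sketched here in Python:

def classify(code: str) -> str:
    """Map a DI_* code to its class using the documented prefix rules."""
    if code.startswith("DI_W_"):
        return "warning (non-fatal)"
    if code.startswith("DI_E_22"):
        return "data exception (type/format/range/precision)"
    if code.startswith("DI_E_23"):
        return "integrity constraint (null/unique/pk/required)"
    if code.startswith("DI_E_42"):
        return "schema mismatch (undefined columns)"
    return "unknown"

assert classify("DI_E_23505") == "integrity constraint (null/unique/pk/required)"
assert classify("DI_E_42703") == "schema mismatch (undefined columns)"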
