
Supply Chain & Retail Solutions API guide

Last updated May 8, 2026

Getting Started

Prerequisites

UiPath's Supply Chain & Retail Solutions should already have the API endpoints ready for you to ingest data. If this is not the case, please submit a support ticket.

Creating an authorization token

To submit data via the ingestion API, you must first create an authorization token. To do this:

  1. Log in to your organization.

  2. Click your icon in the lower-left corner, and select Access Tokens.

  3. Select GENERATE TOKEN.

  4. Create a Personal Access Token (PAT) and save the generated token into your environment.
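Once generated, the token is sent verbatim in the Authorization header of every request. A minimal Python sketch of reading it from an environment variable and building the headers (the variable name PEAK_API_KEY is illustrative, not prescribed):

```python
import os

def auth_headers():
    """Build the headers the ingestion API expects.

    Assumes the PAT was saved to the PEAK_API_KEY environment
    variable; the name is an illustrative choice.
    """
    token = os.environ.get("PEAK_API_KEY")
    if not token:
        raise RuntimeError("PEAK_API_KEY is not set; generate a PAT first")
    return {
        "Authorization": token,  # the PAT goes directly in the Authorization header
        "Content-Type": "application/json",
    }
```

Keeping the token out of source code and in the environment also mirrors how the `YOUR_API_KEY` placeholder is used in the curl examples below.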

Onboarding flow at a glance

A first-time onboarding follows three steps. Each step uses a separate endpoint group; you only need to repeat step 3 for ongoing data delivery.

  1. Roll out the solution tables — pick the standard schema for the solution you're integrating against and roll it out to your tenant. This creates the warehouse tables (and the matching <table_name>_failed_rows tables) that subsequent ingest calls write into. See Schema lifecycle.
  2. Ingest data — POST rows to the object endpoints described below. Validation runs against the schema you rolled out.
  3. Monitor outcomes — inspect aggregate ingestion outcomes, top error codes, and individual failed rows in the Data Quality Dashboard in your Peak tenant.

Rolling out a solution's tables

Before any data can be ingested, the solution's tables must exist in your warehouse. A rollout creates the data tables and their corresponding _failed_rows tables in one call.

curl -X POST \
  'https://ingestion.peak.ai/api/v2/schema/rollout' \
  -H 'Authorization: YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "solutionName": "QP_OOTB",
    "targetSchemaName": "STAGE",
    "appName": "quote-pricing",
    "appVersion": "1.0.0",
    "prefix": "QP_",
    "suffix": "_OOTB"
  }'

For the full schema lifecycle (saving a standard schema, upgrading versions, adding columns) see Schema lifecycle.

Using the API

This section describes how to upload and remove data once a rollout has completed. For guidance on the data required by the API, including supported fields and formats, see the API Resources section.

Ingesting objects

All ingestion requests follow this pattern:

POST https://ingestion.peak.ai/api/v2/objects/{prefix}{OBJECT_NAME}{suffix}

Where:

  • {prefix} - Solution-specific prefix (e.g., QP_)
  • {OBJECT_NAME} - Object name as defined in your data model (e.g., PRODUCTS)
  • {suffix} - Solution-specific suffix (e.g., _OOTB)
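The pattern above can be composed in a few lines. A minimal sketch, with the QP_/_OOTB defaults taken from this guide's Quote & Pricing examples (substitute the values for your own solution):

```python
INGESTION_BASE = "https://ingestion.peak.ai/api/v2/objects"

def object_endpoint(object_name, prefix="QP_", suffix="_OOTB"):
    """Compose the full object endpoint following the documented
    {prefix}{OBJECT_NAME}{suffix} pattern."""
    return f"{INGESTION_BASE}/{prefix}{object_name}{suffix}"
```

For example, `object_endpoint("PRODUCTS")` yields the `QP_PRODUCTS_OOTB` URL used in the request below.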

Example request

curl -X POST \
  'https://ingestion.peak.ai/api/v2/objects/QP_PRODUCTS_OOTB' \
  -H 'Authorization: YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "solutionName": "QP_OOTB",
    "data": [
      {
        "PRODUCT_ID": "PROD-001",
        "UPDATED_AT": "2024-01-15 10:30:00",
        "BESPOKE_PRODUCT": false,
        "PRODUCT_NAME": "Industrial Steel Beam",
        "PRODUCT_CATEGORY": "Construction",
        "PRODUCT_SUBCATEGORY": "Structural"
      }
    ],
    "operationType": "UPSERT"
  }'

Request components

Headers:

  • Authorization - Your Personal Access Token (PAT) from platform.peak.ai
  • Content-Type: application/json - Required for JSON payloads

Payload:

  • solutionName - Matches your solution configuration (e.g., QP_OOTB)
  • data - Array of records to ingest (1 to 500 entries per call)
  • operationType - Operation type for data ingestion:
    • UPSERT - Insert new records or update existing ones based on primary key
    • APPEND - Insert new records only (does not update existing records)
  • dryRun (optional, default false) - Run validation only and return what would have been ingested, without writing to the warehouse
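Because the data array is capped at 500 entries per call, larger loads need to be split across requests. A sketch of building the documented payload shape in 500-row batches (the function and its defaults are illustrative, not part of the API):

```python
MAX_ROWS_PER_CALL = 500  # documented per-request limit for the data array

def build_payloads(records, solution_name="QP_OOTB",
                   operation_type="UPSERT", dry_run=False):
    """Split a record list into request bodies of at most 500 rows.

    Each body mirrors the documented payload: solutionName, data,
    operationType, plus the optional dryRun flag (only included
    when enabled, since it defaults to false).
    """
    payloads = []
    for start in range(0, len(records), MAX_ROWS_PER_CALL):
        body = {
            "solutionName": solution_name,
            "data": records[start:start + MAX_ROWS_PER_CALL],
            "operationType": operation_type,
        }
        if dry_run:
            body["dryRun"] = True
        payloads.append(body)
    return payloads
```

Each resulting body can then be POSTed to the object endpoint with the headers described above.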

Response

Success (200 OK): All rows in the request passed validation and were accepted.

{
  "requestId": "abc123-def456-ghi789",
  "message": "Data ingested successfully",
  "totalRows": 1,
  "successRows": 1,
  "failedRows": 0
}

Partial success (207 Multi-Status): Some rows passed and some failed. Successful rows are accepted; failed rows are returned with structured error details so they can be corrected and resubmitted.

Error (400 Bad Request): Every row in the request failed validation. The response contains a structured list of errors per failing row. Each error includes a code, the affected field, severity, category, and a human-readable message. See the API Guide for the full list of error codes and their meanings.

{
  "requestId": "abc123-def456-ghi789",
  "totalRows": 1,
  "successRows": 0,
  "failedRows": 1,
  "failed": [
    {
      "rowNumber": 1,
      "data": { "product_name": "Industrial Steel Beam" },
      "errors": [
        {
          "code": "DI_E_23N01",
          "field": "product_id",
          "severity": "ERROR",
          "category": "BUSINESS_VALIDATION",
          "message": "Required field is absent",
          "description": "The 'product_id' field is required but was not provided in the request."
        },
        {
          "code": "DI_E_22001",
          "field": "product_name",
          "severity": "ERROR",
          "category": "BUSINESS_VALIDATION",
          "message": "String length exceeds maximum",
          "description": "The 'product_name' value exceeds the maximum allowed length of 255 characters."
        },
        {
          "code": "DI_E_22023",
          "field": "status",
          "severity": "ERROR",
          "category": "BUSINESS_VALIDATION",
          "message": "Value is not one of the allowed enum values",
          "description": "The 'status' value 'enabled' is not in the list of allowed values."
        },
        {
          "code": "DI_E_23505",
          "field": "product_id",
          "severity": "ERROR",
          "category": "SYSTEM_DATA",
          "message": "Duplicate primary key",
          "description": "Two or more rows in this batch share the same primary key value."
        }
      ]
    }
  ]
}
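A response body like the one above can be flattened for triage before rows are corrected and resubmitted. A minimal sketch (the tuple shape is an illustrative choice, not prescribed by the API):

```python
def collect_failures(response_body):
    """Flatten the structured `failed` array of a 207/400 response
    into (rowNumber, field, code, message) tuples."""
    issues = []
    for row in response_body.get("failed", []):
        for err in row.get("errors", []):
            issues.append((row["rowNumber"], err["field"],
                           err["code"], err["message"]))
    return issues
```

Grouping the tuples by code is a quick way to spot systematic problems (for example, a source column that is always over the length limit) versus one-off bad rows.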

Deleting objects

All delete requests follow this pattern:

DELETE https://ingestion.peak.ai/api/v2/objects/{prefix}{OBJECT_NAME}{suffix}

Where:

  • {prefix} - Solution-specific prefix (e.g., QP_)
  • {OBJECT_NAME} - Object name as defined in your data model (e.g., PRODUCTS)
  • {suffix} - Solution-specific suffix (e.g., _OOTB)

Example request

curl -X DELETE \
  'https://ingestion.peak.ai/api/v2/objects/QP_PRODUCTS_OOTB' \
  -H 'Authorization: YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "solutionName": "QP_OOTB",
    "primaryKeys": [
      { "product_id": "PROD-001" },
      { "product_id": "PROD-002" }
    ]
  }'

Request components

Headers:

  • Authorization - Your Personal Access Token (PAT) from platform.peak.ai
  • Content-Type: application/json - Required for JSON payloads

Payload:

  • solutionName - Matches your solution configuration (e.g., QP_OOTB)
  • primaryKeys - List of primary key value sets identifying the rows to delete (1 to 500 entries per call). Each entry must include every primary key column defined in the schema.
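Since an entry missing a primary-key column is rejected by the API (see the error example below), it can be cheaper to check locally first. A sketch of building the delete body with those checks; `pk_columns` is the set of primary-key columns from your schema (e.g. `{"product_id"}`, or several entries for a composite key):

```python
def build_delete_payload(primary_keys, pk_columns, solution_name="QP_OOTB"):
    """Build a delete body, verifying each entry supplies every
    primary-key column and the 1-to-500 entry limit is respected."""
    if not 1 <= len(primary_keys) <= 500:
        raise ValueError("primaryKeys must contain 1 to 500 entries")
    required = set(pk_columns)
    for entry in primary_keys:
        missing = required - entry.keys()
        if missing:
            raise ValueError(f"entry {entry} is missing key columns: {sorted(missing)}")
    return {"solutionName": solution_name, "primaryKeys": primary_keys}
```

The returned dict matches the body of the curl example above and can be sent as the DELETE request's JSON payload.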

Response

Success (200 OK):

{
  "requestId": "abc123-def456-ghi789",
  "message": "Objects deleted successfully",
  "deletedRows": 2
}

Error (4xx/5xx):

{
  "title": "Bad Request",
  "type": "",
  "message": "primaryKeys contains columns not found in table schema: invalid_column"
}

Validation behavior

Every row is validated before the API responds. Any failed rows appear in the response's failed array with structured error details (see API Guide — Error Codes).

Successful rows are accepted for ingestion and written to the warehouse asynchronously — typically within a few seconds. A 200 OK confirms the rows passed validation and were accepted, not that they're queryable yet. Failed rows are also recorded to the matching <table_name>_failed_rows tables for triage in the Data Quality Dashboard.

For use cases that need successful rows to be queryable the moment a request returns 200 OK — for example, bulk-loading a large historical dataset where each batch should be confirmed before the next is sent — a synchronous mode is available on request. See API Guide — Validation behavior for details, and submit a support ticket to enable it for your tenant.
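Even without synchronous mode, the dryRun flag described above supports a two-phase pattern: validate a batch first, then submit it for real only if validation is clean. A sketch under that assumption; `post` is any callable `(url, body) -> response dict` (in production it might wrap `requests.post`, injected here so the flow can be exercised without a live tenant):

```python
def validate_then_ingest(payload, endpoint, post):
    """Send the batch with dryRun=true first; only submit for real
    if validation reports no failed rows."""
    preview = post(endpoint, {**payload, "dryRun": True})
    if preview.get("failedRows", 0) > 0:
        return preview  # surface validation errors without writing anything
    return post(endpoint, payload)
```

This doubles the number of requests, so it suits cautious bulk loads rather than high-frequency scheduled ingestion.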

Monitoring ingestion outcomes

The Data Quality Dashboard in your Peak tenant surfaces aggregate KPIs (records processed, pass rate, top error codes), per-table and per-request breakdowns, and the exact failed rows with their source field values. Use it to monitor live ingestion runs and to triage failures from the asynchronous validation path.

See Data Quality Dashboard for the one-time setup steps and a walkthrough of what's on each page. If the dashboard isn't yet available in your tenant, submit a support ticket to request setup.
