Supply Chain & Retail Solutions API guide
The Data Quality Dashboard is a pre-built canvas asset that lets you monitor your tenant's ingestion runs end-to-end. Once a solution is rolled out and starts ingesting, the dashboard surfaces:
- How many records have been processed
- How many rows passed, failed, or produced warnings
- Which tables and which solutions are affected
- Which error codes are most frequent
- The exact failed rows, with the source field values that caused validation to fail
Data flows in automatically — nothing additional is needed at ingest time. Every validated row contributes to the dashboard, regardless of how it was ingested (see API Guide — Validation behavior).
Prerequisites
Before you start, make sure:
- Your tenant is on the latest ingestion release. The dashboard depends on the validation feature that ships in this release. If your tenant has not been upgraded yet, submit a support ticket.
- At least one solution has been rolled out in the tenant via the ingestion API (see Schema lifecycle). The dashboard renders empty until a rollout has happened.
- You have permission to register tenant actions and create canvas assets.
Setting up the dashboard (one-time)
Setup is a one-time task per tenant, performed by a tenant admin, and takes three steps.
Step 1 — Register the Data Quality Dashboard action
The dashboard reads aggregated validation metrics through a backend action that needs to be registered in your tenant before the dashboard widgets can fetch data.
- Go to Actions in the tenant settings.
- Select Add new action and choose the API spec / OpenAPI option.
- Paste the OpenAPI spec for the dashboard. The current spec is provided by Peak when you request dashboard setup — submit a support ticket if you do not have it.
- Save the action with the exact name Data Quality Dashboard APIs.
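Peak supplies the spec itself, but a quick local sanity check can catch copy-paste damage before you save the action. The sketch below assumes the spec is delivered as JSON (a YAML spec would need a YAML parser instead) and only verifies the minimal top-level shape of an OpenAPI 3.x document:

```python
import json

# Minimal top-level keys every OpenAPI 3.x document must carry.
REQUIRED_KEYS = {"openapi", "info", "paths"}

def looks_like_openapi_spec(text: str) -> bool:
    """Return True if `text` parses as JSON and has the top-level
    shape of an OpenAPI spec. A shape check only, not full validation."""
    try:
        doc = json.loads(text)
    except json.JSONDecodeError:
        return False
    return isinstance(doc, dict) and REQUIRED_KEYS.issubset(doc)

spec = json.dumps({
    "openapi": "3.0.0",
    "info": {"title": "Data Quality Dashboard APIs", "version": "1.0.0"},
    "paths": {},
})
print(looks_like_openapi_spec(spec))        # True
print(looks_like_openapi_spec("not json"))  # False
```

If the check fails, request a fresh copy of the spec rather than hand-editing it.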
Step 2 — Open canvas and select the namespace
- Open Canvas in the tenant.
- When prompted to choose a metrics namespace, pick peak__dq_dashboard.
This namespace is reserved for Peak-managed dashboards — the dashboard's KPI cards and detail tables read their numbers from cubes published into it whenever a solution is rolled out, upgraded, or has a column added.
Do not publish your own cubes into peak__dq_dashboard. Pick a different namespace for any custom dashboard work.
Step 3 — Create the asset from the template
- With the namespace selected, open the Templates library in canvas.
- Find the template named Data Quality Dashboard and select it — a preview of the layout is shown.
- Select Create asset, give it a name (for example, Data Quality Dashboard), and confirm.
The asset is now yours: rename it, share it, embed it in spaces, or duplicate it. Edits you make to your asset do not affect other tenants' copies.
What's in the dashboard
The dashboard is laid out across two pages.
Page 1 — Data quality overview
The landing page. Gives you a tenant-wide view of validation activity.
Filters at the top
- Solution filter — multi-select. Pick one or more solutions to scope every widget on the page. Leave empty to show data across all solutions.
- Date range filter — defaults to the last 7 days. Pick any range up to one year back.
KPI cards show, for the chosen filter window:
- Total records ingested
- Total rows split into success / failure / warning
- Pass rate
- Number of unique tables involved
- Number of unique error codes seen
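The pass-rate card is derived from the success / failure / warning split above. How the dashboard treats warning rows in that ratio is not specified here, so the sketch below makes it an explicit, switchable assumption:

```python
def pass_rate(success: int, failure: int, warning: int,
              count_warnings_as_pass: bool = False) -> float:
    """Fraction of validated rows that passed.

    Whether warning rows count toward the pass rate is an assumption;
    flip `count_warnings_as_pass` to match the dashboard's behavior.
    """
    total = success + failure + warning
    if total == 0:
        return 0.0  # no validated rows yet, e.g. before the first ingestion
    passed = success + (warning if count_warnings_as_pass else 0)
    return passed / total

print(pass_rate(success=950, failure=30, warning=20))  # 0.95
```

A freshly rolled-out solution with zero ingestions yields a 0.0 rate here rather than a division error, which matches a dashboard that simply renders empty.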
Breakdown by table — paginated table showing one row per validated table, with record counts and row outcomes. Each row has a View button — select it to drill down into that specific table on Page 2.
Charts
- A time-series chart showing failure / success / warning rows over the selected range, bucketed daily / weekly / monthly.
- A bar chart of the most frequent error codes.
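The daily bucketing behind the time-series chart reduces to a group-by on validation date. The record shape below (a date plus an outcome string) is a hypothetical stand-in for whatever the metrics cube actually stores:

```python
from collections import Counter
from datetime import date

# Hypothetical validation records: (day the row was validated, outcome).
records = [
    (date(2024, 5, 1), "success"),
    (date(2024, 5, 1), "failure"),
    (date(2024, 5, 2), "success"),
    (date(2024, 5, 2), "warning"),
    (date(2024, 5, 2), "success"),
]

# One Counter of outcomes per day -- the shape a time-series chart
# plots for the "daily" granularity; weekly/monthly would bucket
# on a coarser key instead of the raw date.
buckets: dict[date, Counter] = {}
for day, outcome in records:
    buckets.setdefault(day, Counter())[outcome] += 1

for day in sorted(buckets):
    print(day, dict(buckets[day]))
# 2024-05-01 {'success': 1, 'failure': 1}
# 2024-05-02 {'success': 2, 'warning': 1}
```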
Breakdown by request — paginated table showing one row per ingestion request, useful when you want to investigate a specific batch.
AI summary — a short narrative summary of the day's data, refreshed every time the dashboard loads.
Page 2 — Failed rows detail
Reached by selecting View on any row in the breakdown-by-table widget on Page 1. The page is scoped to the table you selected.
- A KPI strip showing the failed-row count, primary-key violation count, unique-key violation count, and how many distinct API requests produced failures — all for the selected table.
- A detail table listing the actual failed rows, with the audit columns plus the source field values for that table. Hover any error code to see what it means.
To return to the overview, use the page navigation at the top of the canvas.
What you can monitor
At a glance:
| Metric | Where to find it | Use it to… |
|---|---|---|
| Records ingested, pass rate | Page 1 KPI cards | Track day-over-day ingestion health |
| Time-series of pass / fail / warning rows | Page 1 chart | Spot regressions after a release or schema change |
| Top error codes | Page 1 bar chart | Identify the most common data-quality issues to fix at the source |
| Per-table breakdown | Page 1 table widget | Find which tables are contributing most failures |
| Per-request breakdown | Page 1 table widget | Investigate a specific batch by requestId |
| Per-solution breakdown | Available via solution filter | Compare quality across multiple solutions in the same tenant |
| Failed rows with source values | Page 2 detail table | Reproduce and fix the rows that did not pass validation |
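The "which tables contribute most failures" view in the table above is, at heart, a count-and-rank aggregation. A sketch with hypothetical per-row records:

```python
from collections import Counter

# Hypothetical per-row validation outcomes: (table_name, outcome).
rows = [
    ("orders", "failure"), ("orders", "success"),
    ("orders", "failure"), ("inventory", "failure"),
    ("products", "success"),
]

# Count failures per table and rank descending, like the
# breakdown-by-table widget sorted by failure count.
failures = Counter(table for table, outcome in rows if outcome == "failure")
for table, count in failures.most_common():
    print(table, count)
# orders 2
# inventory 1
```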
When data appears
Data is live. As soon as a solution is rolled out and an ingestion request lands, records start showing up. The underlying metrics cube refreshes every few minutes, so very recent activity may take a moment to appear.
Common questions
Why is one of my solutions missing from the solution filter? The filter lists solutions that have at least one validation record. Freshly rolled-out solutions with zero ingestions will not appear until their first request runs through validation.
Why is the breakdown-by-table widget empty? Either no validation records have landed in the selected date range, or your selected solution filter is excluding everything. Try widening the date range or clearing the solution filter.
Why don't I see source columns in the failed-rows detail? Source columns are only carried for tables that came from the supported standard schemas. Custom tables and custom-added columns appear in the dashboard's audit columns only.
Can I customize the dashboard? Yes. The asset created from the template is a normal canvas asset — you can edit widgets, add pages, change the styling, and so on. Re-creating the asset from the template will overwrite your edits, so save a copy first if you want to keep them.
Who do I contact for issues? Submit a support ticket with: your tenant name, the asset ID (visible in the canvas URL), and a screenshot of what you're seeing. Include the time range you were looking at — it speeds up triage significantly.