How-to guide: Connecting OSS models through AI Trust Layer
This guide explains how to connect an open-source model hosted on an AI gateway through the AI Trust Layer LLM Configurations feature. You create a custom Integration Service connector using the OpenAI V1 Compliant template and configure it with your gateway credentials and model identifier.
The steps use Fireworks as the example gateway, but any OpenAI-compatible gateway works.
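The connector you create in this guide ultimately issues standard OpenAI-style chat completions requests against your gateway. As a point of reference, the following Python sketch makes the same kind of call directly with the OpenAI SDK; the Fireworks base URL and the glm-4p6 model identifier are assumptions taken from this guide's example, so substitute your own gateway endpoint, API key, and model.

```python
# Minimal sketch of a direct call to an OpenAI-compatible gateway.
# Assumptions: the Fireworks base URL and example model identifier below;
# replace both with the values from your own gateway.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # gateway's OpenAI-compatible endpoint
    api_key="<YOUR_GATEWAY_API_KEY>",                   # credential you later store in the connection
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/glm-4p6",  # model identifier, reused later as the LLM identifier
    messages=[{"role": "user", "content": "Reply with a short greeting."}],
)
print(response.choices[0].message.content)
```

If this call succeeds outside the platform, the same base URL, API key, and model identifier are the values you carry into the connector, connection, and LLM configuration steps below.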
Prerequisites
- BYO AI Gateway enabled in your cluster. For details, see Configuring AI Trust Layer.
- An account on an OpenAI-compatible AI gateway (for example, Fireworks) with access to the model you want to use
- The model identifier as listed in your gateway (for example, accounts/fireworks/models/glm-4p6 for GLM on Fireworks); the sketch after this list shows one way to look this up
- API credentials for your gateway account
- Organization administrator access in Automation Suite
- Access to Integration Service and Connector Builder
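To confirm your credentials and find the exact model identifier before you start, you can query the gateway directly. The sketch below assumes your gateway exposes the standard OpenAI-compatible /models endpoint and uses the Fireworks base URL as an example; adjust both for your gateway.

```python
# Sketch: verify the gateway API key and list available model identifiers.
# Assumes the gateway implements the OpenAI-compatible GET /models endpoint;
# the base URL below is the Fireworks example and may differ for your gateway.
import requests

GATEWAY_BASE_URL = "https://api.fireworks.ai/inference/v1"
API_KEY = "<YOUR_GATEWAY_API_KEY>"

resp = requests.get(
    f"{GATEWAY_BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])  # for example: accounts/fireworks/models/glm-4p6
```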
Configure the LLM in AI Trust Layer
- Navigate to Admin > AI Trust Layer > LLM configurations and select Add configuration.
- Set the Tenant, Product, and Feature values. Note: You can select any product that supports BYOM, for example Agents, Gen AI Activities, or Coded Agents.
- Under Model Configuration, select Add Custom Alias and enter the name of the OSS model.
- Set API Type to match the API your model exposes. For most chat models, select Chat Completions; the sketch after these steps shows what a chat completions request looks like on an OpenAI-compatible gateway.
- In the Connector field, select Create custom connector.
- Select the OpenAI V1 Compliant LLM template, then select Create connector. Connector Builder opens with the template pre-populated.
- Configure and save the connector, then publish it.
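For orientation, the Chat Completions API type corresponds to the OpenAI V1 /chat/completions wire format. The following sketch shows what such a request looks like when sent straight to the gateway; the endpoint path and payload follow the OpenAI V1 convention, while the base URL and model identifier are this guide's Fireworks example and are assumptions for your environment. The exact fields your published connector sends may differ.

```python
# Sketch of an OpenAI V1 chat completions request sent directly to the gateway.
# Assumptions: Fireworks base URL and example model identifier.
import requests

GATEWAY_BASE_URL = "https://api.fireworks.ai/inference/v1"
API_KEY = "<YOUR_GATEWAY_API_KEY>"

payload = {
    "model": "accounts/fireworks/models/glm-4p6",  # gateway model identifier
    "messages": [{"role": "user", "content": "Say hello."}],
    "max_tokens": 64,
}
resp = requests.post(
    f"{GATEWAY_BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```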
Add a connection in Integration Service
- In Integration Service, navigate to Connections and select Add connection.
- Select the custom connector you published.
- Enter your gateway credentials. For Fireworks, this is your API key.
- Select Connect to provision the connection.
Complete the LLM configuration
- Return to Admin > AI Trust Layer > LLM configurations and open the configuration you started.
- Under Model Configuration, set Connector to your published connector and Connection to the connection you created.
- In the LLM identifier field, enter the model identifier exactly as it appears in your gateway. For Fireworks-hosted models, the identifier follows the format accounts/fireworks/models/<model-name>, for example accounts/fireworks/models/glm-4p6. Note: Trailing spaces in the LLM identifier field cause connection errors. Verify there are no leading or trailing spaces before saving; the sketch after these steps shows a quick way to check.
- Select Test configuration to run the AI Trust Layer probe.
- If the probe passes, select Save.
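Because stray whitespace in the LLM identifier is an easy mistake, a small helper can catch it before you save. This is a hypothetical snippet, not part of the product; the format check encodes the accounts/fireworks/models/<model-name> pattern described above and other gateways will use different formats.

```python
# Hypothetical helper: strip whitespace and sanity-check a Fireworks-style LLM identifier
# before pasting it into the LLM identifier field. The regex is an assumption based on
# the accounts/fireworks/models/<model-name> pattern; adjust it for other gateways.
import re

def normalize_llm_identifier(raw: str) -> str:
    identifier = raw.strip()  # leading/trailing spaces cause connection errors
    if not re.fullmatch(r"accounts/[\w.-]+/models/[\w.-]+", identifier):
        raise ValueError(f"Unexpected identifier format: {identifier!r}")
    return identifier

print(normalize_llm_identifier("  accounts/fireworks/models/glm-4p6 "))
# -> accounts/fireworks/models/glm-4p6
```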
Result
The configuration is saved and the OSS model is available to the product and feature you specified. Calls route through the AI Trust Layer and appear in the audit log under Source: Custom connection. To verify, run an agent using the newly added model and check the traces and AI Trust Layer audit logs to confirm the expected model was invoked.
If you encounter issues while creating a custom connector, contact UiPath Support for assistance.