Awesome-copilot arize-ai-provider-integration
INVOKE THIS SKILL when creating, reading, updating, or deleting Arize AI integrations. Covers listing integrations, creating integrations for any supported LLM provider (OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, Vertex AI, Gemini, NVIDIA NIM, custom), updating credentials or metadata, and deleting integrations using the ax CLI.
```shell
git clone https://github.com/github/awesome-copilot
```

```shell
T=$(mktemp -d) &&
  git clone --depth=1 https://github.com/github/awesome-copilot "$T" &&
  mkdir -p ~/.claude/skills &&
  cp -r "$T/plugins/arize-ax/skills/arize-ai-provider-integration" \
    ~/.claude/skills/github-awesome-copilot-arize-ai-provider-integration &&
  rm -rf "$T"
```
plugins/arize-ax/skills/arize-ai-provider-integration/SKILL.md

Arize AI Integration Skill
Concepts
- AI Integration = stored LLM provider credentials registered in Arize; used by evaluators to call a judge model and by other Arize features that need to invoke an LLM on your behalf
- Provider = the LLM service backing the integration (e.g., `openAI`, `anthropic`, `awsBedrock`)
- Integration ID = a base64-encoded global identifier for an integration (e.g., `TGxtSW50ZWdyYXRpb246MTI6YUJjRA==`); required for evaluator creation and other downstream operations
- Scoping = visibility rules controlling which spaces or users can use an integration
- Auth type = how Arize authenticates with the provider: `default` (provider API key), `proxy_with_headers` (proxy via custom headers), or `bearer_token` (bearer token auth)
Prerequisites
Proceed directly with the task — run the `ax` command you need. Do NOT check versions, env vars, or profiles upfront.

If an `ax` command fails, troubleshoot based on the error:
- `command not found` or version error → see references/ax-setup.md
- `401 Unauthorized` / missing API key → run `ax profiles show` to inspect the current profile. If the profile is missing or the API key is wrong: check `.env` for `ARIZE_API_KEY` and use it to create/update the profile via references/ax-profiles.md. If `.env` has no key either, ask the user for their Arize API key (https://app.arize.com/admin > API Keys)
- Space ID unknown → check `.env` for `ARIZE_SPACE_ID`, or run `ax spaces list -o json`, or ask the user
- LLM provider call fails (missing OPENAI_API_KEY / ANTHROPIC_API_KEY) → check `.env`, load if present, otherwise ask the user
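The `.env` fallback for the space ID can be sketched as a small helper. The file name and `ARIZE_SPACE_ID` variable come from the steps above; everything else is illustrative:

```shell
# Sketch: resolve ARIZE_SPACE_ID from the environment, then .env, then give up
resolve_space_id() {
  if [ -z "${ARIZE_SPACE_ID:-}" ] && [ -f .env ]; then
    # Export every assignment in .env, then restore normal behavior
    set -a
    . ./.env
    set +a
  fi
  if [ -n "${ARIZE_SPACE_ID:-}" ]; then
    printf '%s\n' "$ARIZE_SPACE_ID"
  else
    echo "ARIZE_SPACE_ID not set; run 'ax spaces list -o json' or ask the user" >&2
    return 1
  fi
}
```

If the helper fails, fall through to `ax spaces list -o json` or ask the user, as described above.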
List AI Integrations
List all integrations accessible in a space:
```shell
ax ai-integrations list --space-id SPACE_ID
```
Filter by name (case-insensitive substring match):
```shell
ax ai-integrations list --space-id SPACE_ID --name "openai"
```
Paginate large result sets:
```shell
# Get first page
ax ai-integrations list --space-id SPACE_ID --limit 20 -o json

# Get next page using cursor from previous response
ax ai-integrations list --space-id SPACE_ID --limit 20 --cursor CURSOR_TOKEN -o json
```
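The cursor handoff can be wrapped in a loop. The `--limit` and `--cursor` flags are documented here, but the JSON response shape (in particular the `nextCursor` field name) is an assumption, not a documented contract:

```shell
# Sketch: page through every integration in a space
list_all_integrations() {
  space_id=$1
  cursor=""
  while :; do
    if [ -n "$cursor" ]; then
      page=$(ax ai-integrations list --space-id "$space_id" --limit 50 --cursor "$cursor" -o json)
    else
      page=$(ax ai-integrations list --space-id "$space_id" --limit 50 -o json)
    fi
    printf '%s\n' "$page"
    # Pull the next cursor out of the page. A JSON-aware tool like jq is
    # preferable in practice; sed keeps this sketch dependency-free.
    cursor=$(printf '%s' "$page" | sed -n 's/.*"nextCursor":"\([^"]*\)".*/\1/p')
    [ -z "$cursor" ] && break
  done
}
```

The loop stops as soon as a page carries no cursor, so a single-page result makes exactly one call.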
Key flags:
| Flag | Description |
|---|---|
| `--space-id` | Space to list integrations in |
| `--name` | Case-insensitive substring filter on integration name |
| `--limit` | Max results (1–100, default 50) |
| `--cursor` | Pagination token from a previous response |
| `-o` | Output format: human-readable (default) or `json` |
Response fields:
| Field | Description |
|---|---|
| `id` | Base64 integration ID — copy this for downstream commands |
| `name` | Human-readable name |
| `provider` | LLM provider enum (see Supported Providers below) |
| | Whether credentials are stored |
| | Allowed model list; empty when all models are enabled |
| | Whether default models for this provider are allowed |
| | Whether tool/function calling is enabled |
| | Authentication method: `default`, `proxy_with_headers`, or `bearer_token` |
Get a Specific Integration
```shell
ax ai-integrations get INT_ID
ax ai-integrations get INT_ID -o json
```
Use this to inspect an integration's full configuration or to confirm its ID after creation.
Create an AI Integration
Before creating, always list integrations first — the user may already have a suitable one:
```shell
ax ai-integrations list --space-id SPACE_ID
```
If no suitable integration exists, create one. The required flags depend on the provider.
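The list-first guidance can be sketched as a guard around `create`. The flags are the documented ones; the assumption that an empty, `[]`, or `null` body means "no match", and the choice of `openAI` with `$OPENAI_API_KEY`, are illustrative:

```shell
# Sketch: create an integration only if no existing one matches by name
create_if_missing() {
  space_id=$1
  name=$2
  found=$(ax ai-integrations list --space-id "$space_id" --name "$name" -o json)
  case "$found" in
    ""|"[]"|"null")
      ax ai-integrations create \
        --name "$name" \
        --provider openAI \
        --api-key "$OPENAI_API_KEY"
      ;;
    *)
      echo "integration '$name' already exists; reusing it"
      ;;
  esac
}
```

Inspect the actual `-o json` output once before relying on the empty-result check.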
OpenAI
```shell
ax ai-integrations create \
  --name "My OpenAI Integration" \
  --provider openAI \
  --api-key $OPENAI_API_KEY
```
Anthropic
```shell
ax ai-integrations create \
  --name "My Anthropic Integration" \
  --provider anthropic \
  --api-key $ANTHROPIC_API_KEY
```
Azure OpenAI
```shell
ax ai-integrations create \
  --name "My Azure OpenAI Integration" \
  --provider azureOpenAI \
  --api-key $AZURE_OPENAI_API_KEY \
  --base-url "https://my-resource.openai.azure.com/"
```
AWS Bedrock
AWS Bedrock uses IAM role-based auth instead of an API key. Provide the ARN of the role Arize should assume:
```shell
ax ai-integrations create \
  --name "My Bedrock Integration" \
  --provider awsBedrock \
  --role-arn "arn:aws:iam::123456789012:role/ArizeBedrockRole"
```
Vertex AI
Vertex AI uses GCP service account credentials. Provide the GCP project and region:
```shell
ax ai-integrations create \
  --name "My Vertex AI Integration" \
  --provider vertexAI \
  --project-id "my-gcp-project" \
  --location "us-central1"
```
Gemini
```shell
ax ai-integrations create \
  --name "My Gemini Integration" \
  --provider gemini \
  --api-key $GEMINI_API_KEY
```
NVIDIA NIM
```shell
ax ai-integrations create \
  --name "My NVIDIA NIM Integration" \
  --provider nvidiaNim \
  --api-key $NVIDIA_API_KEY \
  --base-url "https://integrate.api.nvidia.com/v1"
```
Custom (OpenAI-compatible endpoint)
```shell
ax ai-integrations create \
  --name "My Custom Integration" \
  --provider custom \
  --base-url "https://my-llm-proxy.example.com/v1" \
  --api-key $CUSTOM_LLM_API_KEY
```
Supported Providers
| Provider | Required extra flags |
|---|---|
| `openAI` | `--api-key` |
| `anthropic` | `--api-key` |
| `azureOpenAI` | `--api-key`, `--base-url` |
| `awsBedrock` | `--role-arn` |
| `vertexAI` | `--project-id`, `--location` |
| `gemini` | `--api-key` |
| `nvidiaNim` | `--api-key`, `--base-url` |
| `custom` | `--base-url` |
Optional flags for any provider
| Flag | Description |
|---|---|
| `--model-names` | Comma-separated list of allowed model names; omit to allow all models |
| | Enable or disable the provider's default model list |
| | Enable or disable tool/function calling support |
After creation
Capture the returned integration ID (e.g., `TGxtSW50ZWdyYXRpb246MTI6YUJjRA==`) — it is needed for evaluator creation and other downstream commands. If you missed it, retrieve it:
```shell
ax ai-integrations list --space-id SPACE_ID -o json

# or, if you know the ID:
ax ai-integrations get INT_ID
```
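Recovering the ID by name can be scripted. The `--name` filter and `-o json` are documented above; the `"id"` JSON field name is an assumption based on the response-field descriptions:

```shell
# Sketch: look up an integration ID by exact-enough name match
get_integration_id() {
  space_id=$1
  name=$2
  ax ai-integrations list --space-id "$space_id" --name "$name" -o json |
    sed -n 's/.*"id":[[:space:]]*"\([^"]*\)".*/\1/p' | head -n 1
}
```

As with the pagination sketch, prefer a JSON-aware tool such as jq over sed once the real response shape is known.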
Update an AI Integration
`update` is a partial update — only the flags you provide are changed. Omitted fields stay as-is.
```shell
# Rename
ax ai-integrations update INT_ID --name "New Name"

# Rotate the API key
ax ai-integrations update INT_ID --api-key $OPENAI_API_KEY

# Change the model list
ax ai-integrations update INT_ID --model-names "gpt-4o,gpt-4o-mini"

# Update base URL (for Azure, custom, or NIM)
ax ai-integrations update INT_ID --base-url "https://new-endpoint.example.com/v1"
```
Any flag accepted by `create` can be passed to `update`.
Delete an AI Integration
Warning: Deletion is permanent. Evaluators that reference this integration will no longer be able to run.
```shell
ax ai-integrations delete INT_ID --force
```
Omit `--force` to get a confirmation prompt instead of deleting immediately.
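Because deletion is permanent, a cautious wrapper might confirm the integration exists before forcing it; both commands are documented in this skill:

```shell
# Sketch: only force-delete an integration that can actually be fetched
safe_delete() {
  int_id=$1
  if ax ai-integrations get "$int_id" >/dev/null 2>&1; then
    ax ai-integrations delete "$int_id" --force
  else
    echo "integration $int_id not found; nothing to delete" >&2
    return 1
  fi
}
```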
Troubleshooting
| Problem | Solution |
|---|---|
| `command not found` / version error | See references/ax-setup.md |
| `401 Unauthorized` | API key may not have access to this space. Verify key and space ID at https://app.arize.com/admin > API Keys |
| Space ID unknown | Run `ax spaces list -o json`; set the `ARIZE_SPACE_ID` env var or write it to `.env` |
| Integration ID not recognized | Verify with `ax ai-integrations get INT_ID` |
| Credentials not shown after create | Credentials were not saved — re-run with the correct `--api-key` or `--role-arn` |
| Evaluator runs fail with LLM errors | Check integration credentials with `ax ai-integrations get INT_ID`; rotate the API key if needed |
| Provider mismatch | Cannot change provider after creation — delete and recreate with the correct provider |
Related Skills
- `arize-evaluator`: Create LLM-as-judge evaluators that use an AI integration
- `arize-experiment`: Run experiments that use evaluators backed by an AI integration
Save Credentials for Future Use
See references/ax-profiles.md § Save Credentials for Future Use.