Retrieves the available models for a specific AI provider
Before You Begin
🔗 Use the GraphQL Playground to execute the queries in this guide.
➡️ New to GraphQL? Learn how to navigate the Playground with our Playground Basics Guide.
Prerequisites
- Authentication: Use a Service Account token.
- Permissions: You must have permission to manage AI providers in the organization, typically granted to super admin roles.
- Provider Name: The name of the AI provider whose available models you want to retrieve.
Step 1: Execute the Query
This query retrieves the list of available models for a given AI provider.
Querying available models
query {
  availableAiModels(providerName: openai)
}
Arguments
providerName (ProviderName, required): The name of the AI provider. Accepted values are listed under ProviderName in the GraphQL Playground. Current options include:
- openai: OpenAI
- azure_openai: Azure OpenAI
- amazon_bedrock: AWS Bedrock AI
- custom: Custom Provider
- google_vertex_ai: Google Vertex AI
- oracle_oci: Oracle AI
Using Variables
You can also pass the provider name as a variable:
query GetAvailableAiModels($providerName: ProviderName!) {
  availableAiModels(providerName: $providerName)
}
Variables:
{
"providerName": "openai"
}
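Outside the Playground, the same query can be sent as a standard GraphQL HTTP request. The sketch below builds the JSON payload for the variable-based form; the endpoint URL and token shown are placeholders, not values from this guide — substitute your organization's GraphQL endpoint and a Service Account token.

```python
import json

# Assumed placeholders -- replace with your real endpoint and token.
GRAPHQL_ENDPOINT = "https://example.com/api/graphql"
SERVICE_ACCOUNT_TOKEN = "YOUR_SERVICE_ACCOUNT_TOKEN"

QUERY = """
query GetAvailableAiModels($providerName: ProviderName!) {
  availableAiModels(providerName: $providerName)
}
"""

def build_request(provider_name: str) -> dict:
    """Build the JSON body for a GraphQL HTTP POST."""
    return {
        "query": QUERY,
        "variables": {"providerName": provider_name},
    }

payload = build_request("openai")
print(json.dumps(payload["variables"]))  # {"providerName": "openai"}
```

To actually send it, POST the payload with your HTTP client of choice, e.g. `requests.post(GRAPHQL_ENDPOINT, json=payload, headers={"Authorization": f"Bearer {SERVICE_ACCOUNT_TOKEN}"})` (the Bearer header format is an assumption; check your API's authentication docs).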
Response Fields
The query returns a list of String values, where each string represents a model name available for the specified provider.
Example Response
{
"data": {
"availableAiModels": ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]
}
}
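Since the field is a plain list of strings, extracting it from the response is a one-liner. A minimal sketch, assuming the response has already been decoded from JSON into a dict shaped like the example above:

```python
def extract_models(response: dict) -> list:
    """Return the list of model names from a successful response."""
    return response.get("data", {}).get("availableAiModels", [])

# Example using the response shown above.
response = {
    "data": {"availableAiModels": ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]}
}
models = extract_models(response)
print("gpt-4o available:", "gpt-4o" in models)
```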
Additional Notes
- If the providerName value is not valid, a validation error will be returned.
- If the provider service is unavailable, an error will be returned with details about the failure.
- If the request fails due to insufficient permissions, an authorization error will be raised.
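All three failure cases above surface in the standard GraphQL way, as entries under a top-level `errors` key rather than an HTTP-level failure. A minimal sketch of surfacing them (the example message string is an assumption, not the API's exact wording):

```python
def check_for_errors(response: dict) -> None:
    """Raise if the GraphQL response carries errors; otherwise return None."""
    errors = response.get("errors", [])
    if errors:
        messages = "; ".join(err.get("message", "unknown error") for err in errors)
        raise RuntimeError(f"GraphQL request failed: {messages}")

# Example: an invalid providerName comes back as a validation error.
bad_response = {"errors": [{"message": "invalid value for providerName"}]}
try:
    check_for_errors(bad_response)
except RuntimeError as exc:
    print(exc)
```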
Use this query to discover which models are available for a specific AI provider before configuring your LLM provider settings.
