Available AI Models Query

Retrieves the available models for a specific AI provider

Before You Begin

🔗 Use the GraphQL Playground to execute the queries in this guide.

➡️ New to GraphQL? Learn how to navigate the Playground with our Playground Basics Guide.

Prerequisites

  1. Authentication: Use a Service Account token.
  2. Permissions: You must have permissions to manage AI providers in the organization, typically granted to super admin roles.
  3. Provider Name: The name of the AI provider whose available models you want to retrieve.

Step 1: Execute the Query

This query retrieves the list of available models for a given AI provider.

Querying available models

query {
  availableAiModels(providerName: openai)
}
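
If you want to run this query outside the Playground, it can be sent as a standard GraphQL-over-HTTP POST. The sketch below assembles such a request in Python; the endpoint URL, the `Bearer` authorization scheme, and the placeholder token are assumptions — substitute the values for your own instance and Service Account.

```python
import json
import urllib.request

# Assumptions: replace with your instance's GraphQL endpoint and a real
# Service Account token. The Bearer scheme is a common convention, not
# something this guide specifies.
GRAPHQL_URL = "https://example.com/graphql"
TOKEN = "<service-account-token>"

QUERY = """
query {
  availableAiModels(providerName: openai)
}
"""

def build_request(url: str, query: str, token: str) -> urllib.request.Request:
    """Assemble a POST request carrying the query as a JSON body."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_request(GRAPHQL_URL, QUERY, TOKEN)
# To execute for real (requires network access and a valid token):
# result = json.load(urllib.request.urlopen(req))
```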

Arguments

  • providerName (ProviderName, required): The name of the AI provider. Accepted values are listed under ProviderName in the GraphQL Playground. Current options include:
    • openai — OpenAI
    • azure_openai — Azure OpenAI
    • amazon_bedrock — AWS Bedrock AI
    • custom — Custom Provider
    • google_vertex_ai — Google Vertex AI
    • oracle_oci — Oracle AI
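
Because an invalid enum value is rejected server-side with a validation error, it can be convenient to check the value client-side first. A minimal sketch, assuming the enum values listed above are current (the authoritative list is always the `ProviderName` enum in the Playground's schema):

```python
# Client-side mirror of the ProviderName enum values listed above.
VALID_PROVIDERS = {
    "openai",
    "azure_openai",
    "amazon_bedrock",
    "custom",
    "google_vertex_ai",
    "oracle_oci",
}

def check_provider(name: str) -> str:
    """Raise early if `name` is not a known ProviderName value."""
    if name not in VALID_PROVIDERS:
        raise ValueError(
            f"unknown provider {name!r}; expected one of {sorted(VALID_PROVIDERS)}"
        )
    return name
```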

Using Variables

You can also pass the provider name as a variable:

query GetAvailableAiModels($providerName: ProviderName!) {
  availableAiModels(providerName: $providerName)
}

Variables:

{
  "providerName": "openai"
}
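
When sending this over HTTP, the variables travel alongside the query in the JSON body under a `variables` key, per the standard GraphQL request convention. A minimal sketch of assembling that payload:

```python
import json

QUERY = """
query GetAvailableAiModels($providerName: ProviderName!) {
  availableAiModels(providerName: $providerName)
}
"""

# Standard GraphQL-over-HTTP body: the query string plus a variables map.
payload = json.dumps({
    "query": QUERY,
    "variables": {"providerName": "openai"},
})
```

Swapping providers then only requires changing the `variables` map, not the query text.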

Response Fields

The query returns a list of String values, where each string represents a model name available for the specified provider.

Example Response

{
  "data": {
    "availableAiModels": ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]
  }
}
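
Since the result is a flat list of strings, extracting it from the response body is a single lookup. A sketch using the example response above:

```python
import json

# The example response body from above, as returned by the server.
response_body = '{"data": {"availableAiModels": ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]}}'

# Each entry is a model name for the requested provider.
models = json.loads(response_body)["data"]["availableAiModels"]
# → ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]
```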

Additional Notes

  • If the providerName value is not a valid ProviderName enum value, a validation error is returned.
  • If the provider service is unavailable, an error is returned with details about the failure.
  • If the caller lacks the required permissions, an authorization error is returned.
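
Per the GraphQL specification, these failures arrive as entries in a top-level `errors` array rather than as bare HTTP errors, so clients should check for that key before reading `data`. A minimal sketch of that check (the exact error messages and extensions are server-specific):

```python
import json

def extract_models(response_body: str) -> list:
    """Return the model names, or raise with the GraphQL error messages."""
    resp = json.loads(response_body)
    if resp.get("errors"):
        # Per the GraphQL spec, each error entry carries a "message" field.
        messages = "; ".join(e.get("message", "") for e in resp["errors"])
        raise RuntimeError(f"GraphQL request failed: {messages}")
    return resp["data"]["availableAiModels"]
```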

Use this query to discover which models are available for a specific AI provider before configuring your LLM provider settings.