Get Default LLM Provider

Retrieve the default Large Language Model (LLM) provider configuration for a specific assistant or for the organization.

Before You Begin

🔗 Use the GraphQL Playground to execute the queries in this guide.

➡️ New to GraphQL? Learn how to navigate the Playground with our Playground Basics Guide.

Prerequisites

  1. Authentication: Use a Service Account token.
  2. Permissions: You must have permissions to manage AI providers in the organization.
  3. Owner ID & Type: The ID and type of the entity whose default provider you want to retrieve.

Step 1: Execute the Query

This query retrieves the full LLM provider record that is set as the default for a specific owner (assistant or organization), including its configuration.

query {
  defaultLlmProvider(ownerType: assistant, ownerId: "asst_123abc") {
    id
    name
    active
    configuration
    organizationDefault
  }
}

Arguments

  • ownerType (Enum, required): The type of owner. Allowed values are listed under the OwnerProvider enum in the GraphQL Playground.
  • ownerId (String, required): The ID of the target owner.
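The query and arguments above can be sent as a standard GraphQL POST request. Below is a minimal Python sketch that builds the request body using GraphQL variables; the endpoint URL, token placeholder, operation name, and the `OwnerProvider!` variable type are assumptions (the enum name follows the Arguments list above) — substitute your organization's actual values.

```python
import json

# Assumptions: replace with your organization's actual endpoint and token.
GRAPHQL_ENDPOINT = "https://example.com/graphql"
SERVICE_TOKEN = "YOUR_SERVICE_ACCOUNT_TOKEN"

# Same query as above, parameterized with GraphQL variables.
# The variable type OwnerProvider! is an assumption based on the enum
# name shown in the Playground.
QUERY = """
query DefaultProvider($ownerType: OwnerProvider!, $ownerId: String!) {
  defaultLlmProvider(ownerType: $ownerType, ownerId: $ownerId) {
    id
    name
    active
    configuration
    organizationDefault
  }
}
"""

def build_request(owner_type: str, owner_id: str) -> dict:
    """Build the URL, headers, and JSON body for a GraphQL POST request."""
    body = {
        "query": QUERY,
        "variables": {"ownerType": owner_type, "ownerId": owner_id},
    }
    headers = {
        "Authorization": f"Bearer {SERVICE_TOKEN}",
        "Content-Type": "application/json",
    }
    return {"url": GRAPHQL_ENDPOINT, "headers": headers, "json": body}

request = build_request("assistant", "asst_123abc")
print(json.dumps(request["json"]["variables"]))
```

You can pass the returned pieces directly to an HTTP client (for example, `requests.post(request["url"], headers=request["headers"], json=request["json"])`).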

Response Fields

  • defaultLlmProvider: The default LLM provider object, containing:
    • id: Unique ID of the LLM provider configuration.
    • name: Display name of the provider (when available).
    • active: Whether the provider is active (not disabled).
    • configuration: JSON object with provider settings (for example, the selected model).
    • organizationDefault: Whether this provider is the organization-wide default.

Example Response

{
  "data": {
    "defaultLlmProvider": {
      "id": "prov-123",
      "name": "OpenAI",
      "active": true,
      "configuration": {
        "model": "gpt-4"
      },
      "organizationDefault": false
    }
  }
}

Additional Notes

  • If no default provider is configured for the specified owner, the query returns null; an invalid ownerId or missing permissions will return an error.
  • This query returns the full provider (LlmProvider). The older activeLlmProvider field is deprecated; prefer defaultLlmProvider for new integrations.
  • Use this when you need the default provider’s configuration (for example, which model is in use), not only the active provider link IDs.

This query returns the default LLM provider and its configuration for a given assistant or organization.