Create LLM Provider

Create a new Large Language Model (LLM) provider configuration for an organization.

Before You Begin

🔗 Use the GraphQL Playground to execute the mutations in this guide.

➡️ New to GraphQL? Learn how to navigate the Playground with our Playground Basics Guide.

Prerequisites

  1. Authentication: Use a Service Account token.
  2. Permissions: You must have permissions to manage AI providers in the organization, typically granted to super admin roles.
  3. Organization UUID: The UUID of the organization where the provider will be created.

Step 1: Execute the Mutation

This mutation registers a new LLM provider configuration for a specific organization.

mutation {
  createLlmProvider(
    input: {
      organizationUuid: "org-uuid-example"
      name: "OpenAI GPT-4"
      configuration: {
        provider: "openai"
        openai_api_key: "sk-your-secret-key"
        model_name: "gpt-4o"
      }
    }
  ) {
    llmProvider {
      id
      name
      configuration
    }
  }
}
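Outside the Playground, the same mutation can be sent as an ordinary HTTP POST. The sketch below builds the request body and headers in Python, using GraphQL variables so the API key is passed as data rather than spliced into the query string. The endpoint URL, token value, and the `CreateLlmProviderInput` type name are assumptions; substitute the values for your deployment.

```python
import json

# Hypothetical endpoint and token -- replace with your instance's values.
GRAPHQL_URL = "https://your-instance.example.com/graphql"
SERVICE_ACCOUNT_TOKEN = "your-service-account-token"

# Parameterized form of the mutation from Step 1. The input type name
# "CreateLlmProviderInput" is an assumption; check your schema.
MUTATION = """
mutation CreateLlmProvider($input: CreateLlmProviderInput!) {
  createLlmProvider(input: $input) {
    llmProvider {
      id
      name
      configuration
    }
  }
}
"""

def build_request(organization_uuid, name, configuration):
    """Assemble the JSON body and auth headers for the mutation."""
    payload = {
        "query": MUTATION,
        "variables": {
            "input": {
                "organizationUuid": organization_uuid,
                "name": name,
                "configuration": configuration,
            }
        },
    }
    headers = {
        "Authorization": f"Bearer {SERVICE_ACCOUNT_TOKEN}",
        "Content-Type": "application/json",
    }
    return json.dumps(payload), headers

body, headers = build_request(
    "org-uuid-example",
    "OpenAI GPT-4",
    {
        "provider": "openai",
        "openai_api_key": "sk-your-secret-key",
        "model_name": "gpt-4o",
    },
)
```

The resulting `body` and `headers` can be passed to any HTTP client (for example, `requests.post(GRAPHQL_URL, data=body, headers=headers)`).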

Arguments

  • organizationUuid (String, required): The UUID of the target organization.
  • name (String, optional): A descriptive name for the provider configuration (e.g., "Internal GPT-3.5 Turbo").
  • configuration (JSON, required): An object containing the provider's settings. The required keys and values depend on the specific provider (e.g., provider, openai_api_key, model_name).
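Because the configuration object is free-form JSON, a malformed key is only caught server-side. A small client-side check can fail faster. The required-key set below is illustrative for the openai provider shown in this guide; other providers will need their own entries, so treat the table as an assumption, not a schema.

```python
# Illustrative required-key checklist; extend per provider as needed.
# These keys mirror the openai example in this guide and are assumptions,
# not an authoritative schema.
REQUIRED_KEYS = {
    "openai": {"provider", "openai_api_key", "model_name"},
}

def validate_configuration(config):
    """Raise ValueError if the configuration is missing required keys."""
    provider = config.get("provider")
    required = REQUIRED_KEYS.get(provider)
    if required is None:
        raise ValueError(f"Unknown provider: {provider!r}")
    missing = required - config.keys()
    if missing:
        raise ValueError(f"Missing keys for {provider}: {sorted(missing)}")
    return True
```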

Response Fields

  • llmProvider: The newly created LLM provider object, containing:
    • id: The unique ID for the new provider configuration.
    • name: The name assigned to the provider.
    • configuration: The configuration details stored for the provider.

Example Response

{
  "data": {
    "createLlmProvider": {
      "llmProvider": {
        "id": "27",
        "name": "OpenAI GPT-4",
        "configuration": {
          "provider": "openai",
          "openai_api_key": "sk-your-secret-key",
          "model_name": "gpt-4o"
        }
      }
    }
  }
}
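GraphQL reports failures in a top-level errors array rather than via HTTP status codes, so a client should check for errors before reading data. A minimal helper for unpacking the response above might look like this:

```python
def extract_provider(response_json):
    """Return the llmProvider object, or raise on a GraphQL error."""
    # GraphQL servers typically return HTTP 200 even on failure, putting
    # failure details in a top-level "errors" array instead.
    if response_json.get("errors"):
        messages = [e.get("message", "") for e in response_json["errors"]]
        raise RuntimeError("createLlmProvider failed: " + "; ".join(messages))
    return response_json["data"]["createLlmProvider"]["llmProvider"]
```

Applied to the example response, `extract_provider(...)` returns the object with `id`, `name`, and `configuration`; on a permissions failure it surfaces the server's message instead of a KeyError.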

Additional Notes

  • If the mutation fails due to invalid data or missing permissions, the response includes an errors array with a descriptive message.
  • The structure of the configuration JSON is critical and must match the requirements of the specific LLM provider you are integrating.
  • This mutation is the first step in setting up a new AI provider, which can later be set as the active provider for an assistant or organization.
  • For security, sensitive information like API keys is not exposed in the configuration object of the response.

This mutation allows you to securely register and store configurations for various LLM providers within your organization.