Update the name or configuration of an existing Large Language Model (LLM) provider.
Before You Begin
🔗 Use the GraphQL Playground to execute the mutations in this guide.
➡️ New to GraphQL? Learn how to navigate the Playground with our Playground Basics Guide.
Prerequisites
- Authentication: Use a Service Account token.
- Permissions: You must have permission to manage AI providers in the organization.
- Organization UUID: The UUID of the organization where the provider exists.
- LLM Provider ID: The ID of the provider configuration you wish to update. You can get this ID from the llmProvidersByOrganization query.
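If you need to look up the provider ID programmatically, a minimal Python sketch of the llmProvidersByOrganization query is shown below. The endpoint URL, the Bearer token header, and the query's argument signature (organizationUuid: String!) are assumptions; adjust them to match your deployment and schema.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- substitute your deployment's values.
GRAPHQL_URL = "https://example.com/graphql"
SERVICE_ACCOUNT_TOKEN = "your-service-account-token"


def build_providers_query(organization_uuid: str) -> dict:
    """Assemble the request body for the llmProvidersByOrganization query."""
    query = """
    query ($organizationUuid: String!) {
      llmProvidersByOrganization(organizationUuid: $organizationUuid) {
        id
        name
      }
    }
    """
    return {"query": query, "variables": {"organizationUuid": organization_uuid}}


def fetch_provider_ids(organization_uuid: str) -> list:
    """POST the query and return the list of provider IDs."""
    payload = build_providers_query(organization_uuid)
    req = urllib.request.Request(
        GRAPHQL_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {SERVICE_ACCOUNT_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [p["id"] for p in body["data"]["llmProvidersByOrganization"]]
```

The returned id values are what you pass to the update mutation in the next step.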
Step 1: Execute the Mutation
This mutation allows you to modify an existing LLM provider's details, such as its name or its configuration settings.
mutation {
  updateLlmProvider(
    input: {
      id: "27"
      organizationUuid: "org-uuid-example"
      name: "OpenAI GPT-4o (Updated)"
      configuration: {
        provider: "openai"
        openai_api_key: "sk-your-new-secret-key"
        model_name: "gpt-4o"
      }
    }
  ) {
    llmProvider {
      id
      name
      configuration
    }
  }
}
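The mutation above can also be sent from client code using GraphQL variables. The Python sketch below makes some assumptions: the endpoint URL, the Bearer token header, and the input type name UpdateLlmProviderInput are placeholders to adapt to your schema.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- substitute your deployment's values.
GRAPHQL_URL = "https://example.com/graphql"
SERVICE_ACCOUNT_TOKEN = "your-service-account-token"

# Variable-based form of the mutation shown above; the input type name
# `UpdateLlmProviderInput` is an assumption about the schema.
UPDATE_MUTATION = """
mutation ($input: UpdateLlmProviderInput!) {
  updateLlmProvider(input: $input) {
    llmProvider { id name configuration }
  }
}
"""


def build_update_payload(provider_id: str, organization_uuid: str,
                         name: str, configuration: dict) -> dict:
    """Assemble the GraphQL request body for updateLlmProvider."""
    return {
        "query": UPDATE_MUTATION,
        "variables": {
            "input": {
                "id": provider_id,
                "organizationUuid": organization_uuid,
                "name": name,
                "configuration": configuration,
            }
        },
    }


def run_update(payload: dict) -> dict:
    """POST the payload to the GraphQL endpoint and return the parsed response."""
    req = urllib.request.Request(
        GRAPHQL_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {SERVICE_ACCOUNT_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Using variables keeps secrets like the API key out of the query string itself, which is easier to log safely.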
Arguments
- id (ID, required): The ID of the LLM provider configuration to update.
- organizationUuid (String, required): The UUID of the target organization.
- configuration (JSON, required): An object containing the provider's new settings. All required configuration keys must be provided.
- name (String, optional): The new descriptive name for the provider configuration.
Response Fields
- llmProvider: The updated LLM provider object, containing:
  - id: The unique ID of the provider configuration.
  - name: The newly updated name.
  - configuration: The updated configuration details.
Example Response
{
  "data": {
    "updateLlmProvider": {
      "llmProvider": {
        "id": "27",
        "name": "OpenAI GPT-4o (Updated)",
        "configuration": {
          "provider": "openai",
          "model_name": "gpt-4o"
        }
      }
    }
  }
}
Additional Notes
- If the mutation fails due to an invalid id, incorrect organizationUuid, or lack of permissions, an error will be returned.
- When updating the configuration, the entire configuration object must be passed; the system does not perform a partial merge.
- For security, sensitive fields like openai_api_key are accepted in the input but will not be returned in the response.
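Because the configuration is replaced wholesale rather than merged, a common pattern is to fetch the current configuration, apply your changes locally, and submit the complete object. A minimal sketch of that merge step (function and variable names here are illustrative, not part of the API):

```python
def merge_configuration(current: dict, changes: dict) -> dict:
    """Return a full configuration with `changes` applied on top of `current`."""
    merged = dict(current)
    merged.update(changes)
    return merged


# Example: configuration as returned by the query (secrets are omitted by the API).
current = {"provider": "openai", "model_name": "gpt-4o"}
changes = {"model_name": "gpt-4o-mini"}

full = merge_configuration(current, changes)
# Secrets are never returned in responses, so re-supply them before updating.
full["openai_api_key"] = "sk-your-new-secret-key"
```

Note that the fetched configuration will not include sensitive fields such as openai_api_key, so they must be re-supplied on every update.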
This mutation provides a straightforward way to modify and maintain your LLM provider configurations.
