As GoHighLevel continues expanding its artificial intelligence capabilities (Conversation AI, Voice AI, and AI Agents), there is a clear need for greater flexibility in choosing the language models that power these features.
Currently, users are limited to the LLM providers that GoHighLevel chooses to integrate directly. However, the AI ecosystem evolves extremely fast, with new models appearing constantly. Agencies and developers need the ability to choose the model that best fits their use case, cost structure, performance requirements, or language support.
It would be very valuable to enable a Bring Your Own LLM (BYO-LLM) approach, allowing AI Agents to connect to any external language model via API.
This could be configured easily within the AI Agent settings by allowing users to enter:
* API endpoint
* API key / secret key
* Model name
* Basic parameters such as temperature or max tokens
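As a sketch of how these settings might fit together (all names here are hypothetical, not an existing GoHighLevel API), the fields above could map to a small config object that builds a provider-agnostic chat request in the OpenAI-style format most providers accept:

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Hypothetical BYO-LLM settings a user would enter in the AI Agent UI."""
    endpoint: str            # e.g. "https://api.openai.com/v1/chat/completions"
    api_key: str             # secret key for the chosen provider
    model: str               # e.g. "gpt-4o" or "anthropic/claude-3.5-sonnet"
    temperature: float = 0.7
    max_tokens: int = 1024

def build_chat_request(cfg: LLMConfig, messages: list[dict]) -> dict:
    """Build an OpenAI-style chat-completion payload from the user's config."""
    return {
        "model": cfg.model,
        "messages": messages,
        "temperature": cfg.temperature,
        "max_tokens": cfg.max_tokens,
    }
```

Because the payload is derived entirely from the stored config, swapping providers would only mean editing these settings, not the agent logic.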
This should support two types of connections:
  1. Direct connections to LLM providers, for example: OpenAI, Anthropic (Claude), Google Gemini, xAI (Grok), Mistral, DeepSeek, or self-hosted open-source models.
  2. Connections through aggregators such as OpenRouter, allowing access to multiple models through a single endpoint.
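Both connection types could share one code path, since many direct providers and aggregators such as OpenRouter expose an OpenAI-compatible `/chat/completions` endpoint with bearer-token auth. A minimal sketch (function name and structure are illustrative, not an existing API):

```python
def request_target(endpoint: str, api_key: str) -> tuple[str, dict]:
    """Return the URL and headers for an OpenAI-compatible chat endpoint.

    Works the same whether `endpoint` points at a direct provider or an
    aggregator, because both accept the same path and auth scheme.
    """
    url = endpoint.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return url, headers
```

Switching from a direct provider to an aggregator would then be a matter of changing the base URL (e.g. `https://api.openai.com/v1` to `https://openrouter.ai/api/v1`) and the model name, with no other changes.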
Key benefits include:
* Flexibility to choose the best model depending on the use case.
* Cost optimization by routing usage to more efficient providers.
* Future-proof architecture that does not depend on specific integrations.
* Greater ability to build advanced AI workflows by connecting external AI stacks or agents.
In practice, this would turn GoHighLevel into an AI-agnostic platform where the CRM manages workflows and automation while the user decides which language model to use.