BYOK - bring your own keys
Matt Call
#### Overview
As GoHighLevel continues to evolve its AI capabilities—particularly with Agent Voice for conversational automation—there's a clear opportunity to empower users with greater control over their LLM providers. Currently, the built-in model options are solid but limited, which can constrain innovation, cost management, and compliance needs. Implementing Bring Your Own Key (BYOK) natively would allow users to integrate custom API keys from any LLM provider (e.g., OpenAI, Anthropic, Grok/xAI, or others) directly into Agent Voice workflows. This feature should support configuration at both the agency level (for centralized management) and sub-account level (for client-specific customizations).
#### Key Benefits
- Enhanced Flexibility: Users could leverage the latest or most cost-effective models without waiting for GHL to onboard new providers. For instance, switching to a specialized voice-optimized model like Grok's voice mode or a fine-tuned local setup could dramatically improve conversation quality and latency.
- Cost Optimization: Agencies and businesses could route traffic to the most economical providers based on usage patterns, reducing reliance on a single vendor's pricing.
- Privacy & Compliance: BYOK enables self-hosted or on-premise LLMs, ensuring sensitive data stays within your infrastructure—critical for industries like finance, healthcare, or legal.
- Future-Proofing: With the LLM landscape exploding (new models weekly), this keeps GHL at the forefront without constant backend updates.
#### Proposed Implementation
- Configuration Options:
  - Agency-Level: Global settings in the agency dashboard for shared keys/models across all sub-accounts, with override permissions.
  - Sub-Account-Level: Per-client toggles in the sub-account settings for tailored integrations.
  - Fallback Mechanism: Default to GHL's native models if a custom key fails or isn't configured.
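The precedence between these levels could resolve roughly as follows. This is a minimal Python sketch, not GHL internals: the function name, the dict-based config shape, and the `allow_subaccount_override` flag are all illustrative assumptions.

```python
# Hypothetical resolution of the effective BYOK config: a sub-account
# override wins (when the agency permits overrides), then the agency-level
# key, then GHL's native models as the fallback.
GHL_NATIVE_DEFAULT = {"provider": "ghl-native", "model": "default-voice"}

def resolve_llm_config(agency_config, subaccount_config):
    """Return the effective LLM config for an Agent Voice call."""
    agency_config = agency_config or {}
    allow_override = agency_config.get("allow_subaccount_override", True)
    if allow_override and subaccount_config and subaccount_config.get("api_key"):
        return subaccount_config        # per-client customization
    if agency_config.get("api_key"):
        return agency_config            # centralized agency key
    return GHL_NATIVE_DEFAULT           # nothing configured: native models
```

For example, an agency-wide OpenAI key with no sub-account override would resolve to the agency config, while a sub-account with its own Anthropic key would take precedence unless the agency disables overrides.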
- Supported Integrations:
  - API Key Upload: Simple key entry fields with validation (e.g., test prompt on save).
  - Local/SSH Models: Option to connect to self-hosted instances (e.g., Ollama, llama.cpp) over an SSH tunnel for zero-cloud dependency. Include secure tunneling and authentication guides.
  - Provider Presets: Pre-built templates for popular APIs (OpenAI, Anthropic, Cohere, etc.) to streamline setup.
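The "test prompt on save" validation could look like this sketch. The provider call is injected so any preset or self-hosted endpoint can be plugged in; `send_fn`, its signature, and the returned dict shape are assumptions, not an existing GHL or provider API.

```python
def validate_api_key(api_key, send_fn, test_prompt="Reply with OK."):
    """Fire a minimal test prompt through the provider and report the result.

    send_fn(api_key, prompt) is a thin wrapper over the provider's SDK or
    HTTP endpoint; it should return the text response or raise on failure.
    """
    if not api_key or not api_key.strip():
        return {"valid": False, "error": "empty key"}
    try:
        reply = send_fn(api_key, test_prompt)
    except Exception as exc:  # auth failure, quota exceeded, network error
        return {"valid": False, "error": str(exc)}
    return {"valid": bool(reply), "error": None}
```

Surfacing the `error` string in the settings UI would let users distinguish a bad key from a quota or network problem at save time.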
- Agent Voice Enhancements:
  - Model selection dropdown in voice workflow builders, pulling from BYOK configs.
  - Real-time monitoring: Usage stats, error logging, and A/B testing between models.
  - Voice-Specific Tweaks: Parameters for latency tuning, accent adaptation, or interruption handling per provider.
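The runtime side of the fallback mechanism, tied into the error logging above, could be sketched like this. All names are illustrative assumptions: the BYOK and native calls are injected functions, and the returned source tag is just one way to feed the usage stats.

```python
import logging

logger = logging.getLogger("agent_voice.byok")

def generate_reply(prompt, custom_fn=None, native_fn=None):
    """Try the configured BYOK provider first; fall back to GHL's native
    model on any error. Returns (reply, source) so usage stats can record
    which path actually served the turn.
    """
    if custom_fn is not None:
        try:
            return custom_fn(prompt), "byok"
        except Exception as exc:  # provider outage, bad key, timeout
            logger.warning("BYOK provider failed, falling back: %s", exc)
    return native_fn(prompt), "native"
```

Logging the fallback rather than raising keeps a live voice conversation flowing even when the custom provider is down, which is the point of the fallback mechanism above.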
#### Why This Matters Now
The current model selection is a great starting point, but it doesn't match the pace of AI innovation. Without BYOK, users are locked into a narrow ecosystem, forcing workarounds like external Zapier integrations or custom code—which add complexity and cost. This feature would position GHL as the most developer-friendly CRM on the market, attracting power users and agencies who demand sovereignty over their tech stack. I've seen similar implementations in tools like LangChain or Vercel AI SDK drive massive adoption; GHL could lead the charge in no-code AI.