GitHub Copilot bring your own key (BYOK) enhancements #184350
Replies: 7 comments
- Hoping to have this feature in the Teams plan in the future.
- Hope the BYOK feature can also be made available for individual accounts.
- Hi, this is a feature we could really benefit from. Unfortunately, we struggle to configure Anthropic via Microsoft Foundry: it looks like you are not recognizing that Anthropic with the Messages endpoint is configured.
- Still have not been able to make this feature work even once.
- It's not working for AWS Bedrock.
- Hi! I'm extremely interested in this capability, but there's a big piece missing that I'd like to see answered. I have configured a model pointed at my OpenAI-compatible API endpoint; however, the Copilot agent makes a call to api.github.com, which is fine, but I never see VS Code trying to access my model hosted in our datacenter directly. My org has no appetite to expose this to the whole internet, nor should we. What I need to know, from a networking perspective: if I don't want to expose my models to the entire internet, is there some allowlist that can be provided to make this work? Even if I were using Microsoft Foundry, I would have a network policy attached to the API so it isn't exposed to everything on the internet. I've looked at https://docs.github.com/en/copilot/reference/copilot-allowlist-reference, which shows what we'd need outbound from a user's PC, but I cannot find anything related to making custom models work. Many thanks!

Bring your own key (BYOK) for GitHub Copilot just got more powerful. Enterprises can now connect a wider range of LLM providers, unlock structured outputs, fine-tune context windows, and stream responses in real time—giving you complete control over your AI infrastructure.
What's new
New provider options
Connect API keys from AWS Bedrock, Google AI Studio, and any OpenAI-compatible provider. These join Anthropic, Microsoft Foundry, OpenAI, and xAI as supported BYOK choices—giving your organization maximum flexibility in choosing models that fit your stack.
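For context, an "OpenAI-compatible provider" is one that accepts the standard chat completions request shape. A minimal sketch of that payload, useful for sanity-checking a self-hosted endpoint before connecting it (the model name and prompt here are placeholders, not values from the announcement):

```python
import json

def build_chat_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the standard OpenAI-compatible /v1/chat/completions payload
    that an OpenAI-compatible BYOK provider is expected to accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

# Hypothetical payload for a self-hosted, OpenAI-compatible endpoint.
payload = build_chat_request("my-local-model", "Explain BYOK in one sentence.")
print(json.dumps(payload, indent=2))
```

If an endpoint rejects this shape, it likely is not OpenAI-compatible in the sense BYOK expects.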
Support for the Responses API
BYOK now supports models using the Responses API, unlocking structured outputs and richer multimodal interactions. This enables more sophisticated use cases and cleaner integrations with your workflows.
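To illustrate why the Responses API matters for structured outputs: a request can constrain the model's reply to a JSON schema. The sketch below only builds the request body; the field layout follows OpenAI's published Responses API convention, and the model name and schema are hypothetical examples, not part of the announcement:

```python
def build_structured_request(model: str, prompt: str, schema: dict, name: str) -> dict:
    """Sketch of a Responses API request body that constrains the
    model's output to a strict JSON schema."""
    return {
        "model": model,
        "input": prompt,
        "text": {
            "format": {
                "type": "json_schema",
                "name": name,
                "schema": schema,
                "strict": True,
            }
        },
    }

# Hypothetical schema: a code-review verdict with a bounded severity field.
review_schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "severity": {"type": "string", "enum": ["low", "medium", "high"]},
    },
    "required": ["summary", "severity"],
    "additionalProperties": False,
}

req = build_structured_request("my-byok-model", "Review this diff.", review_schema, "code_review")
```

A schema-constrained reply like this is what makes "cleaner integrations with your workflows" concrete: downstream tooling can parse the response without defensive string handling.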
Maximum context window configuration
Admins can now define a maximum context window for BYOK models, balancing cost, performance, and response quality based on your team's needs.
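To see what a maximum context window bounds in practice, here is a hypothetical sketch of the trimming a client must do when an admin caps the window: drop the oldest turns until the estimated token count fits. The 4-characters-per-token estimate is a common rough heuristic, not Copilot's actual tokenizer:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_to_window(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the most recent messages whose combined estimated token
    count fits within the configured maximum context window."""
    kept: list[dict] = []
    used = 0
    for msg in reversed(messages):       # walk newest first
        cost = estimate_tokens(msg["content"])
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = [
    {"role": "user", "content": "a" * 400},   # ~100 tokens, oldest
    {"role": "user", "content": "b" * 400},   # ~100 tokens
    {"role": "user", "content": "c" * 400},   # ~100 tokens, newest
]
trimmed = trim_to_window(history, max_tokens=250)  # oldest turn is dropped
```

A smaller cap trades context (and cost) for latency; a larger cap does the reverse, which is the balance the setting exposes.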
Streaming responses for faster interaction
Watch responses stream in real time as Copilot generates them, rather than waiting for full completion. This keeps your workflow smooth and responsive.
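Streaming responses from OpenAI-style APIs typically arrive as server-sent events, each `data:` line carrying a JSON delta until a `[DONE]` sentinel. A minimal parser sketch over canned data, following that common SSE convention (this is not Copilot's internal client):

```python
import json

def collect_stream(lines: list[str]) -> str:
    """Accumulate content deltas from OpenAI-style SSE 'data:' lines."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        body = line[len("data: "):]
        if body == "[DONE]":            # end-of-stream sentinel
            break
        chunk = json.loads(body)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            parts.append(delta)
    return "".join(parts)

# Canned events standing in for a live HTTP stream.
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = collect_stream(sample)
```

Rendering each delta as it arrives, rather than the joined string at the end, is what makes the response feel immediate.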
Start using the new BYOK features today
These capabilities are available now in public preview for GitHub Enterprise and Business customers. Head to your enterprise or organization settings, connect your LLM provider's API key, and start using your models in Copilot Chat and supported IDEs.
For detailed setup and configuration, visit our BYOK documentation.
Help us shape the future
We're just getting started, and your feedback will guide what comes next. Join the conversation below to share how BYOK is transforming your enterprise's Copilot experience.