GitHub recently announced that custom models are available for GitHub Copilot Enterprise users in limited public beta. Custom models are fine-tuned on your organization’s proprietary codebases and coding practices to provide more contextually relevant and consistent inline suggestions.
How It Works: Security and Privacy in Custom Models
The data used to train your organization’s custom model is never used to train another customer’s model, and your custom model is never shared. You can learn more about the fine-tuning process in GitHub’s announcement post and documentation.
C and C++ Case Study: Microsoft Office
Custom models can be particularly valuable for repositories that rely on internal APIs, specialized frameworks, or strict coding styles. As a case study, GitHub partnered with the Microsoft Office organization to fine-tune a custom model on Office’s C and C++ code. After switching to the custom model, Office developers observed code completions that were more consistent with their unique C++ dialect (e.g., macros, strict naming conventions), coding practices (e.g., use of feature gates), and specific needs.
“Code completion is much better than before. It seems to generate functions quite well, given arguments, return type and a comment explaining what the function does. I personally used it to write a block of authentication code where a pop-up would be shown only if silent login fails.” – Office developer
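To make that quoted scenario concrete, here is a minimal, hypothetical C++ sketch of the kind of completion described: a feature-gated authentication routine that attempts a silent login first and only shows a pop-up if that fails. All of the identifiers here (IsFeatureEnabled, TrySilentLogin, ShowLoginPrompt, Feature::ModernAuth) are invented for illustration and are not taken from the Office codebase or from Copilot output.

```cpp
// Hypothetical illustration only: these names are stand-ins, not Office APIs.
#include <iostream>

enum class Feature { ModernAuth };

// Stand-in for an internal feature-gate check.
bool IsFeatureEnabled(Feature) { return true; }

// Stand-in for an attempt to authenticate without user interaction.
bool TrySilentLogin() { return false; }

// Stand-in for an interactive sign-in dialog; returns true on success.
bool ShowLoginPrompt()
{
    std::cout << "Showing login prompt...\n";
    return true;
}

// The pattern from the quote: show the pop-up only if silent login fails.
bool AuthenticateUser()
{
    if (!IsFeatureEnabled(Feature::ModernAuth))
        return false;

    if (TrySilentLogin())
        return true;

    return ShowLoginPrompt();
}

int main()
{
    std::cout << (AuthenticateUser() ? "Authenticated\n"
                                     : "Authentication failed\n");
}
```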
Join the Waitlist
The code on which you want to train a custom model must be hosted in repositories owned by your organization on GitHub.com. If you meet this requirement and are interested in joining the waitlist, you can sign up here. Note that custom models for GitHub Copilot Enterprise are currently in limited public beta and subject to change. If you’re interested in learning more, check out the custom model documentation.
If you have feedback on Copilot for C++ in Visual Studio or VS Code, we’d love to hear from you. You can reach us via email (visualcpp@microsoft.com) or X (@VisualC).