OpenAI announced GPT-5 last week in “GPT-5 and the new era of work”. Working with OpenAI, we rolled out GPT-5 support the day it launched, so you can use it, build with it, and integrate it into your applications immediately.
We’ve got you covered with a dev-focused rundown: GPT-5 availability, new capabilities, and quick-start steps.
GPT-5 announcement recap
- Overview announcement: GPT-5 and the new era of work
- Developer deep dive: Introducing GPT-5 for developers
- Release notes: Introducing GPT-5
Top highlights from the announcement:
- Better reasoning and structured thinking, with improved accuracy and faster responses
- Stronger context recognition for complex, real-world workflows
- Unified capabilities across chat, agents, coding, multimodal, and advanced math
- Available now in ChatGPT and in the API
Where you can use GPT-5 today
We rolled out GPT-5 support across our developer products and services on launch day, and we're continuing to expand that coverage: this week, for example, GPT-5 mini became available in all Copilot plans.
Below is a roundup of GPT-5 integrations across Microsoft products and services, based on last week’s rollout.
GitHub Copilot
GitHub Copilot brings GPT‑5 into your editor and GitHub workflows for richer code suggestions and chat—especially on larger, multi‑file changes and refactors. Because it’s integrated into the tools you already use, you can explore GPT‑5’s capabilities without leaving your flow. Availability can vary by IDE and plan during preview.
- Enhanced coding capabilities with GPT-5 for longer and more complex tasks.
- Integrated into VS Code and GitHub workflows.
- VS Code changelog: GPT-5 mini in Copilot for VS Code
- VS Code release: GitHub Copilot in VS Code – July release (v1.103)
- Other IDEs: OpenAI GPT-5 is now available in public preview in Visual Studio, JetBrains IDEs, Xcode, and Eclipse
AI Toolkit in Visual Studio Code
Use the AI Toolkit in VS Code to experiment with GPT‑5 models: connect to GitHub Models or Azure AI Foundry, run playgrounds, and scaffold integrations in your workspace. It works with both cloud endpoints and OSS/local backends so you can prototype and ship from the same editor.
- GPT-5 models available via AI Toolkit for experimentation and integration.
- Supports GPT OSS and cloud/local development.
- Announcement / docs: GPT-5 Family of Models & GPT OSS Are Now Available in AI Toolkit for VS Code
Azure AI Foundry
Important
Access to gpt-5 in Azure AI Foundry requires registration (apply here). gpt-5-mini, gpt-5-nano, and gpt-5-chat do not require registration. GPT-5 models are currently available in East US 2 and Sweden Central (Global Standard & Data Zones); see Azure OpenAI models: GPT-5.
- GPT-5 models available with enterprise-grade security and model routing.
- Supports long-running agentic tasks with structured outputs and reasoning capabilities.
- Regional availability: GPT-5 models are currently available in East US 2 and Sweden Central (Global Standard & Data Zones). See Azure OpenAI models: GPT-5.
- Access: Registration required for gpt-5; gpt-5-mini, gpt-5-nano, and gpt-5-chat do not require registration.
- Full capabilities: Azure OpenAI reasoning models.
- Announcement / docs: Microsoft incorporates OpenAI’s GPT-5 into consumer, developer and enterprise offerings
GitHub Models
- Supports gpt-5, gpt-5-mini, gpt-5-nano, gpt-5-chat; see the Python sketch after this list for one way to call them from code.
- Marketplace: Models · GitHub Marketplace
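If you want to script against these models, GitHub Models exposes an OpenAI-compatible inference endpoint that you can call with a GitHub token. Below is a minimal Python sketch; the endpoint URL and the openai/gpt-5 model identifier are assumptions based on the GitHub Models documentation at the time of writing, so confirm both against the marketplace listing above.
# Minimal sketch: calling GPT-5 through GitHub Models with the OpenAI Python SDK.
# Assumptions: the inference endpoint and the "openai/gpt-5" model ID shown here
# follow the current GitHub Models docs; verify them in the marketplace listing.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://models.github.ai/inference",  # GitHub Models endpoint (assumed)
    api_key=os.environ["GITHUB_TOKEN"],             # GitHub token with models access
)

response = client.chat.completions.create(
    model="openai/gpt-5",  # model ID as listed in the marketplace (assumed)
    messages=[{"role": "user", "content": "Explain beta-reduction in lambda calculus."}],
)
print(response.choices[0].message.content)
The same pattern should work for gpt-5-mini, gpt-5-nano, and gpt-5-chat by swapping the model identifier.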
Microsoft Copilot Studio
- Makers can select GPT-5 models for agent orchestration.
- Supports GPT-5 Chat and GPT-5 Reasoning with auto-routing.
- Announcement / docs: Available today: GPT-5 in Microsoft Copilot Studio
Microsoft 365 Copilot
- GPT-5 powers Copilot Chat with smarter orchestration, improved reasoning, and multimodal capabilities.
- Users can opt in via the “Try GPT-5” button.
- Announcement / docs: Available today: GPT-5 in Microsoft 365 Copilot
OpenAI .NET SDK
- Official .NET library supports GPT-5 via the Responses API, including streaming and reasoning with configurable reasoning effort.
- Guide: How to use responses with streaming and reasoning
- NuGet: OpenAI
C# example: streaming with reasoning effort
using OpenAI.Responses;

OpenAIResponseClient client = new(
    model: "gpt-5",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")
);

await foreach (var update in client.CreateResponseStreamingAsync(
    userInputText: "Explain beta-reduction in lambda calculus.",
    new ResponseCreationOptions
    {
        ReasoningOptions = new ResponseReasoningOptions
        {
            ReasoningEffortLevel = ResponseReasoningEffortLevel.High,
        },
    }))
{
    if (update is StreamingResponseContentPartDeltaUpdate delta)
    {
        Console.Write(delta.Text);
    }
}
Python example: reasoning effort and verbosity
This sample demonstrates GPT-5’s controllable response behavior using the new reasoning_effort and verbosity parameters, allowing developers to fine-tune how deeply the model thinks and how much it says.
import os

import openai
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

client = openai.AzureOpenAI(
    api_version=os.environ["AZURE_OPENAI_VERSION"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token_provider=get_bearer_token_provider(
        DefaultAzureCredential(),
        "https://cognitiveservices.azure.com/.default",
    ),
)

response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    messages=[
        {"role": "user", "content": "Explain beta-reduction in lambda calculus."},
    ],
    reasoning_effort="minimal",
    verbosity="low",
)

print(response.choices[0].message.content)
JavaScript example: GPT-5 structured output (JSON schema)
This JavaScript sample highlights GPT-5’s ability to return structured, schema-conforming JSON using the new response_format parameter with a custom JSON Schema, making it easy to extract reasoning steps and final answers with full type safety. It also demonstrates integration with Azure AI Foundry endpoints for secure, production-grade deployments.
import { AzureOpenAI } from "openai";
import dotenv from "dotenv";

dotenv.config();

const endpoint = process.env.AZURE_INFERENCE_ENDPOINT; // Foundry project endpoint
const key = process.env.AZURE_INFERENCE_KEY; // API key
const deployment = process.env.AZURE_OPENAI_DEPLOYMENT || "gpt-5";

const client = new AzureOpenAI({
  endpoint,
  apiKey: key,
  apiVersion: "2025-01-01-preview",
  deployment,
});

const schema = {
  name: "math_explanation",
  schema: {
    type: "object",
    properties: {
      steps: { type: "array", items: { type: "string" } },
      answer: { type: "number" },
    },
    required: ["steps", "answer"],
    additionalProperties: false,
  },
  strict: true,
};

const result = await client.chat.completions.create({
  model: deployment,
  messages: [
    { role: "system", content: "Return JSON only." },
    { role: "user", content: "What is 23 * 7? Show your steps." },
  ],
  response_format: { type: "json_schema", json_schema: schema },
});

const content = result.choices[0].message?.content ?? "{}";
const data = JSON.parse(content);
console.log("Steps:", data.steps);
console.log("Answer:", data.answer);
Example output for "What is 23 * 7? Show your steps.":
Steps: [
'Break 23 into 20 and 3: (20 + 3) * 7',
'Multiply: 20 * 7 = 140',
'Multiply: 3 * 7 = 21',
'Add the results: 140 + 21 = 161'
]
Answer: 161
Python: RAG Chat Sample
Looking for a full sample? Check out the Azure Search + OpenAI RAG chat app in Python: RAG chat app with Azure OpenAI and Azure AI Search (Python). It includes a Python backend and React frontend, sample data, and quick start paths (Codespaces, Dev Containers, or azd) so you can deploy and test fast. The docs also cover evaluation and production hardening.
It’s updated for GPT‑5: switch the chat model and version via environment variables, then redeploy. See “Using different chat completion models” and “Using reasoning models” in the sample’s docs for exact settings.
Java: OpenAI SDK for Java
The OpenAI SDK for Java 3.0.0 release adds support for GPT-5 and new API features.
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.models.ChatModel;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;

// Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID` and `OPENAI_PROJECT_ID` environment variables
OpenAIClient client = OpenAIOkHttpClient.fromEnv();

ResponseCreateParams params = ResponseCreateParams.builder()
        .input("Say this is a test")
        .model(ChatModel.GPT_5)
        .build();
Response response = client.responses().create(params);
Evaluate and compare models
When you adopt GPT-5, validate quality and cost for your domain with side-by-side evaluation.
- Azure AI Foundry: Run playground comparisons and batch evaluations with the Azure AI Evaluation SDK. See Azure OpenAI reasoning models and the evaluation SDK: azure-ai-evaluation (GitHub); a minimal sketch follows this list.
- GitHub Models: Quickly try GPT-5 family variants (gpt-5, mini, nano, chat) in the marketplace and your IDE to test latency and output quality. See Models · GitHub Marketplace.
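To make the side-by-side comparison concrete, here is a minimal sketch of scoring a single response with the Azure AI Evaluation SDK. Treat it as a starting point under stated assumptions: the evaluator names and keyword arguments follow the azure-ai-evaluation docs at the time of writing, and the environment variables are placeholders for your own judge-model deployment.
# Minimal sketch: scoring one response with the Azure AI Evaluation SDK.
# Assumptions: evaluator names/arguments match the current azure-ai-evaluation release,
# and the environment variables below are placeholders for your own deployment details.
import os
from azure.ai.evaluation import GroundednessEvaluator, RelevanceEvaluator

model_config = {
    "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
    "api_key": os.environ["AZURE_OPENAI_API_KEY"],
    "azure_deployment": os.environ["AZURE_OPENAI_EVAL_DEPLOYMENT"],  # judge model deployment
}

query = "Explain beta-reduction in lambda calculus."
context = "Beta-reduction substitutes the argument for the bound variable in a lambda abstraction."
response = "Beta-reduction applies a lambda abstraction to an argument by substitution."

relevance = RelevanceEvaluator(model_config)
groundedness = GroundednessEvaluator(model_config)

print(relevance(query=query, response=response))
print(groundedness(query=query, context=context, response=response))
Run the same prompts through your current model and a GPT-5 deployment, score both sets, and compare before changing defaults.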
For .NET apps, when you evaluate or switch to GPT-5, keep quality checks consistent across unit tests and CI by using Microsoft.Extensions.AI Evaluation. It provides quality, safety, and NLP evaluators with caching and reporting to help you spot regressions before rollout. See The Microsoft.Extensions.AI.Evaluation libraries (Learn) and the blog Exploring Agent Quality and NLP evaluators.
Tip
Prefer structured outputs for easier, automated checks (schema validation), and evaluate with groundedness and relevance metrics on your own datasets before switching defaults.
What we’ve been building
We’ve been experimenting with GPT-5 across Microsoft and the open-source community. Here are two highlights from the past week:
Pamela Fox — Evaluating GPT-5 for RAG
- Blog: GPT-5: Will it RAG?
- A deep dive into evaluation setup and early takeaways for GPT-5 on retrieval-augmented generation.
Anthony Shaw — GitHub Models CLI updates
- Repo: tonybaloney/llm-github-models
- Updated to work with the GPT-5 family via GitHub Models, making it easy to test prompts and workflows from the command line.
Tony also wrote up a big blog post on Using an LLM in GitHub Actions. He explores how to integrate a large language model directly into GitHub Actions workflows, enabling automated reasoning, code review, and decision-making within CI pipelines. He walks through setting up a custom action that invokes an LLM via API, discusses practical use cases like summarizing PRs or validating commit messages, and highlights the potential for AI-assisted automation in developer tooling. It’s a hands-on look at bringing GPT-powered intelligence into everyday DevOps.
Burke Holland — Vibe coding with GPT-5 in VS Code
Burke tries some “vibe coding” with GPT-5 in Visual Studio Code: he has GPT-5 build a working website while he goes to eat dinner, and it does a solid job. The design looks nice, too. Watch on YouTube
Burke also tried an experiment where he asked GPT-5 to create an “aesthetically pleasing game” three times, with three different results.
Quick reference table
| Product / Service | GPT-5 Integration Highlights | Announcement / Documentation |
| --- | --- | --- |
| 💻 Visual Studio Code | GPT-5 via AI Toolkit; supports GPT OSS and cloud/local development. | GPT-5 Family of Models & GPT OSS Are Now Available in AI Toolkit for VS Code • GitHub Copilot in VS Code – July release (v1.103) • GPT-5 mini |
| 🤖 GitHub Copilot | Enhanced coding with GPT-5 for longer and more complex tasks; integrated into VS Code and GitHub. | GitHub Copilot in VS Code – July release (v1.103) • OpenAI GPT-5 is now available in public preview in Visual Studio, JetBrains IDEs, Xcode, and Eclipse • GPT-5 mini in Copilot for VS Code |
| ☁️ Azure AI Foundry | Enterprise-grade GPT-5 with security, model routing, and structured outputs. | Azure OpenAI models: GPT-5 • Registration for gpt-5 access • Reasoning models |
| 🧬 GitHub Models | gpt-5, gpt-5-mini, gpt-5-nano, gpt-5-chat. | Models · GitHub Marketplace |
| 🛠️ Microsoft Copilot Studio | Makers can select GPT-5 models for agent orchestration; GPT-5 Chat and Reasoning with auto-routing. | Available today: GPT-5 in Microsoft Copilot Studio |
| 📄 Microsoft 365 Copilot | GPT-5 powers Copilot Chat with smarter orchestration, improved reasoning, and multimodal capabilities. Users can opt in via “Try GPT-5”. | Available today: GPT-5 in Microsoft 365 Copilot |
| 🧪 Code Samples | Azure Search + OpenAI demo with GPT-5. | RAG chat app with Azure OpenAI and Azure AI Search (Python) |
Closing
Microsoft is all-in on AI. As new models land, we work to bring support to our developer tools and cloud quickly, as we did for GPT‑5, so you can build with the latest, confidently and at scale. You’ve got plenty of links above to get you started, but the first step is to start using GPT-5 in Visual Studio and Visual Studio Code.