June 20th, 2025

o-series Updates: New o3 pricing and o3-pro in Azure AI Foundry

We are excited to announce the availability of o3-pro, the newest model in Azure AI Foundry, via Azure OpenAI. o3-pro combines more compute with long context and multimodal input to deliver consistently better answers for complex tasks.

o3-pro is available via the Responses API, which enables advanced features like multi-turn interactions and future extensibility. Because the model is designed to tackle tough problems, some requests may take longer to complete. For long-running tasks, we recommend using background mode to avoid timeouts.
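As a minimal illustration, the sketch below shows one way background mode could look with the Python SDK against the v1 preview endpoint; it assumes the Responses API's background parameter and status polling via responses.retrieve, and the resource name and prompt are placeholders.

import time

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",  # placeholder resource
    azure_ad_token_provider=token_provider,
    api_version="preview",
)

# Submit a long-running request in background mode so the call returns immediately.
response = client.responses.create(
    model="o3-pro",
    input="Produce a detailed competitive analysis of the EV charging market.",  # placeholder prompt
    background=True,
)

# Poll until the response reaches a terminal state instead of holding a connection open.
while response.status in ("queued", "in_progress"):
    time.sleep(5)
    response = client.responses.retrieve(response.id)

print(response.status)
print(response.output_text)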

Key features

  • Highest reasoning performance: Uses more compute to deliver deeper, more consistent answers across complex tasks.
  • Multimodal input: Accepts both text and image inputs for broader use case coverage (see the sketch after this list).
  • Structured outputs: Supports structured data generation and function calling (a sketch follows the Python example under Get started).
  • Tool integration: Compatible with the File Search tool, with more support coming.
  • Enterprise-ready: Backed by Azure's enterprise promises and Responsible AI best practices.
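
As an example of the multimodal input noted above, here is a minimal sketch that pairs a text instruction with an image in a single request; it assumes the Responses API's input_text/input_image content parts, and the resource name and image URL are placeholders.

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",  # placeholder resource
    azure_ad_token_provider=token_provider,
    api_version="preview",
)

# Combine a text instruction with an image in a single request.
response = client.responses.create(
    model="o3-pro",
    input=[
        {
            "role": "user",
            "content": [
                {"type": "input_text", "text": "Summarize the architecture shown in this diagram."},
                {"type": "input_image", "image_url": "https://example.com/diagram.png"},  # placeholder image
            ],
        }
    ],
)

print(response.output_text)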

Get access

Customers can request access to o3-pro at https://aka.ms/oai/o3proaccess. It is available in East US 2 and Sweden Central and supports the Global Standard deployment type. To build with o3-pro and other new features, use the new v1 preview API.

Get started

Upgrade to the latest version of the OpenAI Python library.

pip install --upgrade openai

Use the REST API to submit a test prompt to o3-pro.

curl -X POST "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/responses?api-version=preview" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{
    "model": "o3-pro",
    "input": "This is a test"
  }'

Use the Python SDK to generate a text response.

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

# Use Microsoft Entra ID (keyless) authentication for the Azure OpenAI resource.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

# Target the v1 preview API, which exposes the Responses API.
client = AzureOpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    azure_ad_token_provider=token_provider,
    api_version="preview",
)

# Submit a simple text prompt to o3-pro via the Responses API.
response = client.responses.create(
    model="o3-pro",
    input="This is a test",
)

print(response.output_text)
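
As a sketch of the structured outputs and function calling noted under Key features, the snippet below declares a hypothetical get_weather tool and reads back any function_call items the model emits; the tool, its schema, and the prompt are illustrative, and the flat function-tool format of the Responses API is assumed.

import json

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",  # placeholder resource
    azure_ad_token_provider=token_provider,
    api_version="preview",
)

# Declare a hypothetical function tool; the schema below is illustrative only.
tools = [
    {
        "type": "function",
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

response = client.responses.create(
    model="o3-pro",
    input="What is the weather in Stockholm right now?",  # placeholder prompt
    tools=tools,
)

# Inspect any function calls the model decided to make and parse their JSON arguments.
for item in response.output:
    if item.type == "function_call":
        print(item.name, json.loads(item.arguments))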

Availability and pricing

o3-pro is available starting June 19, 2025, in the Azure OpenAI Service. It is accessible via the Responses API only.

Pricing is as follows:

Pricing of o3-pro vs. o3

Model     Input ($/million tokens)     Output ($/million tokens)
o3-pro    $20                          $80
o3*       $2                           $8

*Effective June 1, 2025, Azure OpenAI adjusted pricing for the o3 model to $2 per million input tokens and $8 per million output tokens. The new pricing will be reflected in customers’ July invoices.

Unlocking enterprise reasoning at scale

o3-pro joins a growing set of Foundry models in Azure OpenAI Service, offering customers more choice and control over how they deploy advanced AI. Foundry models are designed for scale, performance, and enterprise readiness—enabling organizations to build intelligent applications with confidence.

With o3-pro, enterprises can tackle their most complex reasoning challenges using a model that’s optimized for depth, precision, and multimodal understanding. We look forward to seeing what you’ll build with o3-pro. Sign up on Azure AI Foundry today.

Authors

Sanjeev Jagtap
Principal Lead
Anthony Mocny
Sr Product Marketing Manager
