OpenAI o3 Just Got 80% Cheaper — Plus, Meet o3-pro: A New Era of Affordable, Powerful AI

6/22/2025 · 2 min read

Big news from OpenAI: if you’re building with large language models (or planning to), AI just got a whole lot cheaper.
On June 21, 2025, OpenAI announced an 80% price reduction for its o3 model — plus the release of o3-pro, a beefed-up, reliable version for solving tougher problems.

So, what does this mean for developers, startups, and the future of AI? Let’s break it down in plain English.

What Is o3?

o3 is one of OpenAI’s newest frontier models. Think of it as the powerful engine behind many advanced AI features:

  • Coding help

  • Tool calling (for integrating with external APIs; a quick sketch follows this list)

  • Instruction following (like giving multi-step commands)
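
To make “tool calling” concrete, here is a minimal sketch using the official OpenAI Python SDK and its standard function-calling format. The get_weather tool and its schema are invented for illustration, so treat the exact request shape as something to double-check against OpenAI’s docs rather than a verbatim recipe.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # A made-up "get_weather" tool, described in the JSON Schema format
    # the Chat Completions API expects for function calling.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="o3",
        messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
        tools=tools,
    )

    # If the model chose to call the tool, the call (name plus JSON arguments)
    # shows up here instead of a plain text answer; your code runs the function
    # and sends the result back in a follow-up message.
    print(response.choices[0].message.tool_calls)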

Same model. Much cheaper.
Previously, running o3 could get expensive. But with the new pricing:

  • $2 per 1 million input tokens

  • $8 per 1 million output tokens

That’s cheaper than GPT-4o and the same cost as GPT-4.1, making cutting-edge AI more accessible than ever before.

Why Did They Drop the Price?

OpenAI says they’ve optimized their inference stack (basically, the servers and software that generate responses) to make things faster and cheaper. It’s the same o3 model — no features removed — just better pricing.

For developers and businesses that rely on AI for production, automation, or content generation, this means more scale with less budget.

Introducing o3-pro: Smarter, More Reliable AI for Hard Problems

While o3 is cheaper and faster, OpenAI also introduced o3-pro, a high-performance version of o3 designed for advanced tasks like:

  • Tackling challenging problems

  • Giving longer, more thoughtful responses

  • Supporting image inputs, function calling, and structured outputs

o3-pro Pricing:

  • $20 per 1 million input tokens

  • $80 per 1 million output tokens

That’s 87% cheaper than the old o1-pro model, making it usable in real-world projects without blowing up costs.
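For the curious, the 87% figure lines up with o1-pro’s old list price of $150 per 1 million input tokens and $600 per 1 million output tokens; those o1-pro numbers are recalled from OpenAI’s earlier pricing rather than stated in this announcement, so verify them before quoting.

    # Sanity check of the "87% cheaper" claim. The o1-pro prices below
    # ($150 input / $600 output per 1M tokens) are assumed from OpenAI's
    # earlier pricing and are not stated in this post.
    o1_pro = {"input": 150.00, "output": 600.00}
    o3_pro = {"input": 20.00, "output": 80.00}

    for kind in ("input", "output"):
        saving = 1 - o3_pro[kind] / o1_pro[kind]
        print(f"{kind}: {saving:.0%} cheaper")  # roughly 87% on both sides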

Pro tip: Since o3-pro takes more time to “think,” OpenAI recommends using the background mode in the Responses API to avoid timeouts for heavy tasks.
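Here is what that looks like in practice: a minimal sketch of background mode with the Responses API in the OpenAI Python SDK. The parameter and status names used below (background=True, queued, in_progress) follow OpenAI’s documented polling pattern, but check the current docs before relying on them.

    import time
    from openai import OpenAI

    client = OpenAI()

    # Kick off a long-running o3-pro request in background mode so the
    # HTTP connection isn't held open while the model "thinks".
    job = client.responses.create(
        model="o3-pro",
        input="Draft a detailed migration plan for moving a monolith to microservices.",
        background=True,
    )

    # Poll until the request finishes, then read the result.
    while job.status in ("queued", "in_progress"):
        time.sleep(5)
        job = client.responses.retrieve(job.id)

    print(job.output_text)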

o3 vs o3-pro: Which One Should You Use?

Short answer:

  • Use o3 for everyday workloads, coding, tool use, and fast API responses.

  • Use o3-pro for complex reasoning, detailed research, or when accuracy matters more than speed.

Why This Matters for Developers and Startups

This price drop is huge for:

  • Developers building apps that rely on LLMs

  • Startups looking to scale AI products affordably

  • Researchers who need powerful models but have been limited by cost

With OpenAI lowering prices while increasing performance, it’s clear they’re pushing aggressively to remain competitive against Anthropic (Claude), Google (Gemini), and others in the LLM space.

Final Thought: Cheaper, Smarter, Faster AI Is Here

The future of AI isn’t just about making models bigger — it’s about making them accessible.
With o3’s 80% price cut and o3-pro’s launch, OpenAI is setting a new standard for affordable, powerful artificial intelligence.