OpenAI Releases Two Open-Source AI Models That Perform on Par With o3, o3-Mini

OpenAI's new open-source models, gpt-oss-120b and gpt-oss-20b, feature chain-of-thought reasoning and tool use.

Written by Akash Dutta, Edited by David Delima | Updated: 6 August 2025 13:56 IST
Highlights
  • OpenAI says the larger AI model can run on a single 80GB GPU
  • The gpt-oss-20b can run on just 16GB of RAM
  • OpenAI says both models are built on a mixture-of-experts architecture

Both models are available with the permissive Apache 2.0 licence for academic and commercial usage


OpenAI released two open-source artificial intelligence (AI) models on Tuesday. This marks the San Francisco-based AI firm's first contribution to the open community since 2019, when GPT-2 was open-sourced. The two new models, dubbed gpt-oss-120b and gpt-oss-20b, are said to offer comparable performance to the o3 and o3-mini models. Built on a mixture-of-experts (MoE) architecture, the models have undergone rigorous safety training and evaluation, according to the company. The open weights of both models are available to download via Hugging Face.

OpenAI's Open-Source AI Models Support Native Reasoning

In a post on X (formerly Twitter), OpenAI CEO Sam Altman announced the release of these models, highlighting that “gpt-oss-120b performs about as well as o3 on challenging health issues.” Notably, both models are currently hosted on OpenAI's Hugging Face listing, and interested users can download the open weights and run them locally.
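For readers who want to try this, the minimal sketch below shows one way to run the smaller model locally. It assumes the Hugging Face Transformers library and uses the repository id openai/gpt-oss-20b from OpenAI's listing; the loading options shown are illustrative rather than an official recipe, and the roughly 16GB memory requirement for the 20b model comes from OpenAI's own guidance.

```python
# Minimal sketch: downloading the open weights from Hugging Face and running
# a local prompt with the Transformers library. The repo id is taken from
# OpenAI's Hugging Face listing; everything else is an illustrative default.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",   # smaller model; gpt-oss-120b needs a single 80GB GPU
    torch_dtype="auto",
    device_map="auto",            # places weights on a GPU if one is available
)

messages = [
    {"role": "user", "content": "Explain mixture-of-experts in two sentences."},
]

output = generator(messages, max_new_tokens=256)
print(output[0]["generated_text"])
```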

On its website, OpenAI explains that these models are compatible with the company's Responses application programming interface (API) and can be used in agentic workflows. The models also support tool use, such as web search or Python code execution. With native reasoning, they display a transparent chain-of-thought (CoT), which can be adjusted to prioritise either higher-quality responses or lower-latency output.
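As a rough illustration of that compatibility, the hedged sketch below calls a gpt-oss model through the Responses API shape using the official OpenAI Python SDK. The local endpoint URL and the assumption that a self-hosted runtime exposes this API are illustrative, not from OpenAI's announcement; the reasoning-effort setting stands in for the adjustable chain-of-thought described above.

```python
# Hedged sketch: calling a locally hosted gpt-oss model through the
# Responses API shape that the article says these models are compatible with.
# The base_url assumes an OpenAI-compatible server is already running locally
# (the runtime and port are hypothetical).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint
    api_key="not-needed-locally",
)

response = client.responses.create(
    model="gpt-oss-20b",
    input="Summarise the trade-off between response quality and latency.",
    reasoning={"effort": "low"},  # nudge the chain-of-thought toward low-latency output
)

print(response.output_text)
```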


In terms of architecture, both models use a mixture-of-experts design to reduce the number of active parameters and improve processing efficiency. The gpt-oss-120b activates 5.1 billion parameters per token, while gpt-oss-20b activates 3.6 billion parameters per token. The former has a total of 117 billion parameters and the latter 21 billion. Both models support a context length of 128,000 tokens.
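To make the active-versus-total parameter distinction concrete, here is a toy mixture-of-experts layer in which a router sends each token to only a couple of experts, so only a fraction of the layer's parameters are used per token. This is a generic illustration, not OpenAI's actual gpt-oss implementation; the expert counts and sizes are made up.

```python
# Toy mixture-of-experts layer: a router picks the top-k experts per token,
# so only those experts' parameters are "active" for that token.
import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, num_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
            )
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                             # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = ToyMoELayer()
tokens = torch.randn(5, 64)
print(layer(tokens).shape)  # torch.Size([5, 64])
```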

These open-source AI models were trained on a mostly English-language text dataset. The company focused on Science, Technology, Engineering, and Mathematics (STEM) fields, coding, and general knowledge. In the post-training stage, OpenAI used reinforcement learning (RL)-based fine-tuning.

Benchmark performance of the open-source OpenAI models (Photo Credit: OpenAI)


Based on the company's internal testing, gpt-oss-120b outperforms o3-mini on competition coding (Codeforces), general problem solving (MMLU and Humanity's Last Exam), and tool calling (TauBench). However, both models marginally fall short of o3 and o3-mini on other benchmarks such as GPQA Diamond.


OpenAI highlights that these models have undergone intensive safety training. In the pre-training stage, the company filtered out harmful data relating to chemical, biological, radiological, and nuclear (CBRN) threats. The AI firm also said it used specific techniques to ensure the models refuse unsafe prompts and are protected against prompt injections.

Despite the models being open-source, OpenAI claims they have been trained in a way that prevents bad actors from fine-tuning them to produce harmful outputs.

