OpenAI Releases Two Open-Source AI Models That Perform on Par With o3, o3-Mini

OpenAI's new open-source models, gpt-oss-120b and gpt-oss-20b, feature chain-of-thought reasoning and tool use.

Written by Akash Dutta, Edited by David Delima | Updated: 6 August 2025 13:56 IST
Highlights
  • OpenAI says the larger AI model can run on a single 80GB GPU
  • The gpt-oss-20b can run on just 16GB of RAM
  • OpenAI says both models are built on a mixture-of-experts architecture

Both models are available under the permissive Apache 2.0 licence for academic and commercial use

Photo Credit: Pexels/Matheus Bertelli

OpenAI released two open-source artificial intelligence (AI) models on Tuesday, marking the San Francisco-based AI firm's first contribution to the open community since 2019, when GPT-2 was open sourced. The two new models, dubbed gpt-oss-120b and gpt-oss-20b, are said to offer performance comparable to the o3 and o3-mini models. The company says the models, which are built on a mixture-of-experts (MoE) architecture, have undergone rigorous safety training and evaluation. The open weights of both models are available to download via Hugging Face.
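
For readers who want to grab the weights, here is a minimal sketch in Python using the huggingface_hub library; the openai/gpt-oss-20b repository id matches OpenAI's public Hugging Face listing, but the download location and available disk space are assumptions on the reader's side.

```python
from huggingface_hub import snapshot_download

# Fetch the open weights of the smaller gpt-oss model from Hugging Face.
# The checkpoint is several gigabytes, so ensure there is enough disk space.
local_dir = snapshot_download(repo_id="openai/gpt-oss-20b")
print(f"Weights downloaded to: {local_dir}")
```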

OpenAI's Open-Source AI Models Support Native Reasoning

In a post on X (formerly Twitter), OpenAI CEO Sam Altman announced the release of these models, highlighting that “gpt-oss-120b performs about as well as o3 on challenging health issues.” Notably, both models are currently hosted on OpenAI's Hugging Face listing, and interested users can download the open weights and run the models locally.

On its website, OpenAI explains that these models are compatible with the company's Responses application programming interface (API) and can be used in agentic workflows. The models also support tool use, such as web search and Python code execution. With native reasoning, they display a transparent chain-of-thought (CoT), and their reasoning effort can be adjusted to favour either higher-quality responses or lower-latency output.
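
As a rough illustration of what the adjustable reasoning effort could look like in practice, the sketch below uses the official openai Python client pointed at a locally hosted, OpenAI-compatible server (for example, one started with vLLM or Ollama). The localhost URL, the placeholder API key, and the system-message style of setting reasoning effort are assumptions for illustration rather than details confirmed in this article.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local, OpenAI-compatible server that
# is serving gpt-oss-20b. The URL and key below are placeholders.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[
        # The gpt-oss model card reportedly lets reasoning effort be set via
        # the system message; "high" trades latency for answer quality.
        {"role": "system", "content": "Reasoning: high"},
        {"role": "user", "content": "Summarise the mixture-of-experts idea."},
    ],
)
print(response.choices[0].message.content)
```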

Coming to the architecture, both models use an MoE design that reduces the number of active parameters for processing efficiency. The gpt-oss-120b activates 5.1 billion parameters per token, while gpt-oss-20b activates 3.6 billion parameters per token. The former has a total of 117 billion parameters and the latter 21 billion parameters. Both models support a context length of 1,28,000 tokens.
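
To give a concrete sense of what running the smaller model locally might involve, here is a minimal sketch using the Hugging Face transformers text-generation pipeline. It assumes a recent transformers release and roughly 16GB of GPU memory, in line with OpenAI's stated requirement for gpt-oss-20b; the prompt is just an example.

```python
from transformers import pipeline

# Load gpt-oss-20b from its Hugging Face repository. device_map="auto" places
# the weights on an available GPU and torch_dtype="auto" keeps the checkpoint's
# native precision.
generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",
    device_map="auto",
)

# The pipeline applies the model's chat template; the last message in the
# returned conversation is the model's reply.
messages = [{"role": "user", "content": "Explain chain-of-thought reasoning briefly."}]
output = generator(messages, max_new_tokens=256)
print(output[0]["generated_text"][-1]["content"])
```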

These open-source AI models were trained mostly on an English-language text dataset, with a focus on science, technology, engineering, and mathematics (STEM) fields, coding, and general knowledge. In the post-training stage, OpenAI used reinforcement learning (RL)-based fine-tuning.

Benchmark performance of the open-source OpenAI models
Photo Credit: OpenAI

Based on the company's internal testing, gpt-oss-120b outperforms o3-mini on competition coding (Codeforces), general problem solving (MMLU and Humanity's Last Exam), and tool calling (TauBench). In general, though, the models marginally fall short of o3 and o3-mini on other benchmarks such as GPQA Diamond.

OpenAI highlights that these models have undergone intensive safety training. In the pre-training stage, the company filtered out harmful data relating to chemical, biological, radiological, and nuclear (CBRN) threats. The AI firm also said it used specific techniques to ensure the models refuse unsafe prompts and are protected from prompt injections.

Despite the models being open source, OpenAI claims they have been trained in a way that prevents bad actors from fine-tuning them to produce harmful outputs.