Gemini 2.5 Pro Experimental Will Now Power the Agentic Deep Research Feature

Before this, the Deep Research tool was powered by the Gemini 2.0 Flash Thinking (experimental) AI model.

Written by Akash Dutta, Edited by Siddharth Suvarna | Updated: 9 April 2025 17:06 IST
Highlights
  • Deep Research with Gemini 2.5 Pro is available to paid users
  • Free users will continue to get it with Gemini 2.0 Flash Thinking
  • Google says the new AI model offers improvements in reasoning

Gemini Advanced users can select the 2.5 Pro Experimental-powered Deep Research from the model picker

Photo Credit: Google

Google is expanding its Gemini 2.5 Pro Experimental artificial intelligence (AI) model to other tools. On Tuesday, the Mountain View-based tech giant announced that its latest Gemini model will now power the agentic Deep Research tool. The upgrade is currently limited to paid subscribers of the AI platform, while free users will continue to use the AI agent with the Gemini 2.0 Flash Thinking (experimental) model. Google says the latest large language model (LLM) brings noticeable improvements to the tool's analytical reasoning capability.

Gemini 2.5 Pro Will Now Power Deep Research

In a blog post, the tech giant announced the expansion of its latest foundation model to Deep Research. The Gemini 2.5 Pro Experimental AI model was introduced last month with several improvements. Gemini 2.5 is also the first Gemini series to feature integrated reasoning, meaning all AI models in the 2.5 family are natively “Thinking” models. This makes integrating the model into Deep Research a seamless process.

Google claimed that it conducted an internal test of Deep Research with Gemini 2.5 Pro Experimental, and raters preferred its generated reports over those from competitors. In particular, the company said testers noticed marked improvements in analytical reasoning, information synthesis, and the insightfulness of the generated research reports.


Currently, Deep Research with Gemini 2.5 Pro Experimental is only available to Gemini Advanced subscribers. Users with the subscription can access the AI agent across the web as well as Android and iOS apps. Meanwhile, those on the free tier of Gemini can continue to use Deep Research with the Gemini 2.0 Flash Thinking (experimental) model.


Notably, Google first released its agentic Deep Research feature in December 2024. Initially, it was only available to Gemini Advanced users; in 2025, the AI agent was expanded to all users.

Deep Research can create multi-step research plans, run web searches, and collect information on the topic and related fields. It then analyses the gathered data, compiles it into a detailed report, and presents that report to the user.

 

