OpenAI Alleges Its AI Models Were Used to Build DeepSeek-R1: Report

OpenAI reportedly claimed that it had seen evidence of distillation of its AI models, which it suspected to be from DeepSeek.

Written by Akash Dutta, Edited by Siddharth Suvarna | Updated: 29 January 2025 19:27 IST
Highlights
  • OpenAI’s terms of service forbid using outputs to develop new AI models
  • DeepSeek-R1 is an open-source reasoning-focused AI model
  • The distillation was reportedly done using the OpenAI APIs

AI model distillation is a technique used to transfer knowledge from a larger model to a smaller model

Photo Credit: Reuters

OpenAI has reportedly claimed that DeepSeek might have distilled its artificial intelligence (AI) models to build the R1 model. As per the report, the San Francisco-based AI firm stated that it has evidence that some users were using its AI models' outputs for a competitor, which is suspected to be DeepSeek. Notably, the Chinese company released the open-source DeepSeek-R1 AI model last week and hosted it on GitHub and Hugging Face. The reasoning-focused model surpassed the capabilities of the ChatGPT-maker's o1 AI models in several benchmarks.

OpenAI Says It Has Evidence of Foul Play

According to a Financial Times report, OpenAI claimed that its proprietary AI models were used to train DeepSeek's models. The company told the publication that it had seen evidence of distillation from several accounts using the OpenAI application programming interface (API). The AI firm and its cloud partner Microsoft investigated the issue and blocked their access.


In a statement to the Financial Times, OpenAI said, “We know [China]-based companies — and others — are constantly trying to distil the models of leading US AI companies.” The ChatGPT-maker also highlighted that it is working closely with the US government to protect its frontier models from competitors and adversaries.

Notably, AI model distillation is a technique used to transfer knowledge from a large model to a smaller and more efficient model. The goal is to bring the smaller model's capabilities on par with, or close to, the larger model's while sharply reducing computational requirements. For scale, OpenAI's GPT-4 is reported to have roughly 1.8 trillion parameters, while DeepSeek-R1 is a 671-billion-parameter mixture-of-experts model; DeepSeek has also released distilled variants of R1 with as few as 1.5 billion parameters.
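To illustrate the core idea, here is a minimal sketch of the standard distillation objective: the smaller "student" model is trained to match the "teacher" model's softened output distribution. This is a generic textbook illustration, not DeepSeek's or OpenAI's actual training code, and the function names are hypothetical.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature softens the distribution."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between teacher and student soft targets.

    Minimising this pushes the student's output distribution toward
    the teacher's, which is the essence of model distillation.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return float(np.sum(p * np.log(p / q)))

# A student that matches the teacher incurs zero loss;
# a mismatched student incurs a positive loss.
teacher = [4.0, 1.0, 0.5]
aligned = distillation_loss(teacher, [4.0, 1.0, 0.5])
mismatched = distillation_loss(teacher, [0.5, 1.0, 4.0])
```

In practice the distillation term is combined with an ordinary supervised loss on the true labels, and the gradients update only the student's weights.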


The knowledge transfer typically takes place by using the larger model's outputs on a relevant dataset as training signal for the smaller model. This is straightforward when a company is creating more efficient versions of its own model in-house. For instance, Meta used its Llama 3 AI models to create several smaller, coding-focused Llama models.

However, this route is not available to a competitor, which has no access to the weights or training data of a proprietary model. If OpenAI's allegations are true, the distillation could instead have been carried out by sending a large number of prompts to its APIs and collecting the responses. These prompt-response pairs would then be used as training data to fine-tune a base model.
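The API-based route described above can be sketched as follows: query a teacher model through its API, then store the prompt-response pairs in a supervised fine-tuning format such as JSONL. The `query_teacher` function below is a stand-in stub for a real API call; it and the file path are hypothetical, and this is an illustration of the technique, not evidence of how any company actually operated.

```python
import json

def query_teacher(prompt):
    """Stand-in stub for a call to a proprietary model's API (hypothetical)."""
    return f"Teacher answer to: {prompt}"

def build_distillation_dataset(prompts, path):
    """Collect teacher outputs and save them as supervised fine-tuning
    examples: one JSON object per line with prompt/completion fields."""
    with open(path, "w", encoding="utf-8") as f:
        for prompt in prompts:
            record = {"prompt": prompt, "completion": query_teacher(prompt)}
            f.write(json.dumps(record) + "\n")

prompts = ["What is distillation?", "Explain transformers briefly."]
build_distillation_dataset(prompts, "distill_data.jsonl")
```

A dataset in this shape could then be fed to any standard fine-tuning pipeline, which is why providers such as OpenAI forbid using their outputs to develop competing models in their terms of service.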


Notably, OpenAI has not issued a formal public statement on the matter beyond its comments to the Financial Times. Recently, company CEO Sam Altman praised DeepSeek for creating such an advanced AI model and for increasing competition in the AI space.
