DeepSeek-V3 Open-Source AI Model With Mixture-of-Experts Architecture Released

The model features 671B parameters, much higher than Meta Llama 3.1 model's 405B parameters.

Written by Akash Dutta, Edited by Siddharth Suvarna | Updated: 27 December 2024 16:38 IST
Highlights
  • DeepSeek-V3 was pre-trained on 14.8 trillion tokens
  • The AI model also comes with advanced reasoning capabilities
  • It scored 87.1 percent on the MMLU benchmark

The AI model adopts Multi-head Latent Attention (MLA) and DeepSeekMoE architectures

Photo Credit: DeepSeek

DeepSeek, a Chinese artificial intelligence (AI) firm, released the DeepSeek-V3 AI model on Thursday. The new open-source large language model (LLM) features a massive 671 billion parameters, surpassing Meta's Llama 3.1 model, which has 405 billion parameters. Despite its size, the researchers claim the LLM remains efficient thanks to its mixture-of-experts (MoE) architecture, which activates only the parameters relevant to a given task, preserving both speed and accuracy. Notably, it is a text-based model and does not have multimodal capabilities.

DeepSeek-V3 AI Model Released

The open-source DeepSeek-V3 AI model is currently being hosted on Hugging Face. According to the listing, the LLM is geared towards efficient inference and cost-effective training. For this, the researchers adopted Multi-head Latent Attention (MLA) and DeepSeekMoE architectures.

Essentially, the AI model activates only the parameters relevant to the topic of the prompt, ensuring faster processing and higher accuracy compared to typical models of this size. Pre-trained on 14.8 trillion tokens, DeepSeek-V3 uses techniques such as supervised fine-tuning and reinforcement learning to generate high-quality responses.
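The "activate only the relevant parameters" idea described above can be illustrated with top-k expert routing, the core trick behind MoE layers. The sketch below is a toy in NumPy with made-up linear "experts"; it is not DeepSeek's actual implementation (which uses the DeepSeekMoE architecture with its own load-balancing scheme), but it shows why compute scales with the number of experts selected rather than the total parameter count.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route input x through the top-k experts by gate score.

    Only k experts run per token, so compute scales with k,
    not with the total number of experts (and their parameters).
    """
    scores = gate_w @ x                       # one routing score per expert
    top = np.argsort(scores)[-k:]             # indices of the k highest-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                  # softmax over only the selected experts
    # Weighted sum of the selected experts' outputs; the rest stay idle.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d = 8
num_experts = 16
gate_w = rng.normal(size=(num_experts, d))
# Each toy "expert" is a small linear map; real MoE experts are feed-forward blocks.
expert_mats = [rng.normal(size=(d, d)) for _ in range(num_experts)]
experts = [(lambda m: (lambda v: m @ v))(m) for m in expert_mats]

x = rng.normal(size=d)
y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)  # (8,)
```

Here only 2 of the 16 experts execute for this input; in a model like DeepSeek-V3, the same routing principle is what lets a 671-billion-parameter network run inference with only a fraction of those parameters active per token.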


The Chinese firm claimed that despite its size, the AI model was fully trained in 2.788 million GPU hours on Nvidia H800 GPUs. DeepSeek-V3's architecture also includes a load-balancing technique, first used on its predecessor, to minimise performance degradation.


Coming to performance, the researchers shared evals from internal testing and claimed that the model outperforms Meta's Llama 3.1 and Qwen 2.5 on Big-Bench Hard (BBH), Massive Multitask Language Understanding (MMLU), HumanEval, MATH, and several other benchmarks. However, these results have not yet been verified by third-party researchers.

One of the main highlights of DeepSeek-V3 is its massive size of 671 billion parameters. While larger models exist (Gemini 1.5 Pro, for example, is reported to have over a trillion parameters), such scale is rare in the open-source space. Prior to this, the largest open-source AI model was Meta's Llama 3.1 with 405 billion parameters.


At present, DeepSeek-V3's code can be accessed via its Hugging Face listing under an MIT license for personal and commercial use. The AI model can also be tested via the company's online chatbot platform, and those looking to build with it can access the API.

