Meta has reportedly deployed a limited number of chipsets, and plans to scale production if the test is successful.
The new AI chipsets are said to be part of its Meta Training and Inference Accelerator (MTIA) family
Meta has reportedly begun testing its first in-house chipsets designed to train artificial intelligence (AI) models. As per the report, the company has deployed a limited number of processors to test the performance and viability of the custom chipsets, and if the tests go well, it will begin large-scale production of the hardware. These processors are said to be part of the Menlo Park-based tech giant's Meta Training and Inference Accelerator (MTIA) family of chipsets.
According to a Reuters report, the tech giant developed these AI chipsets in collaboration with the chipmaker Taiwan Semiconductor Manufacturing Company (TSMC). Meta reportedly completed the tape-out, the final stage of the chip design process, recently and has now begun deploying the chips at a small scale.
This is not the company's first AI-focused chipset. Last year, it unveiled inference accelerators, processors designed to run already-trained AI models. However, Meta did not have any in-house hardware accelerators to train its Llama family of large language models (LLMs).
Citing unnamed sources within the company, the publication claimed that Meta's larger vision behind developing in-house chipsets is to bring down the infrastructure costs of deploying and running complex AI systems for internal usage, consumer-focused products, and developer tools.
Notably, in January, Meta CEO Mark Zuckerberg announced that the expansion of the company's Mesa Data Center in Arizona, USA, was complete and that the facility had begun operations. It is likely that the new training chipsets are also being deployed at this location.
The report stated that the new chipsets will first be used with Meta's recommendation engine, which powers its various social media platforms, and will later be expanded to generative AI products such as Meta AI.
In January, Zuckerberg revealed in a Facebook post that the company plans to invest as much as $65 billion (roughly Rs. 5,61,908 crore) in AI-related projects in 2025. This figure covers the expansion of the Mesa Data Center as well as the hiring of more employees for its AI teams.