AI Chip Startup Cerebras Releases Open Source ChatGPT-Like Models for Free: All Details

Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda.


Highlights
  • OpenAI's chatbot ChatGPT has 175 billion parameters
  • Cerebras's seven models range from 111 million to 13 billion parameters
  • Smaller models could run on smaller devices such as smartphones

Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration.

Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from a smaller 111-million-parameter language model to a larger 13-billion-parameter model.

"There is a big movement to close what has been open-sourced in AI...it's not surprising as there's now huge money in it," said Andrew Feldman, founder and CEO of Cerebras. "The excitement in the community, the progress we've made, has been in large part because it's been so open."

Models with more parameters are able to perform more complex generative functions.

OpenAI's chatbot ChatGPT, launched late last year, for example, has 175 billion parameters and can produce poetry and research, which has helped draw large interest and funding to AI more broadly.

Cerebras said the smaller models can be deployed on phones or smart speakers while the bigger ones run on PCs or servers, although complex tasks like large passage summarization require larger models.
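As a rough back-of-the-envelope illustration (not from the article): a model's parameter count largely determines its memory footprint, which is why the smallest models can fit on a phone while the largest need a server. The sketch below assumes the common case of 16-bit (2-byte) weights:

```python
def model_size_gb(num_params: int, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

# Cerebras's smallest and largest released models, at 16-bit precision:
print(model_size_gb(111_000_000))     # ~0.22 GB, small enough for a phone
print(model_size_gb(13_000_000_000))  # ~26 GB, workstation or server territory
```

Actual deployment needs more memory than this (activations, key-value caches), but the weights alone already separate phone-class from server-class models.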

However, Karl Freund, a chip consultant at Cambrian AI, said bigger is not always better.

"There's been some interesting papers published that show that (a smaller model) can be accurate if you train it more," said Freund. "So there's a trade off between bigger and better trained."
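The trade-off Freund describes is often summarized (for example, in DeepMind's 2022 "Chinchilla" work by Hoffmann et al.) by a rule of thumb of roughly 20 training tokens per parameter for compute-optimal training. A minimal sketch of that heuristic, with the 20:1 ratio as an assumption:

```python
def compute_optimal_tokens(num_params: int, tokens_per_param: int = 20) -> int:
    """Rough compute-optimal training-token budget: ~20 tokens per parameter."""
    return num_params * tokens_per_param

# Under this heuristic, a 13B-parameter model wants ~260 billion training
# tokens, while a 111M-parameter model needs only ~2.2 billion.
print(compute_optimal_tokens(13_000_000_000))  # 260_000_000_000
```

A smaller model trained on more tokens than this budget can match a larger, under-trained one, which is the "bigger versus better trained" trade-off quoted above.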

Feldman said his biggest model took a little over a week to train, work that can typically take several months, thanks to the architecture of the Cerebras system, which includes a chip the size of a dinner plate built for AI training.

Most of the AI models today are trained on Nvidia's chips, but more and more startups like Cerebras are trying to take share in that market.

The models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, said Feldman.

© Thomson Reuters 2023

