Google Titans AI Architecture Unveiled With Ability to Solve Long-Term Memory Issues in AI Models

Google claims that Titans can scale an LLM's context window to more than two million tokens.

Written by Akash Dutta, Edited by David Delima | Updated: 20 January 2025 17:27 IST
Highlights
  • Google’s Titans ditches Transformer and RNN architectures
  • LLMs typically use the RAG system to replicate memory functions
  • Titans AI is said to memorise and forget context during test time

Google’s new architecture uses a surprise-based learning system

Photo Credit: Reuters

Google researchers unveiled a new artificial intelligence (AI) architecture last week that enables large language models (LLMs) to remember the long-term context of events and topics. The Mountain View-based tech giant published a paper on the topic, and the researchers claim that AI models trained using this architecture displayed a more “human-like” memory retention capability. Notably, Google ditched the traditional Transformer and Recurrent Neural Network (RNN) architectures to develop a new method of teaching AI models how to remember contextual information.

Titans Can Scale AI Models' Context Window Beyond 2 Million Tokens

The lead researcher of the project, Ali Behrouz, posted about the new architecture on X (formerly known as Twitter). He claimed that the new architecture provides a meta in-context memory with attention that teaches AI models how to memorise information at test time.

According to Google's paper, which has been published on the pre-print server arXiv, the Titans architecture can scale the context window of AI models to more than two million tokens. Memory has been a tricky problem for AI developers to solve.

Humans remember information and events with context. If someone asked a person what they wore last weekend, they would be able to recall additional contextual information, such as attending the birthday party of someone they have known for the last 12 years. This way, when asked a follow-up question about why they wore a brown jacket and denim jeans last weekend, the person would be able to contextualise it with all of this short-term and long-term information.

AI models, on the other hand, typically use retrieval-augmented generation (RAG) systems, modified for Transformer and RNN architectures. These systems store information as neural nodes. When an AI model is asked a question, it accesses the particular node that contains the main information, as well as the nearby nodes that might contain additional or related information. However, once a query is resolved, the information is removed from the system to save processing power.
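To make the mechanism concrete, below is a minimal sketch of the retrieval step in a RAG pipeline, written in Python with NumPy. Everything here is illustrative: the embed() helper, the toy corpus, and the retrieve() function are stand-ins for a real embedding model and vector index, not Google's implementation.

```python
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Toy deterministic "embedding"; a real system would use a trained encoder.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(dim)
    return vec / np.linalg.norm(vec)

# Hypothetical knowledge "nodes" stored as embedding vectors.
corpus = [
    "The user attended a birthday party last weekend.",
    "The user wore a brown jacket and denim jeans.",
    "The friend has known the user for 12 years.",
]
index = np.stack([embed(doc) for doc in corpus])

def retrieve(query: str, k: int = 2) -> list[str]:
    # Score every node against the query and return the nearest ones,
    # mirroring how RAG pulls in the main node plus related neighbours.
    scores = index @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]

print(retrieve("What did I wear last weekend?"))
# The retrieved passages are prepended to the prompt for this one answer
# and then discarded — which is why the model "forgets" afterwards.
```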

This approach has two downsides. First, an AI model cannot remember information in the long run. If a user wanted to ask a follow-up question after a session was over, they would have to provide the full context again (unlike how humans function). Second, AI models do a poor job of retrieving information involving long-term context.

With Titans AI, Behrouz and other Google researchers sought to build an architecture that enables AI models to develop a long-term memory that can run continually, while forgetting information so that the system remains computationally efficient.

To this end, the researchers designed an architecture that encodes history into the parameters of a neural network. Three variants were used — Memory as Context (MAC), Memory as Gating (MAG), and Memory as a Layer (MAL). Each of these variants is suited for particular tasks.
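As a rough illustration of one of these variants, the sketch below shows what Memory as Gating (MAG) could look like: the output of an attention branch is blended with the output of a memory branch through a learned gate. This is an assumption-heavy toy in NumPy; the mag_block() function, the W_gate weights, and the vector sizes are invented for illustration and are not taken from Google's paper.

```python
import numpy as np

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + np.exp(-x))

dim = 8
rng = np.random.default_rng(1)
W_gate = rng.standard_normal(dim)  # assumed learned gate weights

def mag_block(attention_out: np.ndarray, memory_out: np.ndarray) -> np.ndarray:
    # A scalar gate decides how much of the output comes from short-term
    # attention versus long-term memory for this token.
    g = sigmoid(attention_out @ W_gate)
    return g * attention_out + (1 - g) * memory_out

attention_out = rng.standard_normal(dim)  # stand-in for an attention branch
memory_out = rng.standard_normal(dim)     # stand-in for a memory branch
print(mag_block(attention_out, memory_out))
```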

Additionally, Titans uses a new surprise-based learning system, which tells AI models to remember unexpected or key information about a topic. Together, these two changes allow the Titans architecture to showcase improved memory function in LLMs.
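Here is a minimal sketch of what a surprise-driven memory update could look like, loosely following the paper's description: memory is a simple linear map from keys to values, "surprise" is the prediction error on a new input, and the update writes more strongly when that error is large. The alpha (forgetting rate) and theta (learning rate) values below are assumptions for this toy, not Google's parameters.

```python
import numpy as np

dim = 8
M = np.zeros((dim, dim))   # long-term memory, updated at test time
alpha, theta = 0.1, 0.5    # assumed forgetting rate and learning rate

def update_memory(M: np.ndarray, key: np.ndarray, value: np.ndarray):
    prediction = M @ key
    error = value - prediction  # "surprise": how unexpected the input is
    surprise = float(np.linalg.norm(error))
    # Decay (forget) a little of the old memory, then write the new
    # association; large errors produce large writes.
    M = (1 - alpha) * M + theta * np.outer(error, key)
    return M, surprise

rng = np.random.default_rng(0)
for step in range(3):
    key, value = rng.standard_normal(dim), rng.standard_normal(dim)
    M, s = update_memory(M, key, value)
    print(f"step {step}: surprise = {s:.3f}")
# Predictable inputs barely change M; novel ones trigger large updates,
# so the model effectively remembers the unexpected.
```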

In a separate post, Behrouz claimed that based on internal testing on the BABILong benchmark (a needle-in-a-haystack approach), Titans (MAC) models were able to outperform large AI models such as GPT-4, Llama 3 + RAG, and Llama 3 70B.

