Meta is now allowing developers to test software tools for the glasses through a developer preview programme.
Photo Credit: Meta
Meta Ray-Ban Display (pictured) was launched in September 2025
Meta has announced a new software update for the Meta Ray-Ban Display, adding several features that were previewed when the smart glasses launched. The update expands messaging and accessibility tools, introduces new recording capabilities, and broadens navigation support. Meta has also opened developer preview access for the platform, allowing third-party developers to build web apps and extend mobile apps to the glasses. Additional software improvements and new AI features are scheduled to arrive later this year.
The company said in a blog post that Neural Handwriting is now rolling out to all users. The feature works with Meta's Neural Band and lets users compose messages using finger gestures while using Instagram, WhatsApp, Messenger, and the default messaging apps on Android and iOS.
The company has also introduced a display recording feature that combines the content shown on the glasses' display, the user's camera feed, and audio into a single video. The company said turn-by-turn navigation is now available throughout the US and in major cities such as London, Paris, and Rome.
Meta is also extending Live Captions to WhatsApp, Facebook Messenger, and Instagram Direct. The feature can transcribe speech during face-to-face conversations and phone calls, and will support voice interactions within these apps as well.
So far, the glasses have received four software updates since their September 2025 launch. These updates added widgets for reminders, weather, stocks, and calendar, along with quicker access to Spotify playlists and support for Instagram Reels.

Developers can now test software tools for the glasses through a developer preview programme. Developers can create standalone web apps using HTML, CSS, and JavaScript and deploy them to the glasses through a URL. Using the Wearables Device Access Toolkit, developers can also extend existing mobile apps to the glasses and add interface elements such as text, images, lists, buttons, and video playback. Meta says these tools are intended to support a wide range of experiences, including games, transit utilities, cooking guides, shopping lists, and music practice apps.
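Going by Meta's description, a standalone web app for the glasses is simply an ordinary HTML, CSS, and JavaScript page hosted at a URL. As a rough illustration, a minimal shopping-list page (one of the use cases Meta mentions) might look like the sketch below; the markup, element IDs, and styling are hypothetical and do not reflect any Meta API or template.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Shopping List</title>
  <style>
    /* Large text and high contrast, assuming a small glanceable display */
    body { font-family: sans-serif; font-size: 1.5rem; background: #000; color: #fff; }
    li.done { text-decoration: line-through; opacity: 0.5; }
  </style>
</head>
<body>
  <ul id="list"></ul>
  <script>
    // Illustrative only: render a static list and let a tap mark items done.
    const items = ["Milk", "Eggs", "Bread"];
    const list = document.getElementById("list");
    for (const name of items) {
      const li = document.createElement("li");
      li.textContent = name;
      li.addEventListener("click", () => li.classList.toggle("done"));
      list.appendChild(li);
    }
  </script>
</body>
</html>
```

Deploying such a page would, per the article, amount to hosting it at a URL the glasses can load; how input gestures map to click events on the device is not specified in the source.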
Meta also said that Muse Spark, a new AI model developed for its wearable devices, is scheduled to roll out to the Meta Ray-Ban Display later this summer.