Six months into its life, the Meta Ray-Ban Display is starting to look less like an experiment and more like a platform, thanks to what is arguably the most significant update Meta has ever pushed for the device.
The headline feature is Neural Handwriting, which is now available to every Ray-Ban Display owner, having spent its early months in limited access for Messenger and WhatsApp users.
What is Neural Handwriting?
For those catching up, the feature uses the Neural Band, the sEMG wristband Meta ships in the box with the $799 glasses, to detect subtle finger movements and translate them into typed text in an app.
To use the feature, put on the Ray-Ban Display, wear the Neural Band on your wrist, and move your fingers in the air as if you were writing out letters. The glasses convert those movements into a message in WhatsApp, Messenger, Instagram, or your phone’s native messaging app.
The feature works on both Android and iOS. While Neural Handwriting is the part generating headlines, the update also opens the device to third-party web app developers for the first time.
What else did Meta update?
To me, that sounds like Meta is treating the glasses as a platform, not just a product it sells to end users.
This could enable developers to build AI assistants, productivity tools, navigation overlays, accessibility features, and gesture-controlled experiences, extending the device’s appeal beyond messaging and media capture.
Beyond those two developments, Meta is also bringing Display Recording to the glasses: a new mode that captures what appears in the lens display, the camera feed, and the surrounding audio in a single video file.
Walking directions now cover the entire United States, along with major international cities like London, Paris, and Rome. Live captions are expanding to voice messages in WhatsApp, Messenger, and Instagram DMs. Further, Muse Spark AI is coming to the glasses this summer.