Meta has unveiled Aria Gen 2, its latest research glasses aimed at advancing machine perception, AI, and robotics, and the company is gearing up to hand them over to third-party researchers.
Flashback to 2020: That was when Meta first introduced Project Aria, showcasing a pair of glasses packed with sensors. The company used these internally to train machine perception systems, addressing the challenges of developing practical, all-day wearable augmented reality glasses. Now, they’re ready to take the next step.
Project Aria’s first generation didn’t just stay within Meta’s confines. It was embraced by partners like BMW and academic institutions such as Carnegie Mellon University, IIIT Hyderabad, the University of Bristol, and the University of Iowa to explore various machine perception challenges.
Fast forward to today and the arrival of Aria Gen 2. The second-generation glasses still have no integrated display but carry a more advanced sensor suite: an RGB camera, position-tracking cameras, eye-tracking cameras, spatial microphones, plus IMUs, a barometer, a magnetometer, a GNSS receiver, and custom Meta silicon.
Taking it a step further, Aria Gen 2 adds two new sensors in the nosepad: a photoplethysmography (PPG) sensor for measuring heart rate, and a contact microphone that isolates the wearer’s voice from surrounding noise.
Meta says the glasses weigh just 75 grams, fold for easy carrying, and are built for all-day use, with a battery life of six to eight hours.
The technology under the hood is impressive, with several machine perception systems running on-device, including hand and eye tracking, speech recognition, and simultaneous localization and mapping (SLAM). SLAM lets the glasses map and localize in indoor areas where GPS is unavailable, effectively acting as a visual positioning system (VPS) that is as useful on a bustling street as it is for pinpointing items inside a store.
Aria Gen 2 isn’t yet available for distribution, but Meta promises more updates soon, with a focus on both commercial and academic research. The company recently teamed up with Envision to showcase how the glasses can help people with visual impairments: in the demo, SLAM tracking and spatial audio cues guide a blind user to items in a store.
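To make that idea concrete, here is a minimal, purely illustrative sketch of how a SLAM-derived head pose and a known item location could be turned into a coarse spatial audio cue. The function names, coordinate convention, and panning scheme are assumptions made for demonstration; this is not Meta’s or Envision’s actual API.

```python
# Purely illustrative: turning a SLAM-style head pose and a known item location
# into a coarse stereo cue. Coordinates are top-down metres, x forward, y to the
# wearer's left; yaw is counter-clockwise from +x. None of this is Meta's API.
import math

def bearing_to_item(head_x, head_y, head_yaw_rad, item_x, item_y):
    """Angle of the item relative to the wearer's facing direction, in radians."""
    world_angle = math.atan2(item_y - head_y, item_x - head_x)
    relative = world_angle - head_yaw_rad
    return math.atan2(math.sin(relative), math.cos(relative))  # wrap into (-pi, pi]

def audio_cue(relative_angle_rad, distance_m):
    """Map a relative bearing to simple left/right channel gains plus a spoken hint."""
    # Linear panning: positive angles (item to the wearer's left) boost the left channel.
    pan = max(-1.0, min(1.0, relative_angle_rad / (math.pi / 2)))
    left_gain, right_gain = (1 + pan) / 2, (1 - pan) / 2
    direction = "ahead" if abs(pan) < 0.2 else ("left" if pan > 0 else "right")
    return left_gain, right_gain, f"item about {distance_m:.1f} m to your {direction}"

# Example: wearer at the origin facing +x, item on a shelf two metres ahead and
# one metre to the left.
angle = bearing_to_item(0.0, 0.0, 0.0, 2.0, 1.0)
print(audio_cue(angle, math.hypot(2.0, 1.0)))
```

A real system would drive continuous binaural rendering from a full 6-DoF pose stream rather than a one-shot left/right gain, but the underlying geometry is the same.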
As Meta works toward releasing its first commercial AR device, it’s clear these glasses are part of a bigger puzzle. The harder challenge is packing such complex systems into a lightweight, everyday wearable that also includes a display, which demands significantly more computing power.
Meta has previewed how it intends to handle those demands with its Orion AR prototype, which offloads processing to a separate wireless compute unit while keeping a sleek, glasses-like form. Orion isn’t hitting store shelves anytime soon, though: each prototype costs roughly $10,000 to build, thanks in part to its cutting-edge silicon carbide lenses, and offers a 70-degree field of view.
The race is on in the tech world to perfect these components, aiming to create a product that might eventually replace smartphones as the go-to mobile platform. Meta and other tech giants like Apple, Samsung, and Google are targeting the launch of such augmented reality glasses before 2030, each vying for that future crown of mobile computing.