Google Unveils Android XR Smart Glasses with Real-Time Translation

Google’s latest unveiling at the I/O 2025 conference has everyone talking. The tech giant showcased new smart glasses equipped with live translation capabilities. Built on Android XR, the platform Google developed with Samsung, the glasses use Gemini AI to sync with smartphones and access various apps. They also feature a built-in speaker and an in-lens display for private viewing of information.

The new design is a significant departure from the original Google Glass, with a focus on real-world usability and a more stylish look. Google has partnered with Gentle Monster and Warby Parker to offer sleek frames for the glasses.

During the presentation, Shahram Izadi and Nishtha Bhatia demonstrated the live translation feature, speaking to each other in Farsi and Hindi while the XR glasses translated their conversation into English in real time. Although the demo hit a few glitches, the translation worked as intended for brief stretches.

Bhatia also showed how the Gemini assistant works with the XR glasses, asking it questions about what she was seeing backstage at the theater and pulling up information about a coffee shop she had visited before the show.

The 2025 I/O Developer Conference, which began on May 20, has introduced several other announcements, including an AI-powered filmmaking tool called Flow, live language translation in Google Meet, virtual try-on from uploaded photos, and computer-vision advances in Project Astra.
