Maeil Business Newspaper (MK)

Samsung, Google to collaborate on AI smart glasses

  • Lee Deok-joo and Lee Eun-joo
  • Published: 2025.05.22 10:18:12
  • Last updated: 2025.05.22 10:18:12
(AP/Yonhap)

Samsung Electronics Co. is teaming up with Google LLC to launch smart glasses equipped with artificial intelligence (AI), which will be designed by Korean eyewear brand Gentle Monster.

Google unveiled a prototype of its smart glasses, which are currently in development, on Tuesday (local time) during its annual developer event in California, the United States. These are augmented reality (AR) glasses equipped with Google’s AI Gemini and a small display.

At the event, Google announced it is expanding its partnership with Samsung - which previously focused on extended reality (XR) headsets - into the realm of smart glasses, with both companies co-developing the hardware and software.

Just as Samsung’s upcoming XR headset, Project Infinity, scheduled for release in the second half of 2025, will serve as a flagship device for the Android XR operating system, the two companies will also collaborate closely on the smart glasses.

However, Samsung will not be Google’s only smart glasses partner. Chinese company XREAL also unveiled its Android XR-based smart glasses on the same day.

Google revealed that Gentle Monster from Korea and Warby Parker from the United States will be the design partners for the smart glasses. Following the announcement, Warby Parker’s stock surged by 15 percent.

Google chose Gentle Monster and Warby Parker as design partners because branding is critical for everyday products such as smart glasses.

Meta Platforms Inc., which previously released smart glasses, sold 1 million units in 2024 by collaborating with Ray-Ban.

During the event, Google demonstrated a prototype of its smart glasses through which users could talk to the AI, take photos, and use real-time translation.

Participants were able to try the glasses on, and journalists wore the prototype with prescription lenses attached after having their vision measured. Despite the double-layered lenses, the glasses felt light and comfortable.

The design featured black, thick-rimmed frames that were subtle and looked more like regular glasses than sunglasses. Without the camera and display, they would be indistinguishable from ordinary glasses.

The most eye-catching feature was the display, a semi-transparent square panel located in the center of the right lens, similar to a head-up display on a car windshield.

A single camera is located at the top left. Users can communicate with the AI via voice using the built-in speakers and microphones.

The Gemini AI is activated by tapping the right temple. An animation of Gemini appears on the display, indicating that it is listening.

When asked in Korean, “What is this painting I am looking at?” the AI responded with a detailed explanation in Korean. The AI’s spoken response was also shown as text on the display.

There was also a demonstration of Gemini Live, which is currently app-based. This version uses real-time video recognition to interact conversationally with people, as if the AI is seeing through the glasses.

Whereas using a smartphone would require taking it out and pointing the camera, the smart glasses allowed for hands-free interaction.

The glasses also supported navigation: users could set a destination in Google Maps and press a guidance button to display directional instructions on the screen, such as how many meters to walk before turning.

These smart glasses are likely to play a key role in realizing Google’s vision of a “universal AI assistant.”

In the keynote, Google DeepMind Chief Executive Officer Demis Hassabis noted that the universal AI assistant will perform daily tasks and promised the continuous development of innovative technologies such as Gemini Live and smart glasses.

[ⓒ Maeil Business Newspaper & mk.co.kr. Unauthorized reproduction, redistribution, and use for AI training are prohibited]