April 28, 2025 – Apple is moving ahead with plans to develop AI-powered smart glasses, according to a new report, but the product is still some time away from reaching consumers.
As first detailed by Bloomberg’s Mark Gurman, the initiative – reportedly called “Project Atlas” – builds on Apple’s earlier studies into expanding its wearable offerings beyond the Apple Watch and AirPods. Last November, Gurman reported that Apple was testing smart glasses similar to the popular Ray-Ban Meta smart glasses, part of a broader effort to extend Apple Intelligence features across more wearable devices.

An internal Apple email from the Product Systems Quality team, obtained by Gurman last fall, stated:
“Testing and developing products that all can come to love is very important to what we do at Apple. This is why we are looking for participants to join us in an upcoming user study with current market smart glasses.”
The smart glasses project should not be confused with Apple’s longer-term goal of creating sophisticated AR glasses that might one day replace the iPhone. Those advanced devices, which would combine Apple Intelligence with full augmented reality overlays, remain years away due to technological hurdles.
Instead, the new smart glasses under development will focus on real-time audio and video capture, enabling Apple Intelligence to analyze a user’s environment and deliver contextual information without full AR functionality. The glasses are expected to work similarly to Meta’s Ray-Ban smart glasses, offering lightweight interaction with AI based on a continuous sensory feed.
Gurman indicated in his latest “Power On” newsletter that Apple has decided to move forward with manufacturing the smart glasses after a series of successful internal tests. Still, the device is not close to commercial readiness and likely won’t launch until at least 2027, aligning with earlier reports on Apple’s roadmap for camera-equipped wearables, including updated AirPods models.
The company is also exploring integrating cameras into future versions of the AirPods and Apple Watch, further extending the reach of Apple Intelligence’s Visual Intelligence features. Currently, Visual Intelligence requires an iPhone 16 or iPhone 15 Pro, where users can press a button to let the AI analyze and answer questions about their surroundings via the camera feed.
With Apple facing stiff competition from other tech giants in the AI space, patience may be essential. As the report notes, launching a product heavily reliant on real-time AI before the software is dependable would risk undermining its utility. Unlike the iPhone, which remains a powerful standalone device, the smart glasses would rely almost entirely on seamless AI performance.
Meanwhile, Google is preparing its own moves in the AI wearables space. The company has previewed prototype smart glasses with Gemini AI support under its new Android XR initiative and is expected to share more details at its upcoming I/O event.
For Apple, the emphasis remains on refining the technology before releasing it to the public – a strategy consistent with CEO Tim Cook’s broader focus on building devices that are as polished and beloved as the iPhone.
