The Artificial Intelligence hardware race has officially shifted from our screens to our surroundings. For the past year, the industry has watched as competitors like Meta found surprise mainstream success with their AI-integrated Ray-Ban smart glasses, while companies like OpenAI and Google continue to tease their own ambient computing gadgets. But the tech world has been holding its breath, waiting for the sleeping giant in Cupertino to make its move. That wait is finally over.
Apple is finally stepping into the arena, reportedly accelerating the development of three brand-new, AI-powered wearables. According to the latest 2026 supply chain reports and industry insiders, Apple is actively working on a pair of premium smart glasses, a discreet camera-equipped pendant, and next-generation AirPods. However, Apple’s strategy is entirely different from the standalone, often-clunky “AI pins” we saw flop in recent years. Instead of trying to replace your smartphone, these three new devices are being meticulously designed to act as the “eyes and ears” of your iPhone, feeding real-world visual and audio context to a vastly upgraded, AI-driven Siri.

If you are wondering how this hardware will actually work—and when you can buy it—you are in the right place. In this complete breakdown, we are diving deep into Apple’s new wearable trifecta. We will explore the leaked specs of the upcoming smart glasses (internally codenamed N50), explain the surprising reason why Apple is putting cameras inside your AirPods, and reveal the projected release timelines for 2026 and 2027 so you know exactly what is coming next.
The Big Picture: Apple’s AI Hardware Strategy
Catching Up and Pulling Ahead in the AI Race
For the last couple of years, Apple has seemed unusually quiet while the rest of the tech industry chased the artificial intelligence boom. Meta struck gold with its Ray-Ban smart glasses, and companies like OpenAI have been actively exploring dedicated AI gadgets. But Apple was never sitting on the sidelines; they were simply biding their time. Now, the company is aggressively pivoting its hardware strategy to not just catch up, but to define the next era of ambient computing. Their goal is clear: to maintain the dominance of their ecosystem by expanding it beyond screens and into the physical world you interact with every day.
The iPhone as the Ultimate Processing Hub
If you look at recent tech history, dedicated “AI gadgets” like the Humane AI Pin or the Rabbit R1 have largely struggled to find their footing. Why? Because they tried to replace the smartphone. Apple is taking the exact opposite approach. A major detail about this new lineup of smart glasses, pendants, and camera-equipped AirPods is that they are not standalone devices. Instead, they are being engineered to tether directly to your iPhone. By using the iPhone to handle the heavy computational lifting, Apple can keep these new wearables incredibly thin, lightweight, and battery-efficient. The wearables simply act as sensors, while the iPhone remains the brain of the operation.
Powered by a Vastly Upgraded Siri
Of course, a network of cameras and microphones is useless without an intelligence to interpret that data. This entire hardware push is heavily reliant on a massive, upcoming overhaul to Siri. Expected to arrive with iOS 27, this next-generation version of Siri will be powered by advanced Large Language Models (LLMs)—with reports indicating a heavy reliance on Google’s Gemini foundation models. By combining this conversational AI with the real-time visual context gathered from your smart glasses or AirPods, Siri will finally evolve from a basic voice assistant into a proactive, context-aware companion that truly understands your surroundings.
Apple Smart Glasses: The Meta Ray-Ban Competitor
Design & Build: Premium Frames, No Displays
While the Apple Vision Pro aimed for fully immersive computing, Apple’s upcoming smart glasses—internally codenamed project “N50”—are taking a much more practical approach. According to recent supply chain leaks, these glasses will not feature an augmented reality (AR) display in the lenses. Instead, they are being designed to look and feel exactly like premium everyday eyewear. Rather than partnering with an established brand like Meta did with Ray-Ban, Apple is reportedly building its own frames from scratch. By utilizing high-end materials, including durable acrylic and titanium elements, Apple aims to ensure the glasses remain incredibly lightweight (reportedly under 50 grams) and comfortable for all-day wear.
Key Features: Dual Cameras and “Visual Intelligence”
The true power of the N50 glasses lies in the hardware packed subtly into the frames. They will reportedly feature a dual-camera system: one high-resolution sensor for capturing traditional photos and 1080p video, and a second, specialized computer vision camera dedicated entirely to AI. This secondary camera functions similarly to the LiDAR scanner on your iPhone, allowing the device to measure depth and truly understand its physical environment. Through a feature called “Visual Intelligence,” the glasses will essentially see what you see. You will be able to look at a foreign restaurant menu to get a real-time audio translation, ask Siri to identify a specific landmark in front of you, or receive a context-aware reminder to pick up milk as you walk past the dairy aisle in a grocery store.
Release Timeline: When Can You Buy Them?
If you are eager to get your hands on a pair, the wait might not be as long as initially expected. Apple has reportedly accelerated its wearable development, with mass production for the smart glasses targeted to begin in December 2026. This aggressive timeline points to a highly anticipated public launch in early 2027. While official pricing remains unconfirmed, industry analysts expect them to be positioned as a premium everyday accessory, likely starting around the $499 mark.
AirPods with Built-in Cameras: A New Way to Interact
Why Cameras in Earbuds?
When rumors first broke that Apple was putting cameras inside its wireless earbuds, the immediate reaction from the tech community was confusion. Why would anyone want to take a photo from the side of their head? However, recent supply chain leaks clarify that these are absolutely not meant for taking selfies or recording vlog-style videos. Instead, Apple is reportedly integrating tiny, low-resolution infrared (IR) sensors—very similar to the dot projector technology used for Face ID on your iPhone. These lenses are designed to operate invisibly in the background, focusing entirely on sensing depth and movement rather than capturing traditional, high-resolution photography.
Functionality: How the Cameras Will Actually Work
By adding IR cameras, Apple is transforming the AirPods from simple audio accessories into powerful spatial computers. These sensors will continuously capture environmental data, mapping the physical space around your head. This unlocks two game-changing features. First, it enables advanced air gesture controls. Instead of tapping the stem of your earbud or pulling out your phone, you could simply wave your hand or nod your head to skip a track, lower the volume, or decline a call. Second, it provides spatial awareness for Siri. If you are looking at a specific building or product, the IR sensors can feed that visual context directly into Apple’s AI, allowing Siri to answer questions about exactly what you are looking at without you having to describe it.
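To make the sensing-to-command idea above more concrete, here is a purely illustrative sketch. Nothing about AirPods firmware or Apple’s APIs is public, so every name and threshold here is hypothetical; the point is simply how a stream of low-resolution depth readings could be reduced to a gesture decision.

```python
# Hypothetical illustration only: mapping a short window of IR depth
# readings (in metres) to an "air gesture" command. Real AirPods
# internals are unknown; this just demonstrates the concept.

def classify_gesture(depth_samples):
    """Return a command for a window of depth readings.

    A hand sweeping past the sensor shows up as a brief, deep dip
    in measured distance; a steady reading means no gesture.
    """
    if not depth_samples:
        return "none"
    baseline = depth_samples[0]
    # Count readings well below the opening baseline: something passed close by.
    dips = sum(1 for d in depth_samples if d < baseline * 0.5)
    # Require a sustained dip so a single noisy sample is not a "gesture".
    return "skip_track" if dips >= 3 else "none"

# A hand sweep near the ear drops the reading from ~1.2 m to ~0.1 m.
print(classify_gesture([1.2, 1.2, 0.1, 0.1, 0.1, 1.2]))  # -> skip_track
print(classify_gesture([1.2, 1.2, 1.2]))                  # -> none
```

In practice this kind of classification would run on the earbuds’ low-power chip, with only the resulting command (not raw imagery) passed up to the iPhone.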
Release Timeline: When Will They Launch?
Out of the three new AI wearables Apple is currently developing, these camera-equipped earbuds are reportedly the furthest along in the production pipeline. Industry analysts, including Bloomberg’s Mark Gurman, note that Apple has been working on this technology for years as an extension of their spatial audio ambitions. Because the core earbud technology is already highly refined, we could see these premium AI AirPods (potentially branded as a high-end tier of the AirPods Pro) launch as early as this year (2026), likely aligning with Apple’s traditional fall hardware event.
The Apple AI Pendant: The Wildcard
What is it?
When gadgets like the Humane AI Pin crashed and burned in recent years, many tech enthusiasts assumed the dream of an AI brooch was dead. However, Apple is reportedly reviving the concept with a much more practical approach. According to recent supply chain leaks, Apple is actively developing a tiny AI pendant roughly the size of an AirTag. It is designed to be highly versatile—featuring a secure clip to attach directly to your shirt lapel, as well as a loop to be worn discreetly around your neck like a necklace. Instead of trying to be a standalone smartphone replacement with a gimmicky laser projector, this pendant is built specifically to be an “always-there” accessory that quietly feeds real-world context back to your iPhone.
Hardware: The “Eyes and Ears” of Your iPhone
Because your iPhone will handle all the heavy AI processing, the internal hardware of the pendant can remain incredibly lightweight and battery-efficient. It reportedly features an always-on, low-resolution camera (designed for computer vision, not high-quality photography) and an array of sensitive microphones to pick up voice commands. This hardware is managed by a dedicated, low-power chip—something closer to the chip inside your AirPods than to the powerful processor inside an Apple Watch. Interestingly, recent Bloomberg reports indicate that Apple’s design team is still debating whether to include a built-in speaker for direct, two-way conversations with Siri, or whether users will have to rely on their AirPods for audio feedback.
Release Timeline: Will It Actually Launch?
Here is where we need to be completely candid with the rumors: out of the three AI wearables currently in development at Apple, the pendant is the biggest wildcard. It is still in the very early, experimental stages of hardware development. If the internal testing goes perfectly, we could potentially see it launch as a companion accessory alongside the new smart glasses in 2027. However, Apple is notorious for ruthlessly killing prototypes that do not meet their strict usability standards. There is a very real possibility that this project could be scrapped entirely before it ever sees a public release.
How Will These Impact Your Privacy?
The Societal Shift: Always-On Cameras in Public
The idea of walking around with constant, always-on cameras and microphones is naturally going to make people uncomfortable. We are already seeing societal pushback against devices like Meta’s Ray-Bans, with bystanders and privacy advocates raising concerns about being recorded in public spaces without their consent. For a company like Apple, which has built its entire modern brand identity around protecting user data, entering the wearable camera space is a massive tightrope walk. Regulators are already scrutinizing how AI glasses capture biometric data like facial features and voiceprints. If Apple wants these devices to go mainstream by 2027, they have to convince the public that wearing them won’t turn you into a walking surveillance hub.
Apple’s Secret Weapon: On-Device Processing
This is where Apple has a massive advantage over its competitors. While other companies often send your voice recordings and visual data to the cloud to be processed by AI, Apple is doubling down on keeping things local. Thanks to the highly efficient processors built directly into these new wearables and the iPhone they tether to, the vast majority of your AI requests will be processed completely on-device. When you ask your smart glasses to translate a menu, or your AirPods to identify a landmark, that data doesn’t need to sit on a server somewhere. For tasks that do require heavier lifting, Apple utilizes its “Private Cloud Compute”—a secure system designed to process complex data without ever storing it or exposing it to third parties.
Protecting the Bystander: Indicator Lights and Limits
So, how will Apple protect the people around you? While the final designs are still heavily guarded, industry analysts expect Apple to implement strict physical and software safeguards. For the N50 smart glasses and the wearable pendant, we will almost certainly see prominent, hard-wired LED indicator lights that activate the moment the camera or microphone is in use—making it obvious to anyone nearby. Furthermore, unlike Meta—which is reportedly experimenting with facial recognition to identify strangers in real-time—Apple is expected to strictly limit its “Visual Intelligence” to recognizing inanimate objects, text, and landmarks. By refusing to build a public facial recognition database, Apple can offer the utility of ambient AI without crossing the line into dystopian territory.
Conclusion: The Dawn of Apple’s Ambient Era
Summary: The Wearable Roadmap
To recap, Apple is quietly building an interconnected web of hardware designed to observe the world alongside you. The camera-equipped AirPods are currently the furthest along in development and could potentially hit the market as early as late 2026. Following closely behind are the highly anticipated N50 smart glasses, which are slated to begin mass production in December 2026 for an early 2027 launch. Finally, the AI pendant remains the wildcard of the bunch—a fascinating, AirTag-sized concept that could arrive alongside the glasses in 2027, provided it survives Apple’s rigorous internal testing.
Final Thoughts: The End of Screen Addiction?
As tech enthusiasts who spend all day analyzing the latest smartphone displays or pushing PC gaming hardware to its limits, the idea of actively stepping away from our screens feels like a massive paradigm shift. But that is exactly what ambient computing is all about. Apple’s real endgame isn’t to replace the iPhone; it is to transform it from a device you constantly stare at into a silent, powerful engine that lives in your pocket. By utilizing these new wearables as the “eyes and ears” of Siri, Apple wants to create a frictionless future where you interact with the digital world simply by looking, speaking, and existing in your environment.
Over to You!
The artificial intelligence hardware race is just getting started, and the next few years are going to completely change how we interact with our tech.
Which of these three Apple devices are you most excited about? Would you rock the N50 smart glasses, or do the camera-equipped AirPods sound more appealing to your daily routine? Let us know your thoughts in the comments below!