Apple’s widely rumored upcoming mixed reality headset will make use of 3D sensors for advanced hand tracking, according to analyst Ming-chi Kuo, whose latest research note has been reported on by MacRumors and 9to5Mac.
The headset is said to have four sets of 3D sensors, compared to the iPhone’s single unit, which should give it more accuracy than the TrueDepth camera array currently used for Face ID.
Apple’s AR Headset Features 3D Sensors for Hand Tracking
According to Kuo, the structured light sensors can detect objects as well as “dynamic detail change” in the hands, comparable to how Face ID is able to figure out facial expressions to generate Animoji.
“Capturing the details of hand movement can provide a more intuitive and vivid human-machine UI,” he writes, giving the example of a virtual balloon in your hand flying away once the sensors detect that your fist is no longer clenched.
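The balloon example boils down to edge detection on a hand-pose signal: track per-frame pose estimates and fire an event the moment a clenched fist opens. A minimal sketch in Python, with invented pose values and thresholds (nothing here reflects Apple’s actual APIs or sensor output):

```python
# Hypothetical sketch of turning per-frame hand poses into a "release" event,
# as in the balloon example. Curl values and the 0.8 threshold are invented.

def is_fist_clenched(finger_curls):
    """Treat the hand as a fist when every finger curl exceeds a threshold.
    finger_curls: floats in [0, 1], where 1.0 means fully curled."""
    return all(curl > 0.8 for curl in finger_curls)

def detect_release(frames):
    """Yield the index of each frame where a clenched fist opens."""
    was_clenched = False
    for i, curls in enumerate(frames):
        clenched = is_fist_clenched(curls)
        if was_clenched and not clenched:
            yield i  # the balloon would fly away here
        was_clenched = clenched

# A short sequence: fist for two frames, then the hand opens on frame 2.
frames = [[0.9] * 5, [0.95] * 5, [0.2] * 5]
print(list(detect_release(frames)))  # [2]
```

The point of the finer-grained sensing Kuo describes is that the per-finger curl estimates feeding a function like this become reliable enough to drive such interactions.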
Kuo believes the sensors will be able to detect objects from up to 200 percent further away than the iPhone’s Face ID.
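“200 percent further away” reads as triple the range: the original distance plus twice that distance again. A quick sanity check, using an assumed (not reported) Face ID working distance:

```python
# "Up to 200 percent further away" means 3x the range.
# The 0.5 m Face ID figure below is an assumption for illustration only.
face_id_range_m = 0.5
headset_range_m = face_id_range_m * (1 + 200 / 100)
print(headset_range_m)  # 1.5
```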
Meta’s Quest Headsets Are Capable of Hand Tracking
Meta’s Quest headsets are capable of hand tracking, but it’s not a core feature of the platform and it relies on conventional monochrome cameras.
Kuo’s note doesn’t mention whether Apple’s headset will use physical controllers as well as hand tracking. Bloomberg reported in January that Apple was testing hand tracking for the device.
Apple Reportedly Expects It to Sell About Three Million Units In 2023
Kuo also this week provided some details on what could come after Apple’s first headset. While he expects the first model to weigh in at around 300-400 grams (~0.66-0.88 lbs), a “significantly lighter” second-generation model with an updated battery system and faster processor is said to be planned for 2024.
The first model will arrive sometime next year, according to Kuo, and Apple reportedly expects it to sell about three million units in 2023. That suggests the initial product may well be expensive and aimed at early adopters.
To give this article more context, let’s look at what Apple AR means. Apple’s augmented reality (AR) technology lets developers deliver immersive, engaging experiences that seamlessly blend virtual objects with the real world.
The device’s camera presents the physical world onscreen live, while your app superimposes three-dimensional virtual objects, creating the illusion that those objects actually exist.
Augmented reality starts with a camera-equipped device such as a smartphone, a tablet, or smart glasses loaded with AR software. When a user points the device at an object, the software recognizes it through computer vision technology, which analyzes the video stream.
The Apple VR and mixed reality headset may not be as exciting as the hi-tech Apple Glasses we know are coming, but it is still the next best thing. Granted, it’s a lot closer to today’s best VR headsets, but it’s still a big step toward the augmented reality future Apple is gunning for.
Let’s also talk about 3D sensors. 3D sensing is a depth-sensing technology that boosts camera capabilities for facial and object recognition.
3D sensing technology mimics the human visual system using optical technology, which facilitates the emergence and integration of augmented reality, AI (Artificial Intelligence), and the Internet of Things (IoT).
There are two main technologies that power 3D sensing applications: Time of Flight (ToF) and Structured Light Illumination (SLI). Time of Flight measures how long an emitted light pulse takes to bounce back from a surface. Structured light instead projects a known pattern of light onto an object; the pattern bends to match the inconsistencies in the object’s surface, and a camera with an infrared filter observes the distortion of that pattern to infer depth.
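Both approaches reduce to simple geometry: Time of Flight halves the round-trip time of a light pulse, while structured light recovers depth by triangulation from how far the projected pattern shifts (its disparity). A rough sketch with illustrative numbers (none of these are Apple’s specs):

```python
# Illustrative depth math for the two 3D sensing approaches.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """Time of Flight: distance is half the round trip of the light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

def structured_light_depth(focal_px, baseline_m, disparity_px):
    """Structured light via triangulation: depth = focal length * baseline / disparity."""
    return focal_px * baseline_m / disparity_px

# A pulse returning after ~20 ns implies a surface roughly 3 m away.
print(round(tof_distance(20e-9), 2))  # 3.0
# A 10-pixel pattern shift with a 600 px focal length and 5 cm baseline: 3 m.
print(round(structured_light_depth(600, 0.05, 10), 2))  # 3.0
```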