Apple’s annual fall launch event on September 12 introduced the world to a number of new products including the new Apple Watch Series 3, iPhone 8, Apple TV 4K, and most notably, the iPhone X (pronounced “ten”), the much-anticipated tenth-anniversary iPhone.
Plenty of speculated features—the entire screen being Touch ID-enabled, for instance—didn’t make it into the final product. Many others did, however, such as the removal of the home button, the edge-to-edge screen (albeit with that already-infamous notch), and the introduction of Face ID. Face ID’s lofty security promises are as exciting as they are uncertain, and the underlying tech it sits on is every bit as interesting.
Solving for facial recognition
Handling facial recognition with the accuracy required for reliable, secure Face ID is no small feat, and in the iPhone X, it’s all done through Apple’s new TrueDepth front-facing camera, sitting on top of ARKit. But TrueDepth has many other exciting AR implications beyond Face ID.
When ARKit was released at WWDC back in June, it brought three major capabilities to the table: user motion tracking, plane detection, and light estimation.
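As a rough sketch, here’s how those three capabilities map onto ARKit’s session configuration (the `ARSCNView` here is assumed to be set up elsewhere in your view controller):

```swift
import ARKit

// Assumed to exist in your view hierarchy (e.g. wired up in a storyboard).
let sceneView = ARSCNView(frame: .zero)

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal    // plane detection
configuration.isLightEstimationEnabled = true // light estimation

// User motion tracking is inherent to world tracking: the session fuses
// camera frames with motion-sensor data to track the device in 3D space.
sceneView.session.run(configuration)
```

World tracking is what lets a placed object keep its real-world coordinates as you move the phone around.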
What do these three capabilities allow you to do? For one, you can place an AR model or object on a flat surface, and the object will pick up lighting information from the real environment and hold its real-world coordinates even if you move away. ARKit also integrates seamlessly with SceneKit, so SceneKit features such as physics can be applied to AR models.
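To illustrate that SceneKit integration, here’s a hypothetical `ARSCNViewDelegate` callback that gives a newly detected plane a static physics body and drops a dynamic box onto it (the sizes and positions are illustrative, not from any shipping app):

```swift
import ARKit
import SceneKit

// Called by ARKit when a node is added for a new anchor (e.g. a detected plane).
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    // Give the detected plane a static physics body so objects can rest on it.
    let plane = SCNNode(geometry: SCNPlane(width: CGFloat(planeAnchor.extent.x),
                                           height: CGFloat(planeAnchor.extent.z)))
    plane.eulerAngles.x = -.pi / 2 // SCNPlane is vertical by default; lay it flat
    plane.physicsBody = SCNPhysicsBody(type: .static, shape: nil)
    node.addChildNode(plane)

    // A small box with a dynamic body falls onto the plane under SceneKit gravity.
    let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                       length: 0.1, chamferRadius: 0))
    box.position = SCNVector3(planeAnchor.center.x, 0.5, planeAnchor.center.z)
    box.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
    node.addChildNode(box)
}
```

Because the anchor’s node lives in ARKit’s world coordinate space, the box stays put in the room as you walk around it.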
All of this sounds pretty cool, right? But in reality, the real-life use cases of standalone ARKit (with no help from Core ML or Vision) are somewhat limited. This is where the TrueDepth camera array comes in. Powered by the A11 Bionic chip and its neural engine in the iPhone X, TrueDepth unlocks the full possibilities of ARKit.
What makes a successful AR app?
One of the most well-known augmented reality apps in history is Snapchat, the social network with AR baked right into the center of its premise. The popularity of Snapchat’s facial-recognition filters shows what the general consumer is looking for in an AR app; users might download a fun gimmicky AR app for a day and then delete it, but the AR apps with staying power seem to be designed to help a user test out how things might look: apps useful for tasks like, say, trying to visualize a piece of furniture in your apartment. The TrueDepth camera allows ARKit to bring that same power to faces by allowing a greater accuracy of depth recognition and modeling.
What’s new in the TrueDepth camera array?
TrueDepth is one of the biggest hardware changes in the iPhone X compared to previous iPhones. Some of its components already exist on earlier models (the light sensor and, obviously, the camera itself), but it’s the new kids on the block that make the talking poop emoji from the Animoji demo possible:
- Infrared camera: reads the dot pattern and captures an infrared image, which is sent to the Secure Enclave to confirm a match.
- Flood illuminator: allows face detection even in the dark.
- Dot projector: projects more than 30,000 invisible dots onto your face to build a unique depth map.

On top of letting you unlock your iPhone with your face, Apple has expanded ARKit with the new ARFaceTrackingConfiguration, which brings that face data into any ARKit app. That’s the power of Snapchat, available to all developers and users for their own applications and needs.
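A minimal sketch of what that looks like in code, assuming a TrueDepth-equipped device: run a face-tracking session, then read expression data off each `ARFaceAnchor` in a session delegate.

```swift
import ARKit

// Assumed to exist in your view hierarchy.
let sceneView = ARSCNView(frame: .zero)

// Face tracking requires the TrueDepth camera, so check support first.
if ARFaceTrackingConfiguration.isSupported {
    sceneView.session.run(ARFaceTrackingConfiguration())
}

// In an ARSessionDelegate, each ARFaceAnchor carries a face mesh plus
// blend shapes: normalized coefficients for expressions like a jaw drop
// or an eye blink - the raw material behind Animoji-style effects.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        if let jawOpen = faceAnchor.blendShapes[.jawOpen] as? Float {
            // 0.0 means the jaw is closed, 1.0 means fully open.
            print("Jaw open: \(jawOpen)")
        }
    }
}
```

Driving a 3D character is then a matter of mapping those coefficients onto your model’s own morph targets each frame.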
ARKit and TrueDepth, together at last
If you combine ARKit and TrueDepth, you’ve got the power to model not only external objects, but your own image. Imagine you own a hair salon and have a customer debating between two hairstyles; with a new AR app you could show them exactly what they would look like in each style. Or say you run a wedding hall and a soon-to-be bride wants to know which table placement she would prefer without actually doing the manual work? Done.
Even more exciting, you can combine both into one app. What if you went to the American Museum of Natural History in NYC, and virtual dinosaurs gave you a personal tour of the museum, their facial movements pre-recorded from someone speaking the very same words? Cool, right?
The possibilities for this technology are truly profound, and we’re excited to see what the development community does with it once the iPhone X is in people’s hands.