Here's how much better Snapchat lenses are on the iPhone X
You'd totally do it for the Snap if all your lenses looked like this.
Look, I’m impressed with what app developers and third-party partners have created for Instagram, Facebook, and Snapchat using the middling front-facing cameras of most phones — but it's about to get way more interesting.
Apple’s iPhone X and its TrueDepth camera are about to transform the lenses (or filters) we know and love into something even more powerful and, potentially, enriching.
Apple’s "TrueDepth camera" is a bit of a misnomer: the technology, which sits in the notched-out space at the top of the iPhone X’s spectacular OLED screen, is not just a camera but a collection of sensors.
Packed into that quarter-inch-by-1.5-inch dark space are a 7-megapixel front-facing camera, a dot projector, an infrared camera, and a flood illuminator, all of which work together to let the iPhone X see your face in three dimensions.
The camera marries that depth information with a live picture of you and then uses augmented reality algorithms and the iPhone X’s A11 Bionic chip to create a symphony of real-time facial special effects.
In these early days, as I tested the iPhone X for my monster review, the utility of the TrueDepth camera was limited to a few crucial and fun features: Face ID, which identifies my face out of all others and unlocks the phone with nothing more than a glance; Portrait Mode selfies; and the wildly entertaining Animoji, iMessage character animations driven entirely, and in perfect sync, by your face.
Most intriguing, though, for a Snapchat fan like myself was the special preview I got of a Snapchat update that takes full advantage of the TrueDepth camera’s ability to see every curve, line, and movement of your face.
Snap Inc. is excited about the new technology. A company spokesperson told me that the lenses I've tested are more realistic thanks to faster and more accurate tracking.
In this very limited Snapchat beta app, I got exactly four lenses: luchador face paint, what looks like a porcelain mask, feathers with a jewel tiara, and a flower wreath.
Each lens hugs my face and tracks my expressions and head movements to an extraordinary degree. For example, I can still see the lines on my face through the green luchador lens, giving it the appearance of real face paint.
The Snapchat beta lenses also read the ambient light in the room, giving these filters almost unprecedented realism.
Snap told me that realism comes courtesy of facial depth data provided by the TrueDepth camera, which personalizes each lens to the contours of my face.
What Snapchat still can't do is tell the difference between my face and someone else's. It's combining object recognition with depth information to draw the lenses on each face, but it will do the same thing for any face-shaped object. At one point I had it put one of the new lenses on an Einstein robot toy.
These depth-powered Snapchat lenses are not perfect. The TrueDepth camera doesn’t work if your head is too close. I also noticed almost as much judder (the lens shifting on my face as I moved) on these lenses as I did on ones I’ve used in Instagram and the full version of Snapchat.
Even so, as more developers get their hands on Apple’s face-tracking configuration in the ARKit API and start to test it with the iPhone X and its TrueDepth camera, we'll see more never-before-imagined ways of bringing our own faces into third-party applications. It's going to open entirely new vistas of application interaction.
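For developers wondering what opting into that face-tracking configuration looks like, here is a minimal sketch using Apple's ARKit API on a TrueDepth-equipped device. The class and property names (`ARFaceTrackingConfiguration`, `ARFaceAnchor`, `blendShapes`) are real ARKit types; the `FaceTracker` wrapper and its delegate logic are illustrative only, not how Snapchat actually implements its lenses.

```swift
import ARKit

// Hypothetical wrapper showing how an app can receive the live face
// mesh and expression data that power lenses like Snapchat's.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }

        let config = ARFaceTrackingConfiguration()
        // Ambient light estimation lets rendered lenses match room lighting.
        config.isLightEstimationEnabled = true
        session.delegate = self
        session.run(config)
    }

    // ARKit calls this many times per second as the face moves.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // faceAnchor.geometry is a live triangle mesh of the user's face,
            // which a renderer can texture like face paint.
            let mesh = faceAnchor.geometry

            // blendShapes are 0...1 coefficients for expressions
            // (jaw open, smile, brow raise, and so on).
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            _ = (mesh, jawOpen) // hand off to the rendering layer here
        }
    }
}
```

Note that, as the article observes, ARKit anchors a face mesh to any face-shaped object it detects; identifying *whose* face it is remains a separate problem (handled on the iPhone X by Face ID, not by ARKit).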
As Apple’s global VP of marketing told me recently, when it comes to AR, “pretty much every new developer is thinking about incredible ways to do things that they could never do before.”
Stay tuned for the full Snapchat update, but remember, your lens experience might not be as good without the iPhone X and its TrueDepth camera.
Lance Ulanoff was Chief Correspondent and Editor-at-Large of Mashable. Lance acted as a senior member of the editing team, with a focus on defining internal and curated opinion content. He also helped develop staff-wide alternative storytelling skills and the implementation of social media tools during live events. Prior to joining Mashable in September 2011, Lance Ulanoff served as Editor in Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. While there, he guided the brand to a 100% digital existence and oversaw content strategy for all of Ziff Davis’ websites. His long-running column on PCMag.com earned him a Bronze award from the ASBPE. Winmag.com, HomePC.com, and PCMag.com were all honored under Lance’s guidance. He makes frequent appearances on national, international, and local news programs including Fox News, the Today Show, Good Morning America, Kelly and Michael, CNBC, CNN, and the BBC. He has also offered commentary on National Public Radio and been interviewed by newspapers and radio stations around the country. Lance has been an invited guest speaker at numerous technology conferences including SXSW, Think Mobile, CEA Line Shows, Digital Life, RoboBusiness, RoboNexus, Business Foresight, and Digital Media Wire’s Games and Mobile Forum.