Snapchat’s AR dreams might be starting to look a bit more realistic. Every year the company has subtly improved its AR-powered lenses, polished the technical odds and ends, and strengthened its developer platform. As a result, more than 170 million people — over three-quarters of Snap’s daily active users — now access the app’s AR features daily, the company says. Two years ago, Snap shared that more than 100,000 lenses had been designed by creators on the platform; now it says more than 1 million lenses have been created.
The goofy filters are bringing users to the app, and the company is slowly building a more interconnected platform around AR, one that is starting to look genuinely promising.
The company today unveiled a series of updates at Snap’s annual developer gathering, including voice search for Lenses, a bring-your-own-machine-learning update to Lens Studio, and a geography-specific AR program that will transform public Snaps into spatial data the company will use to map vast physical spaces in three dimensions.
Snapchat’s Lens carousel was adequate for toggling between filters when there were just a few hundred to sort through, but with one million lenses and counting, it’s obvious that Snapchat’s AR ambitions have been suffering from discoverability issues.
Snap is preparing to roll out a new way to sort through Lenses via voice, and if it can nail the feature, the company will have a clear path from entertainment-only AR to a utility-based platform. In its current form, the app’s new voice search will let Snapchat users ask the app to surface filters that help them do something specific.
For its visual search efforts, the company announced new partnerships: it is teaming up with PlantSnap to help Snapchat users identify plants and trees, with Dog Scanner to let users point their camera at a dog and determine its breed, and, later this year, with Yuka to deliver nutrition ratings after users scan a food item’s label.
Snap is counting on developers to bring their neural network models to its platform to enable a more innovative, machine-learning-intensive class of Lenses. SnapML lets creators import trained models that augment users’ surroundings, producing visual filters that transform scenes in more sophisticated ways.
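Snap hasn’t spelled out the import pipeline in these announcements, but as a rough sketch of what “bring your own model” usually looks like in practice, a creator would first export a trained network to a portable format such as ONNX before loading it into Lens Studio. Everything below (the stand-in MobileNet model, the input shape, and the file and tensor names) is an illustrative assumption, not Snap’s documented API.

```python
# Hypothetical sketch: exporting a trained PyTorch model to ONNX, a portable
# format that bring-your-own-model pipelines like SnapML can typically ingest.
# The model choice, shapes, and names here are illustrative, not Snap's API.
import torch
import torchvision

# Any trained network works; a small pretrained classifier serves as a stand-in.
model = torchvision.models.mobilenet_v2(pretrained=True)
model.eval()

# Lens-style pipelines feed camera frames, so trace the export with one RGB frame.
dummy_frame = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_frame,
    "scene_filter.onnx",            # the file a creator would import into Lens Studio
    input_names=["camera_frame"],   # hypothetical tensor names for readability
    output_names=["predictions"],
    opset_version=11,
)
```

From there, the tool’s job is to feed live camera frames into the imported model and wire its outputs into the lens’s rendering; the export step above simply gets the network into a format the editor can read.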
The data sets creators upload to Lens Studio will give their lenses a new set of eyes, allowing them to recognize and track new kinds of objects. Snap also partnered with AR startup Wannaby to give developers access to its foot-tracking software so they can build lenses that let users digitally try on sneakers.
Snapchat starts mapping the world
One of Snap’s major AR announcements last year was a feature called Landmarkers, which let developers build more sophisticated lenses on top of geometric models of famous structures, such as the Eiffel Tower in Paris or the Flatiron Building in NYC, creating geography-specific lenses that played with the real world.
The tech was fairly easy to pull off, if only because the structures Snap selected were so well known that 3D files of their exteriors were readily accessible. The company’s next AR initiative is a bit more ambitious. A new feature called Local Lenses lets Snapchat developers create geography-specific lenses that interact with a wider swath of physical spaces.
Companies working in augmented reality are increasingly competing on 3D data collection. Pokémon GO developer Niantic revealed last month that it would start gathering 3D data from users on an opt-in basis.