Until now, Augmented Reality has used the mobile phone's camera to "scan" the world around us and display augmented content and experiences when it detects specific images or the floor beneath us.
The range of experiences and use cases is already extensive, with real benefits for companies: increased sales, stronger brand awareness, and better consumer experiences.
Apple recently took a step forward by adding a LiDAR scanner to the iPhone 12 Pro, giving its latest phone new capabilities that enhance the experience for the end user even further.
The LiDAR scanner adds depth-sensing capabilities, making it possible to create AR apps and experiences that integrate more seamlessly and accurately with the world around us, even in poor lighting conditions. It scans the surroundings up to 20 feet away, detecting the objects around the user, from the floor and walls to chairs, tables, and more, and generates a digital model of the environment that AR developers can use to build new and improved experiences.
One of the first companies to bring this new technology to its creators is Snapchat, which included it in Lens Studio 3.2. Snapchat Lens creators can now build LiDAR-powered Lenses for iPhone 12 Pro owners, as you can see here: https://www.youtube.com/watch?v=NS8Dqvi_wIk
Other use cases being enhanced include digital furniture placed precisely around the user to speed up purchase decisions; real-time visualization of wallpapers and paint on the user's walls with great accuracy and control; precise measurement of objects; 3D object scanning to easily create digital twins of real-life objects; and more!
Looking at this AR technology's history in more detail: Google once had Project Tango, which worked in a similar way but required extra sensors on the device (only two devices with this technology reached the market, about four years ago). Google cancelled it to focus on ARCore, a software solution that scans the surroundings using the phone's existing camera and other available sensors and therefore works on many more devices, much like Apple's ARKit. These technologies enabled the launch of thousands of AR apps, including the hit Pokémon Go, which used them to show little digital monsters around the user. With LiDAR, those little monsters can now hide behind a tree or a car, making it harder to catch them all and enriching the experience... This is called occlusion: a digital element (the little Pokémon) is hidden behind a real-world object that the device has detected (the tree or another object).
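For developers curious how this looks in practice, here is a minimal sketch of enabling LiDAR-based occlusion with Apple's ARKit and RealityKit frameworks. It assumes an app that already has an `ARView` on screen; the function name `enableLiDAROcclusion` is illustrative, not part of any API.

```swift
import ARKit
import RealityKit

// A minimal sketch: turn on LiDAR scene reconstruction so that virtual
// content (like a little Pokémon) is hidden behind real-world geometry
// (like a tree or a chair) detected by the device.
func enableLiDAROcclusion(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction is only available on devices with a LiDAR scanner,
    // such as the iPhone 12 Pro, so check for support first.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Ask RealityKit to occlude virtual objects behind the reconstructed mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(configuration)
}
```

On devices without a LiDAR scanner, the support check fails and the session simply runs without mesh-based occlusion, so the same code degrades gracefully across the iPhone lineup.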
So, welcome, LiDAR scanner, to the iPhone 12 Pro; we've been waiting for you for a long time. It's time to create great new AR apps and experiences!