The iPhone 12 Pro and the 2020 iPad Pro include a LiDAR sensor that adds depth mapping to their cameras, and the future points at smartglasses and HUDs. We’ve been investigating its potential.
So what is LiDAR?
If Apple has its way, LiDAR is a technology we are bound to see on a growing number of devices. But how does it work?
LiDAR is a “time-of-flight” technology. Much as radar emits radio waves, it works by sending out laser flashes and timing how long they take to bounce off an object and return. Since the speed of light is a constant, that travel time can be converted into accurate distance data. Sweeping laser light across a two-dimensional grid yields a dense, geo-referenced 3D point cloud: a set of data points in space showing the location, shape, and sometimes even the texture of every object within the scanned area.
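The underlying arithmetic is straightforward; here is a minimal sketch in Swift (the function name and the sample pulse time are our own illustration):

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Converts a measured round-trip time of a laser pulse into a distance.
/// The pulse travels to the object and back, so we halve the total path.
func distance(roundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A pulse that returns after ~33.3 nanoseconds bounced off
// an object roughly 5 meters away.
let meters = distance(roundTripTime: 33.3e-9)
```

At these scales the timing electronics, not the math, are the hard part: resolving centimeters requires measuring delays on the order of tens of picoseconds.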
It comes in especially handy for AR application development: many of the iOS 14 updates to Apple’s AR frameworks take advantage of this technology. That’s understandable, since augmented reality solutions rely heavily on real-world mapping precision to deliver authentic user experiences.
Why it’s better than traditional AR cameras
Previous iPhone and iPad series used a similar AR tool: the TrueDepth camera. Among other capabilities, it powers Apple’s Face ID. TrueDepth is an infrared mesh-building technology: it projects more than 30,000 points onto your face and builds a 3D model, inferring depth from the shape of the light that falls on it.
Unlike TrueDepth, LiDAR measures depth strictly by timing how long it takes light to reach an object’s surface and return to the sensor. The result is much higher precision and a longer scanning range.
LiDAR (left) vs. TrueDepth (right). Image by iFixit
What’s more, the hardware behind today’s laser sensors is relatively cheap. To send and receive laser pulses, the iPad’s LiDAR relies on arrays of vertical-cavity surface-emitting lasers and single-photon avalanche diodes. Their most obvious advantage is that traditional semiconductor fabs can produce them, so quality keeps improving while costs drop year after year (early LiDAR units cost over $75,000), meaning we are likely to see much broader adoption soon.
AR Features enabled by LiDAR
Its core application lets you measure objects in the camera view without taking physical measurements. Apps can create 3D reconstructions of items and even entire environments, and compute their sizes and volumes. The applications are endless.
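On LiDAR-equipped devices, ARKit exposes these capabilities through its scene-reconstruction and depth APIs. A minimal configuration sketch (session setup only; rendering and error handling omitted):

```swift
import ARKit

// Scene reconstruction requires a LiDAR-equipped device
// (2020 iPad Pro, iPhone 12 Pro).
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // Ask ARKit to build a triangle mesh of the surroundings from LiDAR data.
    configuration.sceneReconstruction = .mesh
}
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    // Per-pixel depth values captured by the LiDAR scanner (iOS 14+).
    configuration.frameSemantics.insert(.sceneDepth)
}
// In a real app this would be the session owned by your ARView;
// running it starts delivering ARMeshAnchor updates you can measure against.
let session = ARSession()
session.run(configuration)
```

The `supports…` checks matter: the same code must degrade gracefully on devices without the scanner.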
Precise depth mapping
As we mentioned earlier, LiDAR sends out waves of light flashes and measures the bounce time with its sensor, building a field of points with precise depth values. The result is a three-dimensional grid with exact dimensions for an area and its objects. Among consumer applications, its accuracy is second only to photogrammetry software.
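In ARKit terms, each camera frame then carries a depth map: a buffer of 32-bit floats, one distance in meters per pixel. A sketch of reading the distance to whatever sits at the center of the view (assumes a session running with the `.sceneDepth` frame semantics enabled):

```swift
import ARKit

/// Returns the LiDAR-measured distance, in meters, to the object
/// at the center of the camera view, or nil if depth is unavailable.
func centerDepth(of frame: ARFrame) -> Float32? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // The buffer is kCVPixelFormatType_DepthFloat32: step to the middle row,
    // then index the middle column.
    let row = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}
```

The depth map is much lower resolution than the color image, so apps typically upsample it or combine it with the accompanying confidence map before measuring.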
Virtual light emission
Laser sensors illuminate real-world surfaces with their own light source, meaning they can operate in dark environments and at any time of day. The human eye can’t see the laser pulses, but they show up on a night-vision camera.
Ideas of LiDAR-powered AR apps
Real estate and construction
LiDAR apps improve indoor navigation, guiding visitors and tenants inside large facilities. Their most obvious application, though, is measuring apartments, rooms, and objects, which can then be translated into precise technical drawings complete with measurements and geo-referencing. You can later incorporate the results into architectural and design tools or create full-scale virtual visualizations.
Regular tenants and new property buyers use AR apps to measure their apartments, plan interior decor, and create shopping lists without hiring a designer. Check out IKEA’s AR smart home solution, which lets Apple users virtually test furniture in real time.
Where to use it:
- Architectural toolkits
- Interior design tools
- Navigation solutions
- Staging apps and product catalogs
Retail
With virtual try-ons and showrooms, AR has already become an essential technology for retailers. Improved mapping makes these experiences even more immersive.
Paired with computer-vision technologies, LiDAR has endless uses in retail. It can count people accurately, help enforce social distancing policies, and trace buyers’ journeys so you can plan your sales areas more efficiently.
For example, Hitachi Consulting is now working on a UK mall project, helping the landlords understand buyers’ journeys around its floors. Gathering and analyzing this information will let them improve their offering and possibly adjust their rental strategy.
Where to use it:
- Planning shop layouts
- Identifying buyers’ preferences
- Compliance with social distancing regulations
- Try-before-you-buy experiences
Industrial manufacturing
LiDAR has already found its place in industrial manufacturing. Its object-detection and measurement capabilities help build exact product and object models, facilitating design and catching flaws early.
The most frequent use cases also include equipment automation, autonomous navigation, and collision avoidance for ground vehicles. LiDAR helps self-driving cars build visual maps that keep the vehicle informed about its surroundings. A LiDAR scanner even lets you point a camera at a moving vehicle and measure its speed.
Where to use it:
- Product prototyping
- Quality inspections
- Machine automation
- Autonomous car navigation
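The speed measurement mentioned above follows directly from depth sampling: take the distance to the vehicle at two moments and divide the change by the elapsed time. A minimal Swift sketch (the function name and sample values are illustrative):

```swift
import Foundation

/// Estimates speed (m/s) along the line of sight from two distance
/// samples taken at two timestamps (seconds).
func estimatedSpeed(distance1: Double, time1: Double,
                    distance2: Double, time2: Double) -> Double {
    return abs(distance2 - distance1) / (time2 - time1)
}

// A car that closes from 50 m to 36 m in one second is moving
// at 14 m/s (about 50 km/h) relative to the sensor.
let speed = estimatedSpeed(distance1: 50.0, time1: 0.0,
                           distance2: 36.0, time2: 1.0)
```

Note this only captures the radial component of the motion; a vehicle crossing the field of view needs position tracking in all three axes.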
Education
Again, with its precise mapping, LiDAR-powered AR enables a practically unlimited range of educational applications, especially amid pandemic concerns. Its capabilities are a perfect fit for geo-surveying, engineering studies, medical training, and more.
Here at OTR, we have already been working on an educational AR project and are looking forward to building new apps and enhancing some of the older solutions with LiDAR capabilities.
Where to use it:
- Immersive learning materials
- Geoscience measurements
- Precise anatomical modeling
- Engineering learning tools
Pros and cons of migrating your AR app to iPad LiDAR
So, is it time to migrate your application to Apple’s latest devices? Let’s find out.
According to Business Insider, Apple users spend twice as much money on app purchases as Android users.
Pros:
- Better ranging (5 meters)
LiDAR sends laser flashes at various sections of the user’s space, resulting in an improved scanning distance of up to 5 meters and better tracking of occluded objects.
- Faster operation
Running on Apple’s latest mobile processors, LiDAR apps deliver an excellent user experience at the speed of light.
Cons:
- Limited resolution
Some applications, like a precise Portrait mode, need higher resolution to build fully accurate 3D face models. That is not an issue for regular consumer applications, though.
- Lower precision than photogrammetry
LiDAR’s precision still trails photogrammetry (which combines multiple high-resolution photos taken from different vantage points). This is likely to change as the technology matures.
- Higher device cost
Pro versions of the latest iPad and iPhone cost significantly more than the average smartphone, which may limit your reach, depending on your target audience.
Wrapping it up
By rapidly introducing LiDAR hardware across its product line, Apple is encouraging AR developers to build versatile applications powered by its “time-of-flight” sensing data. The obvious next move is adding depth-mapping sensors to the long-rumored smartglasses and HUDs to deliver even more complex augmented experiences. For now, though, LiDAR’s main arena is mobile AR.
Considering migrating your app to LiDAR-powered iOS devices, or looking for experienced developers to build a brand-new product with precise depth-mapping capabilities? Contact us to get started and receive your project estimate free of charge!