LiDAR in smartphones – business opportunities for 3D scanning

Exploring the use of LiDAR modules in smartphones reveals a wide range of industry applications, including 3D scanning and more.
16 June 2023

Apple first introduced a LiDAR sensor into its range of devices with the 2020 iPad Pro and the module can now be found in the firm’s Pro and Pro Max smartphones as well as the recently announced Vision Pro mixed reality goggles. Image credit: Apple

It’s well-known that modern smartphones are a computing marvel, with the performance of the latest devices topping the specs of many laptops. But smartphone superpowers don’t end there. Adding to the appeal of handsets are the many sensors found inside mobile devices, which for Apple customers includes LiDAR in smartphones. And this technology, thanks to its ranging, 3D scanning, and even blood-measurement capabilities, opens the door to some very interesting use cases.

What makes LiDAR in smartphones so useful?

Tech giant Apple began including LiDAR sensors in its mobile devices with the 2020 iPad Pro, and today all of its top-end smartphones carry the feature. In fact, you could argue that the majority of Apple’s iPhone and iPad products have LiDAR capabilities if you include Face ID, which also has 3D imaging features. But there are some key differences between the rear-mounted LiDAR module and the front-facing Face ID assembly.

The Face ID module projects a patterned point cloud of more than 30,000 infrared dots onto surfaces roughly an arm’s length (25–50 cm) away from the device. If the target surface were perfectly flat, the pattern emitted by the iPhone or iPad would match the image received by the infrared camera element within the Face ID hardware assembly, leaving the shape of the point cloud unchanged.

Any variations in the surface reflecting those infrared dots back to the camera will shift the points either closer together or further apart. And the deviations can be used to infer the shape and physical characteristics of the object – in this case, the user’s face – being imaged.
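
To make the principle concrete, here is a minimal Swift sketch of the triangulation involved: depth falls out of how far each projected dot shifts from its flat-surface reference position. The baseline and focal-length figures below are hypothetical placeholders, not Apple’s actual hardware parameters.

    // Illustrative structured-light maths only; values are assumptions.
    let baseline = 0.02        // emitter-to-camera separation in metres (assumed)
    let focalLength = 600.0    // focal length in pixel units (assumed)

    // Classic structured-light relation: depth is inversely proportional
    // to the observed disparity of a projected dot.
    func depth(fromDisparity disparityPixels: Double) -> Double? {
        guard disparityPixels > 0 else { return nil } // no shift: dot lost or at infinity
        return focalLength * baseline / disparityPixels
    }

    // A dot displaced 40 px from its reference grid position:
    if let z = depth(fromDisparity: 40) {
        print(z) // 0.3 m, i.e. roughly arm's length
    }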

In contrast, the LiDAR sensor found on the rear of Apple’s products emits fewer infrared marker points, comprising a regular grid of 24 × 24 dots. However, each dot is brighter than the Face ID projected points, which gives the LiDAR on the back of handsets and tablets a much greater working range of up to 5 m.

And not only can the system determine shape information from distortions in the grid when the infrared dots are projected onto the scene in front of the sensor, it can also capture depth information based on time-of-flight: laser pulses emitted by the LiDAR chip take longer to return from surfaces that are farther from the device’s infrared source, so timing those reflections reveals the distance to each point.
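
The arithmetic behind time-of-flight ranging is simple: distance is half the round-trip time of a pulse multiplied by the speed of light, as this short Swift sketch shows.

    let speedOfLight = 299_792_458.0 // metres per second

    // Distance is half the measured round trip of the laser pulse.
    func distance(forRoundTrip seconds: Double) -> Double {
        speedOfLight * seconds / 2
    }

    // A pulse returning after ~33 ns puts the surface near the 5 m
    // working range quoted for the rear LiDAR module.
    print(distance(forRoundTrip: 33.3e-9)) // ≈ 4.99 m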

Patents such as US-20200256669 (view PDF) show that Apple uses a sparse array of single photon avalanche diodes (SPADs) to perform a kind of optical stocktake on the whereabouts of the infrared range-finding LiDAR emissions. And the iPhone maker’s TrueDepth Face ID design also features innovations that allow the hardware to perceive distance more accurately, albeit on different length scales.
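
The patent describes Apple’s particular design, but the general SPAD technique is well established: photon arrival times from many repeated pulses are binned into a histogram, and the peak bin marks the most likely round-trip time. A hedged Swift sketch with made-up arrival data:

    let binWidth = 1e-9 // 1 ns timing bins (assumed resolution)
    var histogram = [Int](repeating: 0, count: 64)

    // Arrival times would come from SPAD trigger events over many pulses;
    // these values are fabricated for illustration.
    let photonArrivals = [33.2e-9, 33.4e-9, 33.3e-9, 12.1e-9, 33.5e-9]
    for t in photonArrivals {
        let bin = Int(t / binWidth)
        if histogram.indices.contains(bin) { histogram[bin] += 1 }
    }

    // The most-populated bin gives the dominant round-trip time,
    // rejecting stray counts such as ambient-light triggers.
    if let peak = histogram.indices.max(by: { histogram[$0] < histogram[$1] }) {
        let roundTrip = (Double(peak) + 0.5) * binWidth
        print(299_792_458.0 * roundTrip / 2) // ≈ 5 m
    }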

LiDAR in smartphones benefits photography, helping to focus images even in low-light conditions thanks to the infrared range-finding. And being able to accurately map out a device’s physical surroundings gives developers the opportunity to blend real and virtual worlds to create augmented reality experiences.
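
On iOS, developers tap into this depth data through ARKit’s scene-depth frame semantics. A minimal sketch of switching it on, usable only on LiDAR-equipped devices:

    import ARKit

    func startDepthSession(on session: ARSession) {
        // Scene depth is only supported on devices with the LiDAR module.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth requires a LiDAR-equipped device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.run(configuration)
    }

    // An ARSessionDelegate can then read per-pixel depth on every frame:
    // frame.sceneDepth?.depthMap is a CVPixelBuffer of 32-bit float
    // distances in metres.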

There are a growing number of iPhone and iPad apps that use LiDAR capabilities to automatically generate CAD drawings from optical images gathered from the devices. Commercial software tools such as Autodesk’s AutoCAD have long had the capability to ingest LiDAR point cloud data, but being able to do everything (from capturing the digital measurements to rendering the final file) on your smartphone or tablet streamlines the process considerably.
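
Under the hood, such apps turn the per-pixel depth map into a 3D point cloud by unprojecting each pixel through the camera intrinsics. The pinhole-camera sketch below is illustrative rather than any particular app’s code:

    import simd

    // Back-project a depth pixel into camera space using the pinhole model.
    // K is a column-major intrinsics matrix (as ARKit supplies), with focal
    // lengths on the diagonal and the principal point in the third column.
    func unproject(pixel: SIMD2<Float>, depth: Float,
                   intrinsics K: simd_float3x3) -> SIMD3<Float> {
        let fx = K[0][0], fy = K[1][1]
        let cx = K[2][0], cy = K[2][1]
        let x = (pixel.x - cx) * depth / fx
        let y = (pixel.y - cy) * depth / fy
        return SIMD3<Float>(x, y, depth)
    }

    // Running this over every pixel of the depth map yields the point
    // cloud that CAD tools such as AutoCAD can ingest.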

Smartphones as wellbeing devices

One of the most curious business opportunities opened up by having LiDAR in smartphones is being able to optically probe fluids such as blood or milk to determine the degree of coagulation or characterize fat content. In 2022, researchers showed how it was possible to test a drop of liquid using smartphone LiDAR, and the work highlights the huge potential for mobile devices to support users’ wellbeing.

In the case of using smartphones to measure liquid properties, the University of Washington team makes use of the coherent properties of laser light to gather sample data remotely. Shining beams of infrared laser emission from an iPhone’s LiDAR module onto liquid samples produces speckle patterns formed by constructive and destructive optical interference.

And tracking how these speckle patterns change over time makes it possible to tease out fluidic properties without having to send droplets to the lab for analysis.
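
As an illustration of the idea (not the researchers’ actual pipeline), the change can be quantified by correlating successive speckle frames: the faster the correlation decays, the more the scatterers in the fluid are moving.

    // Frames are flattened arrays of pixel intensities; a real pipeline
    // would pull these from camera frames of the illuminated droplet.
    func pearsonCorrelation(_ a: [Double], _ b: [Double]) -> Double {
        let n = Double(a.count)
        let meanA = a.reduce(0, +) / n
        let meanB = b.reduce(0, +) / n
        var num = 0.0, varA = 0.0, varB = 0.0
        for i in a.indices {
            let da = a[i] - meanA, db = b[i] - meanB
            num += da * db
            varA += da * da
            varB += db * db
        }
        return num / (varA * varB).squareRoot()
    }

    // Correlation of each later frame against the first: the decay rate of
    // this curve is the signal that separates, say, coagulating blood from
    // a free-flowing sample.
    func decorrelationCurve(frames: [[Double]]) -> [Double] {
        guard let reference = frames.first else { return [] }
        return frames.dropFirst().map { pearsonCorrelation(reference, $0) }
    }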