The iPhone 12’s camera specs have finally been revealed – and we now know that the iPhone 12 Pro range will use the new LiDAR scanner on the back. That’s right, the same mysterious black dot that first appeared on the 2020 iPad Pro.
But what is a LiDAR scanner? A built-in lie detector? A more relaxed cousin of radar, perhaps? As we shall see, LiDAR (or “Light Detection and Ranging”) works in a similar way to radar, but uses lasers instead of radio waves to measure distances and depths. This is big news for augmented reality (AR) and, to a lesser extent, photography.
However, the more interesting question is: what does LiDAR enable on the iPhone 12?
But first, a quick look back at the origins of the technology so you can sound smart at your next Zoom family gathering…
What is LiDAR?
The concept behind LiDAR has been around since the 1960s. In short, the technology lets you scan and map your surroundings by firing laser beams and then timing how quickly they return. A bit like how bats “see” with sound waves, only with lasers – which makes it even cooler than Batman’s Batarang.
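The timing idea is simple enough to sketch. As a hypothetical illustration (not Apple’s actual implementation), distance is just the speed of light multiplied by half the round-trip time, since the pulse travels out and back:

```python
# Toy illustration of the LiDAR time-of-flight principle.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """The laser pulse travels to the object and back, so halve the trip."""
    return SPEED_OF_LIGHT * seconds / 2

# A pulse returning after ~33 nanoseconds puts the object ~5 metres away –
# right at the edge of the range Apple quotes for its scanner.
print(distance_from_round_trip(33.4e-9))
```

The numbers make clear why the supporting electronics matter: at room scale, the sensor is timing events a few billionths of a second apart.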
Like most futuristic technologies, it began as a military tool on airplanes before becoming known as the system that enabled the Apollo 15 mission to map the surface of the moon.
More recently, LiDAR (also known as lidar) has been used in self-driving cars to detect objects such as cyclists and pedestrians. You may also have accidentally stumbled upon the technology in your robotic vacuum.
But in the last few years the possibilities of LiDAR have really opened up. As systems get smaller, cheaper, and more accurate, they become viable additions to mobile devices that already have powerful processors and GPS: tablets and phones.
Of course, not all LiDAR systems are created equal. Until recently, the most common types created 3D maps of their surroundings by physically spinning like a radar dish.
This obviously won’t work on mobile devices, so newer LiDAR systems – including the 3D time-of-flight (ToF) sensors seen on many smartphones – are solid-state affairs with no moving parts. But what’s the difference between a time-of-flight sensor and the LiDAR scanner we’re likely to see on the iPhone 12?
What is different about the Apple LiDAR scanner?
You may already be familiar with the time-of-flight (ToF) sensors on many Android phones. These help them capture the depth of the scene and mimic the bokeh effects of larger cameras.
However, the LiDAR system used in the iPhone 12 Pro and iPad Pro 2020 promises to go beyond that. That’s because it is a scanning LiDAR system, rather than the scannerless systems previously used on smartphones.
Scannerless systems use a single pulse of infrared light to build their 3D maps, but a scanning LiDAR system fires a train of laser pulses at different points across the scene over a short period of time.
This has two main advantages – an improved range of up to five meters, and better object occlusion, i.e. having virtual objects convincingly disappear behind real objects such as trees.
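To make the single-pulse versus pulse-train distinction concrete, here’s a toy sketch (our illustration, not Apple’s actual pipeline): a scanning system records one round-trip time per point in a grid, and converting each timing to a distance yields a coarse depth map of the scene.

```python
# Toy sketch: a grid of per-point round-trip times becomes a depth map.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def depth_map(round_trip_times: list[list[float]]) -> list[list[float]]:
    """Convert a grid of round-trip times (seconds) to distances (metres)."""
    return [[SPEED_OF_LIGHT * t / 2 for t in row] for row in round_trip_times]

# A 2x2 toy scene: a near object top-left, a far wall everywhere else.
times = [[6.7e-9, 33.4e-9],
         [33.4e-9, 33.4e-9]]
for row in depth_map(times):
    print([round(d, 1) for d in row])  # ~1 m top-left, ~5 m elsewhere
```

A depth map like this, even at low resolution, is what lets the system know that a virtual object placed at four metres should be hidden by a real one at one metre.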
What’s impressive is that it’s a fast process too, but that speed is only really possible with the latest mobile processors.
As Apple noted at the launch of the 2020 iPad Pro, the data from the LiDAR scanner is processed along with data from cameras and a motion sensor and then “enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of the scene”. In other words, there’s a lot going on to make it seem seamless.
While the iPhone 12’s A14 Bionic processor provides good support for Apple’s LiDAR scanner, there is plenty of room for improvement in the scanner itself.
As a blog post from the developer of the Halide camera app points out, the iPad Pro’s depth data does not currently provide the resolution required for some applications, such as detailed 3D scanning or even portrait mode.
This means that the iPad Pro’s LiDAR scanner is designed more for room-scale applications like playing games or moving AR furniture around in the IKEA Place app. It cannot currently scan 3D objects more precisely than other techniques such as photogrammetry, which instead combines high-resolution RGB photos taken from different angles.
Wouldn’t it be great if the LiDAR scanner’s depth meshes could be combined with the resolution and textures of the RGB cameras or Face ID? That’s the ideal, but we’re not quite there yet – we haven’t had a full look at the iPhone 12 Pro, so we can’t be sure whether it will manage that either.
What can you do with a LiDAR scanner on iPhone 12?
So now we know that the iPad Pro’s LiDAR scanner works best at room scale. What can it do on the iPhone 12? For the average person, the two biggest draws are AR games and AR shopping.
Apple has previewed some LiDAR-specific apps that will be available “later this year” (conveniently, after the iPhone 12 launches). One of the more interesting is the game “Hot Lava”.
Hot Lava, a first-person adventure game for iOS and PC, will have a new AR mode in late 2020 based on Apple’s LiDAR sensor to bring its molten rivers into your living room.
So far, the demo isn’t quite as impressive as we’d hoped – most of the objects your character bounces around are in-game renderings rather than your actual furniture – but there’s still time for it to develop.
Of course, any mention of AR games brings to mind Pokemon Go, the only real augmented reality hit to date. Interestingly, its maker Niantic seems to be forging its own AR path rather than relying on Apple’s technology. It recently announced a new “Reality Blending” feature for Pokemon Go, which allows characters to realistically hide behind real objects such as trees, and it also announced the acquisition of a 3D spatial mapping company called 6D.ai.
This shows that next-gen AR games aren’t necessarily tied to Apple’s LiDAR-based technology or ARKit platform, but the iPhone 12 should at least give you a ringside seat for the AR battle.
But ultimately, the introduction of LiDAR on the iPhone 12 Pro will massively increase the number of apps that use this technology – and that could fundamentally change the iPhone camera.
But what about non-gaming experiences for the LiDAR sensor? So far, the most polished ones seem to be based on interior design. With the IKEA Place app, for example, you can move virtual furniture around your living room like a real-life version of The Sims.
The iPhone 12 Pro’s improved AR placement and occlusion (the ability to hide virtual objects behind real ones) is helpful here, but it’s still not a dazzling new use for the LiDAR scanner.
While the technology is currently more useful to CAD designers and healthcare professionals (if you own an iPad Pro, check out the stunning Complete Anatomy app), there is still plenty of room for creativity and surprise in the next year.
As Halide’s proof-of-concept app Esper shows, the LiDAR sensor could help app developers invent new creative forms that go well beyond traditional photography and video.
In the meantime, it is fair to say that the LiDAR scanner on the iPad Pro and iPhone 12 Pro will initially inspire developers more than it wows technology fans.
They’ll be able to test the future on LiDAR-equipped devices – but the real leap should come when those sensors and apps reach the Apple Glasses.