
iPhone 12 Pro And LiDAR tech – Better Than The Galaxy Note 20?

Could Apple’s LiDAR propel the company to the top of the flagship leaderboard?

Most smartphones these days offer decent, and often very impressive, photographic capabilities.

Back In The Lead?

iPhones used to be the go-to for those who wanted outstanding camera performance and overall superior quality.

However, Apple’s rivals quickly took note and improved so much that the performance gap narrowed significantly.

Apple’s LiDAR sensor quickly became popular among technology enthusiasts, especially after they saw what it is capable of in the iPad Pro 2020. The new setup promises improved camera capabilities, though the gains are not that significant yet.

So you might be wondering – Why is LiDAR so special?

LiDAR is short for Light Detection And Ranging. It is a technique that uses lasers to measure distance and depth: laser pulses are emitted from the LiDAR module, and the time between emission and the reception of the reflection is used to calculate distance accurately and rapidly.
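To make that arithmetic concrete, here is a minimal sketch of the time-of-flight math; it is illustrative only, not Apple’s implementation, and the names used are invented for the example:

```swift
import Foundation

// A rough sketch of the time-of-flight math behind LiDAR (illustrative only).
// The speed of light in a vacuum, in metres per second.
let speedOfLight = 299_792_458.0

// The pulse travels to the surface and back, so the one-way distance
// is half of the total path the light covers in the measured time.
func distance(forRoundTripTime seconds: Double) -> Double {
    speedOfLight * seconds / 2.0
}

// Example: a reflection arriving about 33 nanoseconds after emission
// puts the surface roughly 5 metres away.
let metres = distance(forRoundTripTime: 33.3e-9)
print(String(format: "%.2f m", metres)) // ≈ 4.99 m
```

The tiny time scales involved are why dedicated hardware does this measurement rather than general-purpose code.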

Such technology is used in advanced applications like aircraft, spacecraft, and military hardware.

LiDAR is also used in some home appliances, like robot vacuum cleaners, which use it to map their surroundings.

LiDAR is present in self-driving cars as well, where it measures distances to other vehicles, obstacles, pedestrians, and so on.

LiDAR In Smartphones – Why Do They Use It?

Current flagship devices use time-of-flight (ToF) sensors to measure how long infrared light takes to bounce back to the sensor, which improves depth sensing in photography.

You might be wondering why the iPad Pro uses a LiDAR scanner, or why Apple would put one on a smartphone.

The newest LiDAR modules have multiple lasers that pulse to scan the area around them. In contrast, ToF sensors use only a single beam of infrared (IR) light.

Therefore, LiDAR sensors can scan more objects and register more data from the scene they are pointed at. LiDAR can also recognize objects that sit behind other objects.

When paired with specialized algorithms and advanced computational photography, LiDAR can significantly improve the sense of depth.

However, we haven’t observed that on the iPad Pro, which does not support portrait mode with its rear camera module.

What is its purpose then? The LiDAR module of the iPad Pro is better at scanning the surroundings and transposing them into virtual reality.

While the scanner on the iPad Pro works well with some AR applications, like IKEA Place, it cannot scan three-dimensional objects accurately enough to potentially reproduce them with a 3D printer.

At the moment, the list of possible applications of the LiDAR scanner is short.

The Halide camera app is one of the few apps that use some of the capabilities of the LiDAR module. However, there is currently no API that lets developers access anything more detailed than the 3D surface data the scanner captures.

Apple offers ARKit, a framework that helps developers build AR applications.
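As an illustration, the sketch below shows roughly how an ARKit session can opt into the LiDAR-driven features on supported devices; the helper function name and the surrounding app structure are assumptions made for the example, not Apple’s sample code:

```swift
import ARKit

// A minimal sketch: turning on LiDAR-backed features in an ARKit session.
// `startLiDARSession(on:)` is a made-up helper name for this example.
func startLiDARSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction (a live 3D mesh of the surroundings) is only
    // available on LiDAR-equipped devices such as the iPad Pro 2020.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Per-pixel depth from the LiDAR scanner, where the device supports it.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    session.run(configuration)
}
```

Everything beyond that, such as reading raw data straight from the scanner, is out of reach for third-party developers for now.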

Unfortunately, the palette of AR apps is far from generous.

Does the iPhone 12 Pro need LiDAR?

It’s tough to say if a LiDAR sensor is a must, even for high-end devices like the upcoming iPhone.

To put it simply, Apple’s LiDAR module is impressive, a triumph of technology. However, it’s just a piece of hardware. What good is high-spec hardware without software support?

The way things stand now, you most likely don’t need LiDAR on your iPhone unless you develop AR apps or need to work with AR software.

We are sure that the AR capabilities of the iPad Pro 2020 and the upcoming iPhone will someday be extremely relevant, but that is not the case yet.
