Apple is developing 3D depth-sensing technology for the rear-facing cameras of its 2019 iPhones, according to a recent report from Bloomberg, as it looks to move a step ahead in augmented reality.
This 3D sensor system will be different from the one found in the iPhone X’s front-facing camera and is said to be the next big step toward turning the smartphone into a leading augmented reality device.
Apple is turning to a different technology from the one currently used in the TrueDepth sensor system on the front of the iPhone X. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach, which calculates the time it takes for a laser pulse to bounce off surrounding objects in order to build a three-dimensional picture of the environment.
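The time-of-flight principle described above reduces to a simple relationship: the sensor measures how long a light pulse takes to travel to a surface and back, and halves the light-travel distance. A minimal sketch of that arithmetic (purely illustrative, not Apple's implementation):

```python
# Illustrative sketch of the time-of-flight principle: distance is
# inferred from the round-trip time of a light pulse.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface, given the pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the total light-travel distance.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds implies a surface
# roughly one meter away.
print(tof_distance(6.671e-9))
```

The nanosecond-scale timings involved are why the report notes that time-of-flight depends on a more advanced image sensor rather than on precise laser placement.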
The company is expected to keep the TrueDepth system, so future iPhones would have both front- and rear-facing 3D sensing capabilities. To that end, Apple has reportedly started discussions with companies that manufacture time-of-flight sensors, including Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. Testing of the technology is still at an early stage, and it could end up not being used in the final version of the phone.
The addition of a rear-facing sensor would enable many more augmented-reality applications on the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself, and has talked it up as much as he does sales growth. “We’re already seeing things that will transform the way you work, play, connect and learn,” he said. “AR is going to change the way we use technology forever.”
Apple also released a software tool called ARKit this year that makes it easier for developers to build AR apps for the iPhone. The tool is good at identifying flat surfaces and placing virtual objects or images on them, but it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real-world objects. An example can be seen in the image below.
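The depth-perception limitation matters because realistic AR needs occlusion: a virtual object should disappear behind real things that are closer to the camera. A hedged sketch of the idea (not ARKit's API; a hypothetical per-pixel depth comparison that a rear depth sensor would make possible):

```python
# Hypothetical illustration: given a per-pixel depth map of the real
# scene, an AR renderer can hide the parts of a virtual object that
# sit behind real-world surfaces.
def occlusion_mask(scene_depth, virtual_depth):
    """Return True where the virtual object is nearer than the real
    scene and should therefore be drawn, False where it is occluded."""
    return [[v < s for s, v in zip(srow, vrow)]
            for srow, vrow in zip(scene_depth, virtual_depth)]

scene = [[2.0, 2.0], [0.5, 2.0]]    # real scene: one obstacle at 0.5 m
virtual = [[1.0, 1.0], [1.0, 1.0]]  # virtual object placed 1 m away
# The pixel behind the 0.5 m obstacle is occluded (False).
print(occlusion_mask(scene, virtual))  # → [[True, True], [False, True]]
```

Without a depth map, a renderer has no scene_depth to compare against, so virtual objects are simply drawn on top of everything — which is the interaction problem described above.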
The iPhone X uses its front-facing 3D sensor for Face ID, a facial-recognition system that replaces the fingerprint sensor used in earlier models to unlock the handset. Production problems with the sensor array initially slowed manufacturing of the flagship smartphone, partly because the components must be assembled to a very high degree of accuracy.
While the structured light approach requires lasers to be positioned very precisely, the time-of-flight technology instead relies on a more advanced image sensor. That may make time-of-flight systems easier to assemble in high volume.