With Apple having recently launched its first ‘Pro’ iPhone model – a tier that already has iterations across the company’s Mac, MacBook, and iPad ranges – all eyes were on the upgraded retina display (Super Retina XDR) and the smooth optics of an ultra-modern triple-camera set-up.
Consumers can now capture 4K video on the iPhone 11 Pro, with every photo and video mode boosted by the integrated A13 Bionic chip and its AI system, giving users an unrivalled hand-held portable photo suite. With all of the buzz around the new iPhone, much speculation has emerged around the next generation of iPad Pro. The Korean website The Elec has recently sparked rumors that electronics multinational LG will supply 3D sensing cameras for a possible 2020 release in Apple’s tablet range.
Current murmurs suggest that the camera-lens configuration will closely mimic that of the iPhone 11 Pro and Pro Max; yet the similarities may end there, as Apple could once again shift the paradigm by integrating 3D sensing Time of Flight (ToF) technology, which would vastly increase the amount of data the camera system can capture. Another leap in camera and motion-capture technology would mean enormous potential for Augmented Reality functions and other real-time applications in Apple’s products.
Time of Flight 3D sensors and cameras can measure all the distances in a complete scene in a single shot, similar to the lidar laser technology used in self-driving cars: the sensor calculates how long emitted light takes to reflect off a surface and return to the device.
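The underlying calculation is simple to illustrate. A minimal sketch, assuming an idealized direct-ToF sensor (the function name and example timing are hypothetical, not taken from any Apple or LG specification): the light pulse travels to the surface and back, so the one-way distance is half the round-trip path.

```python
# Illustrative sketch of the core Time of Flight calculation.
# Not a real sensor driver: a hypothetical helper for a direct-ToF
# measurement, ignoring modulation schemes and noise.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in a vacuum

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from the light pulse's round-trip time.

    The pulse covers the distance twice (out and back), so the
    one-way distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving roughly 6.67 nanoseconds after emission
# corresponds to a surface about one metre away.
print(round(tof_distance(6.67e-9), 2))
```

In a real ToF camera this per-pixel timing is resolved in parallel across the whole sensor array, which is what lets it depth-map an entire scene in one shot.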
Applications for ToF technology span both gestured and non-gestured categories: converting movement and human gestures into commands for portable devices and gaming systems, as well as data capture and measurement for 3D photography, object tracking, and augmented reality gaming.
Apple’s Face ID technology is currently supported by an infrared TrueDepth system which allows the front-facing camera to sense depth and distance of an object – in its current application, a user’s face for recognition. However, whereas the TrueDepth technology has a limited range of just a few inches, ToF operates at greater distances, enabling the high-fidelity mapping of an environment in real-time.
This is just one of the reasons experts in the techno-sphere are excited by the new opportunities for augmented reality in Apple devices. Beyond augmented gaming – already a proven route for developers after the success of Pokémon Go – the technology opens new avenues for the art world and for how users interact with their surroundings.
What’s more, exciting opportunities for 3D scanning and tablet-based solutions are also opening up in industries such as architecture, construction, design, and even agriculture, so the next generation of iPad could further accelerate the integration of hand-held technology with machine learning and advanced 3D modelling.
While this all remains speculative for the time being – no official release or confirmation has emerged from either Apple or LG – a collaboration between these two tech giants could revolutionize hand-held device and tablet technologies. Considering Apple is also breaking new boundaries with its chip technology – such as the aforementioned A13 Bionic – the prospect of advanced machine learning working in tandem with 3D ToF sensing in tablets and hand-held devices is hotly anticipated.
If the rumors live up to the hype, we could see this technology assimilated into the next generation of iPhones – and into a world in which immersive experience is taken up a level, as the virtual and the real continue to interact on the same plane.