Apple’s newest innovation, the ‘breakthrough LiDAR Scanner’ in the 2020 iPad Pro, is a first for the mobile industry – a technology Apple patented close to nine years ago. The LiDAR Scanner is often compared to the Face ID scanner on the front of Apple devices. In contrast, the LiDAR Scanner sits beside the rear camera of the iPad Pro and, thanks to LiDAR’s 3D mapping capability, gives the device a far better understanding of its environment.
What is Apple’s LiDAR Scanner?
LiDAR stands for ‘Light Detection and Ranging’ and is well known in the geospatial industry as a revolutionary tool for scanning and surveying. LiDAR detects objects and builds 3D maps of the surroundings in near real-time, allowing the user to see objects as they are.
Apple’s LiDAR Scanner is not at the level of professional LiDAR scanners (such as those used for outdoor surveying and scanning), yet it can measure the distance to surrounding objects up to approximately 5 meters away, works both indoors and outdoors, operates at the photon level at nanosecond speeds, and enables tracking the location of people (spatial tracking). The advantage of the LiDAR Scanner is that it fits into smaller spaces and provides access to low and confined areas where large-scale scanners are difficult to deploy – access that is necessary for precise surveying.
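The ‘photon level at nanosecond speed’ claim comes down to the basic time-of-flight principle behind any LiDAR: distance is recovered from how long a light pulse takes to bounce back. The sketch below illustrates only that physics; the function name is illustrative and this is not Apple’s implementation.

```python
# Conceptual sketch of the direct time-of-flight principle a LiDAR
# scanner relies on: distance = (speed of light * round-trip time) / 2.
# Illustration of the physics only, not Apple's implementation.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Return the distance in meters to a surface given the round-trip
    travel time of a light pulse."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A 5 m target means a 10 m round trip, roughly 33.4 nanoseconds:
print(round(distance_from_round_trip(33.36e-9), 2))  # -> 5.0
```

This is why nanosecond timing matters: at the Scanner’s quoted ~5 m range, the entire round trip of the pulse lasts only tens of nanoseconds.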
Primarily, Apple’s LiDAR Scanner is intended to create cutting-edge depth maps: combined with camera and motion-sensor data, it produces more detailed and accurate three-dimensional (3D) information about a scene and enables instant object placement. The depth sensing from LiDAR, combined with the iPad Pro’s dual rear cameras and sophisticated computer vision algorithms, has the potential to become a powerful scanning platform in the future. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro. Application developers will soon be able to use this depth-mapping capability to transform the mapping industry – especially indoor mapping.
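The core step in fusing per-pixel depth with camera imagery is unprojection: turning a pixel plus its measured depth into a 3D point using the camera’s intrinsics. The sketch below uses the standard pinhole-camera model; the function and parameter names are illustrative, not ARKit API.

```python
# Minimal sketch of combining a per-pixel depth value with camera
# intrinsics (focal lengths fx, fy and principal point cx, cy) to
# recover a 3D point in camera space. Standard pinhole-camera math;
# names are illustrative, not Apple's API.

def unproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float):
    """Map pixel (u, v) with metric depth to a 3D point (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the principal point always unprojects straight ahead:
print(unproject(960, 540, 2.0, fx=1500, fy=1500, cx=960, cy=540))
# -> (0.0, 0.0, 2.0)
```

Doing this for every depth pixel in every frame, and stitching the results together with the motion-sensor pose, is what produces the dense 3D scene representation the article describes.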
Apple’s LiDAR Scanner – Applications
Over the last month, the use of the LiDAR Scanner on the iPad Pro has been widely discussed. While sceptics find the current use of the LiDAR Scanner limited, within its capacity the Scanner and the AR experiences it enables work best in indoor and confined environments and over short distances (up to 5 meters). Augmented by AR apps, the LiDAR Scanner can accurately create a 3D map and model of a home in minutes.
Scan data from the LiDAR Scanner, image data from the cameras, and motion data from the iPad Pro’s array of sensors together simplify the construction of accurate and precise spatial maps of confined and indoor environments in near real-time. Apple’s LiDAR Scanner thus provides a robust and accurate 3D scanning and measuring device (particularly when used with the Measure app), delivering 3D mapping, measurement of interior spaces, and object identification and classification. The Scanner answers the growing demand for mobile scanning in short-range, indoor, and complex environments, and may also be extended to confined outdoor industrial sites.
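Once two points have been placed in the reconstructed 3D map, a Measure-style reading is simply the straight-line distance between them. The sketch below shows that final step; it is illustrative only and not the Measure app’s actual code.

```python
# Sketch of the measurement step in a Measure-style workflow: the
# Euclidean distance between two points already located in the 3D map.
# Illustrative only; not the Measure app's implementation.
import math

def distance_3d(p: tuple, q: tuple) -> float:
    """Straight-line distance in meters between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Diagonal across a 4 m x 3 m room floor:
print(distance_3d((0.0, 0.0, 0.0), (4.0, 0.0, 3.0)))  # -> 5.0
```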
One recent example comes from TeamViewer, which announced an update of its AR-based TeamViewer Pilot to leverage the LiDAR Scanner of the new iPad Pro. On industrial sites in particular, experts and technicians can use TeamViewer Pilot and the LiDAR Scanner together to build an accurate understanding of the environment. Because the LiDAR Scanner understands the physical environment, Pilot can detect physical objects and occlude annotations that lie behind them. This gives both the remote expert and the person in the field a much better understanding of the real situation, significantly improving first-time fix rates and productivity. Additionally, using ARKit, stakeholders can be virtually transported to the field and collaborate in real time, seeing what on-site workers see through the iPad’s new cameras, integrated with AR and the detailed depth information generated by the LiDAR Scanner.
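The occlusion behaviour described above reduces to a per-pixel depth comparison: an annotation is hidden whenever the scanned scene surface along that pixel’s ray is closer to the camera than the annotation itself. The sketch below shows that general technique; names are illustrative, and this is not TeamViewer’s or Apple’s code.

```python
# Sketch of depth-based occlusion for AR annotations: hide the
# annotation when the physical scene is closer to the camera than the
# annotation along the same pixel ray. General technique only; not
# TeamViewer's or Apple's implementation.

def annotation_visible(annotation_depth_m: float,
                       scene_depth_m: float,
                       tolerance_m: float = 0.02) -> bool:
    """True if the annotation sits in front of (or on) the nearest
    physical surface, within a small depth tolerance."""
    return annotation_depth_m <= scene_depth_m + tolerance_m

# Annotation 1.5 m away, wall behind it at 3.0 m: visible.
print(annotation_visible(1.5, 3.0))   # -> True
# Annotation 4.0 m away, but a pillar at 2.0 m blocks it: occluded.
print(annotation_visible(4.0, 2.0))   # -> False
```

Without a depth source like the LiDAR Scanner there is no reliable `scene_depth_m` to compare against, which is why annotations in camera-only AR tend to float in front of everything.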
Futuristic spin on 3D Modelling
Since the Scanner is now easily accessible to customers and surveyors, developers have a strategic role in driving adoption and integration of the technology – engaging users and elevating the technology on industrial sites, for indoor mapping, and in complex and confined environments. The hand-held tool is affordable, lightweight, and multi-functional, covering small areas in almost no time. For architects and industrial designers, this means the ability to measure and model a space accurately. As LiDAR is an essential technology for architects and industrial designers – as also specified in the GEOBIM Maturity Model – the Scanner can be integrated with Shapr3D, a design modelling tool, to make a CAD model of an indoor environment in real time on a mobile device.
It is noteworthy that various studies by LiDAR experts and app developers show that while the LiDAR Scanner makes the iPad useful for sensing 3D spaces, it is not yet accurate enough to feed a 3D printer. vGIS, a leading developer of AR/MR solutions, analysed Apple’s LiDAR Scanner and its use in spatial tracking and found that the Scanner offers improvements in surface scanning and object detection but cannot yet do accurate position tracking. The analysis concluded that the LiDAR Scanner is best suited to indoor environments. Making use of the raw depth data from the LiDAR Scanner still depends heavily on the innovations of application developers, especially in the AR domain (such as Measure, Halide, and Shapr3D).
The official integration of Apple’s LiDAR Scanner in the iPad Pro opens the possibility of future integration in other iOS devices – a rumoured feature of the iPhone 12 models expected later in 2020. At present, the Scanner is best used for scanning, creating maps of confined environments, and building augmented-reality environments in real time. Given that Google is not far behind with Project Soli from its Advanced Technology and Projects (ATAP) group – which uses a radar chip instead of LiDAR to achieve similar objectives – how is the scanning environment going to change? Only time will tell!