Apple gets patent for mobile mapping in iPhones

Apple iPhones are likely to have mobile mapping capabilities soon. Of the 48 newly granted patents for Apple published by the US Patent and Trademark Office on September 27, Patent No. 9,456,307, titled “Electronic device with mapping circuitry”, describes a future mapping application for an iOS device like an iPhone in which data from a laser sensor and positioning circuitry is combined to create an accurate depth map of a user’s surroundings.

How mobile mapping in iPhones will function

The 3D mapping capability that the iPhone 7 Plus camera possesses requires a laser to perform a 3D scan of an environment, and this new patent is said to be linked to the functioning of that camera. The patent notes (Fig 3) that “when the device is in a mapping mode of operation, a user may compress a button, thus activating a laser sensor to generate laser beam that may cause circuitry in the device to gather sample data such as a sample of laser data and a sample of device position data.”

Apple patent application diagram showing how the iPhone 7 Plus 3D camera will use laser to perform a 3D scan of its surrounding environment

While Apple’s patent application doesn’t mention the iPhone in particular, it talks about “an electronic device” which may be provided with “electronic components such as mapping circuitry, a display, communications circuitry, and other electronic components.”

“The mapping circuitry may include a laser sensor and positioning circuitry. During mapping operations with the electronic device, laser sample data and device position data may be gathered at multiple sample points on one or more surfaces. The positioning circuitry may be used to determine the device position and orientation at each point at which laser sample data is gathered. In this way, distances between sample points, surface mapping data, volume mapping data or other mapping data may be gathered while freely moving the device.”

A user may stand in a room and aim a laser beam generated by the laser sensor at a wall. The user may gather laser sample data while pointing the laser at first, second, and third points on the wall. The user can freely move the device between sample points. As the user moves the device, the positioning circuitry monitors the position and orientation of the device. When the user gathers laser sample data at the second and third sample points, the positioning circuitry can determine the relative position and orientation of the device with respect to the position and orientation of the device during the first sample point measurement or with respect to a global coordinate system established at the outset of mapping operations.

The application contains diagrams showing how a laser sensor may provide laser data to a mapping application and other applications (Fig 8) and how laser sample data and device position data may be combined to form mapping data (Fig 9).
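The geometry the patent describes can be sketched in a few lines of code: each laser range measurement, combined with the device's position and orientation at that instant, yields a point in a global frame, and distances between points follow directly. The function name, the frame conventions, and the assumption that the laser fires along the device's local +z axis are illustrative choices for this sketch, not details from the patent.

```python
import numpy as np

def sample_point(device_pos, device_rot, laser_range):
    """Project one laser range measurement into world coordinates.

    device_pos  -- (3,) device position in a global frame (assumed)
    device_rot  -- (3, 3) rotation matrix for device orientation
    laser_range -- distance reported by the laser sensor, in metres
    The laser is assumed to fire along the device's local +z axis.
    """
    beam_dir = device_rot @ np.array([0.0, 0.0, 1.0])
    return device_pos + laser_range * beam_dir

# Two samples aimed at a wall 2.5 m away, taken from different poses.
p1 = sample_point(np.array([0.0, 0.0, 0.0]), np.eye(3), 2.5)

# Device moved 1 m to the side, still facing the wall squarely.
p2 = sample_point(np.array([1.0, 0.0, 0.0]), np.eye(3), 2.5)

# Distance between the two sampled points on the wall.
print(np.linalg.norm(p2 - p1))  # 1.0
```

Because the positioning circuitry reports each pose relative to the first measurement (or a global frame), the device can move freely between samples and the point-to-point distances still come out consistent.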

Apple patent application diagram shows how a laser sensor may provide laser data to a mapping application and other applications

Apple patent application diagram shows how laser sample data and device position data may be combined to form mapping data

The patent application was filed way back in 2013 and credits Vivek Katiyar, Andrzej T. Baranski, Dhaval N. Shah and Stephen Brian Lynch as its inventors.

While many see this development as part of computer vision integration for Apple’s automated car initiative, it can be said with certainty that the technology will be part of the tech giant’s in-house indoor mapping efforts.

Apple’s tryst with mapping

Despite its disastrous track record in mapping in general, Apple has been experimenting with mobile mapping for some time now. In 2015 it launched an Indoor Survey App to help businesses map out their physical footprints. The app would show shoppers around, direct them to particular products, and encourage them to stick around for a bit longer. The description of the app was pretty simple: “By dropping ‘points’ on a map … you indicate your position within the venue as you walk through… As you do so, the indoor Survey App measures the radio frequency (RF) signal data and combines it with an iPhone’s sensor data.”

At the time, it was seen as an attempt by Apple to break into indoor positioning without the need to install special hardware such as its iBeacons, and as part of a broader strategy to corner the mapping market, including taking on arch-rival Google.

New challenge to traditional mobile mapping players

The advent of digital cameras, the rapid development of direct-reading georeferencing technologies, and accurate position location using GPS and Inertial Navigation Systems have given a fillip to mobile mapping. GPS and Inertial Navigation Systems allow rapid and accurate determination of the position and attitude of remote sensing equipment, effectively enabling direct mapping of features of interest without the need for complex post-processing of observed data.
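The direct-georeferencing idea can be illustrated with a minimal sketch: a GPS fix gives the antenna's world position, the INS gives the platform's attitude, and a measured point is rotated and offset into world coordinates with no post-processing. The function name, frame conventions, and lever-arm values below are assumptions for illustration, not taken from any specific system.

```python
import numpy as np

def georeference(gps_pos, ins_rotation, lever_arm, target_in_sensor):
    """Map a sensor-frame measurement directly into world coordinates
    using the GPS position and the INS body-to-world attitude.
    All vectors are in metres."""
    return gps_pos + ins_rotation @ (lever_arm + target_in_sensor)

# Illustrative values: platform at a known GPS fix with level attitude
# (identity rotation), sensor mounted 1 m below the antenna, and a
# target measured 5 m ahead of the sensor.
world_pt = georeference(
    gps_pos=np.array([100.0, 200.0, 50.0]),
    ins_rotation=np.eye(3),
    lever_arm=np.array([0.0, 0.0, -1.0]),
    target_in_sensor=np.array([5.0, 0.0, 0.0]),
)
print(world_pt)  # [105. 200.  49.]
```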

In addition to being bulky and difficult to use, conventional laser scanning devices can measure only the distance from the device to a surface, not the distance between multiple points. Apple’s plan to equip smaller, user-friendly mobile phones with this capability will go a long way toward transforming the mobile mapping industry and throws up new challenges for the existing players.