Geospatial data is likely to be more visible in new industries – Mark Johnson, Descartes Labs

Mark Johnson, Co-founder and CEO, Descartes Labs, US

The years ahead will be extremely transformative for the geospatial industry. At the heart of this exciting era is a confluence of three major technological trends: rapidly proliferating fleets of sensors, the maturation of commercial Cloud providers, and advances in machine learning and computer vision.

We see geospatial data playing a much broader and more visible role. Industries that have traditionally harnessed geospatial data will see more use cases unlocked by new sensors and algorithms. But we can also expect geospatial data to become far more visible in new industries, including consumer, healthcare, robotics, personal transportation, and beyond.

Overcoming barriers to innovation

Traditionally, there have been barriers to accessing geospatial data, and these have slowed innovation. Recently, advances in managing and distributing large datasets such as satellite imagery have made it feasible for organizations to look for opportunities to leverage geospatial data. As they discovered value, they sought to push the boundaries of what could be done, and this meant bringing capabilities in house. However, the technology landscape will continue to evolve. Cloud computing and machine learning help organizations leverage large datasets of all types, and geospatial data is no exception.

The key to staying relevant is to experiment, discover value, and act quickly to harness new ideas. This rapid cycle of innovation can be aided by tools. The best tools aren't those used by a single person, but those that enable collaborative experimentation. The hallmarks of these tools are cheap iteration, on-demand data and elastic computing resources, robust management of experiments, and the ability to track successes and failures. The goal is to move quickly from experimentation to a product that delivers significant value, at a cost commensurate with the value of the output. Companies that prioritize innovation, adaptation, and efficiency will win over those that focus on a static array of products.
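As a rough illustration of the experiment tracking described above, the minimal sketch below logs each model run with its parameters and outcome so a team can compare successes and failures cheaply over many iterations. The function and field names are hypothetical and do not refer to any particular product.

```python
import json
import time
from pathlib import Path

# Hypothetical, minimal experiment log: each run is appended as one JSON line,
# so iterations are cheap and successes and failures remain comparable later.
LOG_FILE = Path("experiments.jsonl")

def log_experiment(name, params, metric):
    """Record a single experiment run: its name, parameters, and resulting metric."""
    record = {
        "name": name,
        "params": params,
        "metric": metric,
        "timestamp": time.time(),
    }
    with LOG_FILE.open("a") as f:
        f.write(json.dumps(record) + "\n")

def best_experiment():
    """Return the logged run with the highest metric, or None if nothing is logged."""
    runs = [json.loads(line) for line in LOG_FILE.read_text().splitlines()]
    return max(runs, key=lambda r: r["metric"], default=None)

# Example usage: two iterations of a hypothetical crop-classification model.
log_experiment("crop_classifier", {"bands": ["red", "nir"], "epochs": 5}, metric=0.81)
log_experiment("crop_classifier", {"bands": ["red", "nir", "swir"], "epochs": 5}, metric=0.86)
print(best_experiment()["params"])
```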

Fusion is the future

Recently we have witnessed machine learning and artificial intelligence thriving. The new frontier is sensor data, which enables models of the physical world. In the next three to five years, new commercial and public satellites will increase resolution, observation cadence, and spectral density, and introduce RF, SAR, and other modalities. Every day at Descartes Labs, we discuss disruptive applications that would be possible if the right sensors were available. The good news is that in most cases those sensors are already scheduled for launch.

I think the industry has woken up to the fact that a huge opportunity exists to fuse different types of data, both sensor data and other modalities such as social data, into a single computational environment. It is the coupling of different datasets that will unlock analytics surpassing what has been available in the past.
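To make the idea of coupling datasets concrete, here is a minimal sketch that joins a satellite-derived vegetation index with a separate social signal on a shared region and date key. The column names and values are invented for illustration only.

```python
import pandas as pd

# Hypothetical satellite-derived signal: mean vegetation index per region per day.
satellite = pd.DataFrame({
    "region": ["A", "A", "B"],
    "date": ["2020-06-01", "2020-06-02", "2020-06-01"],
    "ndvi": [0.62, 0.58, 0.71],
})

# Hypothetical non-sensor modality: social-media mentions of drought per region per day.
social = pd.DataFrame({
    "region": ["A", "A", "B"],
    "date": ["2020-06-01", "2020-06-02", "2020-06-01"],
    "drought_mentions": [14, 35, 3],
})

# Fuse the two modalities on a shared space-time key so they can be analyzed together.
fused = satellite.merge(social, on=["region", "date"])
print(fused)
```

The point is not the join itself but the shared computational environment: once disparate datasets are keyed to the same places and times, analytics can draw on all of them at once.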