Convergence of geospatial technologies, both with one another and with other information technologies, has become the need of the hour. Interestingly, at a couple of top geospatial conferences I attended recently, speakers echoed a similar theme. As topics ranged from greenway planning, to the use of laser systems for land-use planning, to automated access, processing and integration of the growing volume of geospatial data through analytics and cloud processing, the speakers provided insight into how geospatial information and technologies underpin success in business processes today.
Worldwide, a massive integration of geospatial information across government and business processes and enterprises is underway to improve situational awareness and decision making.
Location-aware, sensor-filled mobile devices are constantly sending location data into the Cloud and receiving location information from the Cloud
In an article titled ‘Geospatial Workflows Redefining Industry Ecosystem’, the May 2013 edition of Geospatial World highlighted how the geospatial industry has evolved rapidly over the past decade through convergence, collaboration and integration of its constituent technologies. This convergence facilitated end-to-end geo-enabled workflows across vertical industries, transforming the sector from a relatively small, highly compartmentalised industry into a larger, richer and more densely connected one at the centre of a broader geospatial business ecosystem.
The benefits of traditional geospatial technologies are heightened by their convergence with new technologies and approaches such as Cloud, analytics, mobile processing and 3D data collection.
Overall data volume in the IT world is outgrowing data storage capacity. Much of this data is spatial in nature and unstructured. Standards help by facilitating the creation of data that is structured rather than unstructured. Nevertheless, in the “signal-to-noise ratio” analogy, it becomes increasingly important to have the analytical power to filter extraneous data (noise) from meaningful data (signal).
Standards continue to play an important role in this continuing evolution: facilitating the rapid integration of information, supporting the processing and fusion of data sources to yield new levels of insight, and enabling the rapid insertion of new technologies without major disruption.
‘Sensors everywhere’ are creating a vast amount of location data
A parallel convergence is taking place in the world of standards development organisations (SDOs). There is a growing need to anticipate and identify new interoperability challenges. As technologies and industries evolve, and as communities realise the value of data integration and analysis, these challenges can best be met with standards and related best practices. Here are some of the OGC activities that represent this convergence trend:
- GeoPackage: The OGC GeoPackage standard lets developers of lightweight mobile apps tap into a mix of powerful standards-based location services available in the Cloud. GeoPackages are interoperable across all enterprise and personal computing environments, and they can even be sent in emails and text messages. They are particularly useful on mobile devices like cell phones and tablets in communications environments with limited connectivity and bandwidth.
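Part of what makes GeoPackage so portable is that it is a single SQLite file whose layers are catalogued in a standard `gpkg_contents` table. As a rough sketch of that idea, the hypothetical snippet below builds a minimal stand-in file (real GeoPackages are produced by GIS tools and carry many more tables than shown here) and then lists its layers with nothing but Python's built-in SQLite support:

```python
import sqlite3

def list_gpkg_layers(path):
    """Return (table_name, data_type) pairs from a GeoPackage's
    gpkg_contents table -- the standard layer catalogue."""
    conn = sqlite3.connect(path)
    try:
        return conn.execute(
            "SELECT table_name, data_type FROM gpkg_contents"
        ).fetchall()
    finally:
        conn.close()

# Build a minimal stand-in file so the sketch is self-contained;
# the file and layer names here are illustrative only.
conn = sqlite3.connect("demo.gpkg")
conn.execute(
    "CREATE TABLE IF NOT EXISTS gpkg_contents ("
    "table_name TEXT PRIMARY KEY, data_type TEXT NOT NULL)"
)
conn.execute(
    "INSERT OR REPLACE INTO gpkg_contents VALUES ('roads', 'features')"
)
conn.commit()
conn.close()

print(list_gpkg_layers("demo.gpkg"))  # -> [('roads', 'features')]
```

Because any SQLite-capable client can open the file, the same package works on a server, a desktop GIS, or a phone with no connectivity at all.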
- Indoor location: Most of us spend most of our lives indoors, but our mobile devices don’t support indoor navigation as they do outdoor navigation. The candidate OGC IndoorGML Encoding Standard has been developed to provide interoperability between indoor navigation applications, with a link to outdoor location systems. The international participants in the IndoorGML Standards Working Group work in collaboration with indoor navigation information standards groups in other standardisation organisations including ISO/TC204 and the IEEE Robotics & Automation Society (RAS).
- Modelling: Environmental modelling is extremely complex, and there is an explosion of models to address issues such as climate change and water resource availability. The OpenMI Association brought their Open Modelling Interface (OpenMI) into the OGC. The OpenMI Version 2, recently approved as an OGC standard, defines a means by which independently developed computer models of environmental processes, or indeed any processes, can exchange data as they run. This facilitates modelling workflows that chain multiple models together — creating a new capacity for efficiently solving increasingly complex issues.
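The essence of OpenMI's "exchange data as they run" pattern is that a downstream model pulls values from an upstream model at each time step rather than reading a pre-exported file. The toy Python sketch below illustrates only that pull-based linking idea; OpenMI itself is specified for .NET and Java around an `ILinkableComponent` interface, and the model names and formulas here are invented for illustration:

```python
class RainfallModel:
    """Upstream component: supplies rainfall (mm) per time step.
    Toy data stands in for a real simulation."""
    def get_values(self, step):
        return 2.0 * step

class RunoffModel:
    """Downstream component: pulls rainfall from the linked component
    at each step instead of reading a pre-exported file."""
    def __init__(self, rainfall_source, runoff_coefficient=0.4):
        self.source = rainfall_source
        self.c = runoff_coefficient

    def get_values(self, step):
        # Values flow between the models while both are running.
        return self.c * self.source.get_values(step)

chain = RunoffModel(RainfallModel())
print([chain.get_values(t) for t in range(3)])  # -> [0.0, 0.8, 1.6]
```

Chaining a third model onto `chain` would follow the same pattern, which is what lets complex workflows be assembled from independently developed components.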
- Internet of Things (IoT): In today’s world, most sensors have special-purpose and often proprietary software interfaces. This situation requires significant investment in API development with each new sensor or each project involving multiple sensor systems. Standardised interfaces for communicating with sensors and sensor systems permit the proliferation of new high-value services with lower development overhead and wider reach. They will also lower costs for sensor and gateway providers and increase the industry’s overall market potential. An integrated constellation of open standards, including the OGC Sensor Web Enablement standards and the candidate OGC Sensor Web for IoT standard, will soon be essential infrastructure for the IoT.
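The payoff of standardised sensor interfaces is that one client can consume observations from any conformant device without per-device code. The sketch below is not the OGC SWE or Sensor Web for IoT API; it is a hypothetical illustration of the underlying idea, with invented sensor classes that all honour one shared `read_observation()` contract:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    sensor_id: str
    observed_property: str  # e.g. "temperature"
    value: float
    unit: str

# Two unrelated device types exposing the same (hypothetical) interface.
class Thermometer:
    def read_observation(self):
        return Observation("therm-01", "temperature", 21.5, "degC")

class WaterGauge:
    def read_observation(self):
        return Observation("gauge-07", "water_level", 3.2, "m")

# One loop serves every conformant sensor alike -- no per-device API work.
for sensor in (Thermometer(), WaterGauge()):
    obs = sensor.read_observation()
    print(f"{obs.sensor_id}: {obs.observed_property} = {obs.value} {obs.unit}")
```

Adding a new sensor type then means implementing the agreed contract once, rather than writing a bespoke integration for every consuming service.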