A cloud-based approach allows for leaner, cleaner workflows

Jeff Allen, General Manager, Geospatial eXploitation Products, BAE Systems

With consumers increasingly looking for intelligent applications rather than raw geospatial data, the integration of various geospatial technologies is fast becoming a trend, believes Jeff Allen, General Manager, Geospatial eXploitation Products, BAE Systems.

What are the current technology trends in integration of multiple geospatial capabilities?
A major emphasis in integration is to introduce greater photogrammetric rigour (as opposed to a ‘mashup mentality’) into the processing of geospatial data. The capabilities being integrated should be at the same level of maturity and rigour. You would want to avoid passing data from a very mature service to a beta-release open-source service whose maturity is questionable at best, with the potential to break the chain of rigorous photogrammetric processing.

We see a movement away from creating stove-pipe integrations on local ISP resources and towards migrating common geospatial data and services to a cloud-based architecture accessible via open, cloud-friendly geospatial standards. As more geospatial capabilities are made available via the cloud, consumer organisations can acquire only those services and data they need to build the workflows that provide the geospatial information and knowledge they require. Further, as an organisation’s needs and requirements change, it can modify existing workflows or create new ones, again procuring only the data and services that are actually required.
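As one concrete illustration of such a standards-based workflow, an OGC WMS 1.3.0 GetMap request can be assembled the same way regardless of which cloud provider hosts the service. The sketch below is illustrative; the endpoint URL and layer name are hypothetical, not a real service.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height):
    """Build an OGC WMS 1.3.0 GetMap request for a cloud-hosted service.

    Because the interface is an open standard, the same request works
    against any compliant provider: swap base_url, keep the workflow.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical cloud endpoint; only the layer actually needed is requested.
url = wms_getmap_url("https://example.com/geo/wms", "elevation",
                     (18.8, 72.7, 19.3, 73.1), 512, 512)
```

An organisation whose needs change simply points the same workflow at a different layer or provider, rather than rebuilding a stove-pipe integration.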

Raw geospatial data, such as LiDAR and EO imagery, is not what most organisations want when they procure geospatial data and services; instead, they are looking for answers to questions that require intelligent application of that data. For example, cellular service providers do not want to procure and process LiDAR and terrain elevation data. They want answers to questions such as: “What are the best locations for cell towers in the north-central suburban region of Mumbai, India, to provide the best coverage at the least overall cost?”
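The cell-tower question above is, at its core, a coverage-versus-cost optimisation. A toy sketch of one common approach (greedy weighted set cover) is below; all candidate sites, coverage sets, and costs are invented for illustration, whereas a real answer would derive coverage from LiDAR-based viewshed analysis.

```python
def choose_towers(candidates, required):
    """Greedy weighted set cover: repeatedly pick the site with the best
    newly-covered-suburbs-per-unit-cost ratio until everything is covered."""
    uncovered, chosen = set(required), []
    while uncovered:
        site = max(candidates,
                   key=lambda s: len(candidates[s]["covers"] & uncovered)
                                 / candidates[s]["cost"])
        if not candidates[site]["covers"] & uncovered:
            raise ValueError("remaining suburbs cannot be covered")
        chosen.append(site)
        uncovered -= candidates[site]["covers"]
    return chosen

# Hypothetical candidate sites with hand-coded coverage and cost.
sites = {
    "A": {"covers": {"Thane", "Mulund", "Bhandup"}, "cost": 3.0},
    "B": {"covers": {"Mulund", "Powai"},            "cost": 1.5},
    "C": {"covers": {"Thane", "Powai", "Bhandup"},  "cost": 2.5},
}
plan = choose_towers(sites, {"Thane", "Mulund", "Powai", "Bhandup"})
```

The point of a cloud-based service answering this question is that the consumer procures only `plan`, never the underlying LiDAR.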

This points to a need to develop cloud-based, geospatially aware semantic Web capabilities that would answer these ‘geospatial knowledge’ questions. Cloud-based geospatial capabilities would enable organisations to create multi-tiered integrated workflows to populate semantic Web repositories based on their specific problem spaces — again only procuring what is specifically needed.

Is there a need to merge the full spectrum of mapping, remote sensing and analysis capabilities?
A major concern is that what is merged becomes yet another stove-pipe system that is not flexible and is expensive to modify or expand as an organisation’s needs change. A cloud-based approach allows for leaner, cleaner workflows to answer geospatial-based questions.

What are the ongoing trends in effective data management across sensor inputs and databases to support effective integration?
Effective management of sensor inputs and data can only be achieved when all data and services have open, standards-based, mature metadata. Metadata is both a blessing and a curse. It aids greatly in forensic analysis when a significant problem is found with the geospatial data. However, it is often extremely difficult to collect, as the most important metadata must be entered manually by operators.

All of this calls for the geospatial community to adopt and adhere to a metadata model that captures metadata not only about the data itself but also, and perhaps more importantly, about the processes that modify or create that data. The community then needs to develop and publish a set of ‘best practices’ for metadata collection that all members are expected to follow for maximum efficiency and interoperability. This should further encourage application and service providers to make metadata collection and updating more automated, reducing operator involvement.

What are the benefits of simplified workflows?
The first and foremost is lower per-unit cost. Training curves are much flatter, allowing data providers to relax the skillset requirements for operators and giving them much more staffing flexibility. For instance, instead of employing a large number of college- and graduate-level operators to collect and process data, less expensive high school graduates (with training) could perform the basic tasks. The higher-skilled operators could then become team leads for the lower-skilled operators.

Secondly, increased automation in the workflow will change the role of the operator during production. Operators will review results and only extract or edit data where automation is not yet mature enough. When properly applied, automation can lead to even greater efficiency: with less interaction required on well-established production processes, operators can develop workflows for new products that start out operator-intensive and evolve into something much more automated.

What are the other future trends, for example, compatibility with other software and hardware systems?
Cross-platform and cross-application compatibility and interoperability are the desirable goals, with two basic ways to address them: open, standards-based solutions for data and interfaces (e.g., OGC/ISO) and proprietary/bespoke solutions.

Terrestrial LiDAR of Vaughan Mills Mall, Ontario, colour-coded by elevation in the SOCET GXP 3D Multiport. Data courtesy Optech

Open, standards-based solutions have a number of advantages. Vendors develop and optimise their own internal solutions, knowing they only need to adhere to the open standards when interfacing with other applications and services. They can update or replace their algorithms and processing without worrying about how it will affect consumers of their capabilities, and the same guarantee protects them when they themselves consume data and services.

Top: Interactive point measurement allows the review and measurement of tie and control points on separate image panels. Image courtesy NOAA/NGS
Bottom: ABI system events in SOCET GXP using Web services. Image courtesy DigitalGlobe

Vendors only have to address changes to their external interfaces as the open standards evolve over time. When a new open-standard data type becomes available, vendors have the option to update their internal processing to leverage the new data and to enhance their external interfaces to accommodate the new data type.
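The separation described above can be sketched as a stable external contract with swappable internals; the class names and the deliberately tiny algorithms below are illustrative only.

```python
class TerrainService:
    """External contract: consumers code against this and nothing else,
    the way they would against a published open-standard interface."""
    def elevation(self, lon, lat):
        raise NotImplementedError

class NearestPostTerrain(TerrainService):
    """Internal v1: return the elevation of the nearest posted point."""
    def __init__(self, posts):
        self.posts = posts  # {(lon, lat): elevation}
    def elevation(self, lon, lat):
        nearest = min(self.posts,
                      key=lambda p: (p[0] - lon) ** 2 + (p[1] - lat) ** 2)
        return self.posts[nearest]

class MeanTerrain(TerrainService):
    """Internal v2: the algorithm is swapped out entirely, yet consumers
    are unaffected because the external interface is unchanged."""
    def __init__(self, posts):
        self.posts = posts
    def elevation(self, lon, lat):
        return sum(self.posts.values()) / len(self.posts)

posts = {(0.0, 0.0): 10.0, (1.0, 0.0): 20.0}
heights = [svc.elevation(0.4, 0.0)
           for svc in (NearestPostTerrain(posts), MeanTerrain(posts))]
```

The consumer's calling code is identical for both implementations, which is precisely the freedom the interviewee attributes to open standards.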

Bespoke solutions may benefit consumers in cases where standards do not yet exist, or where there are unique requirements such as high performance or high security.