Life Cycle Approach to Powering Geospatial Data Management


The geospatial industry is focused on gathering data and getting it to the customer in a usable form, but few are concerned with the process itself. That is why decision-makers employ an increasing number of geospatial specialists, and why those specialists are turning to workflows.

Problems are easy to find. Seek out the enemy, help the ally. Define the customer, define the product and define the link between the two. Make a job easier, safer and more cost-effective.

Problems are opportunities for geospatial solutions. They are reasons for professionals in our sector to develop sensors and other technology, and techniques for using that technology to process, exploit, disseminate, analyse and store the data from those sensors.

Back in 2012, Hexagon President and CEO Ola Rollén outlined the geospatial workflow in its simplest terms: “From the real world to the digital world and back.” Expanding, he said, “Capture the real world in real time, using all kinds of sensors. Bring that information into a system where you can sort the information, then present it back to the real world in a comprehensive way so that it’s usable.”

The path between data and the customer is paved with details the geospatial industry continues to address in various ways, too often in silos spawned by competition that inhibit the ability to derive real, comprehensive and creative solutions. Closed workflows can prevent exchanging data to meet needs that can change from minute to minute in a crisis.

At the GEOINT 2013 symposium, Adm. William McRaven, who heads the United States Special Operations Command (USSOCOM), spoke of the frustration his forces experience stemming from systems that do not coalesce. As USSOCOM builds its global mission, it is constructing an intelligence, surveillance and reconnaissance support system that has geospatial aspects, with an emphasis on signals intelligence and human geography as its foundation. But that system has experienced root issues of data gathering and processing incompatibility, slowing geospatial workflows until those issues are overcome. Slow workflows in war endanger people who operate on the edge of conflict, and Special Operations is trying to find ways to knock down data walls between competitors’ systems.

“I think it is about building systems that naturally collaborate with each other,” McRaven said of competing companies creating technology that does not coexist. “This may sound a little Pollyannaish, but I am working on a number of projects now where I have asked my industry partners to be prepared to share their intellectual property with others in a way they haven’t done before. I think I can show them on the back end that, if you assume that risk, your return in the end will be better.” It certainly would be better for Special Forces troops whose risk is already high without complete or conflicting data.

It started with war
The impact of geospatial intelligence on war is nothing new. Hannibal used eyewitness accounts of troop movements and encampment fires to lure the Romans into a trap at Lake Trasimene, killing 15,000 and capturing 15,000 more during the June 217 B.C. battle in the Second Punic War.

William became the conqueror only after waiting for intelligence that told him that King Harold had withdrawn Saxon troops who had been defending the English coastline to send them against Norwegian invaders from the north. After a sexton hung two lanterns in the belfry of Old North Church in Boston, Paul Revere and William Dawes rode through the countryside to warn that the British were coming — by sea — to begin the American Revolution at Lexington and Concord.

Even today, much of what we have in geospatial intelligence had its genesis in war, including the use of full-motion video and 3D imagery from sensors mounted in drones used in Iraq and Afghanistan, and the technology needed to fuse their products into a usable picture.

With government austerity programmes inhibiting additional development, the geospatial industry and its customers are taking up the investment mantle to improve technology and add to it, including products that can be mounted on smaller and less expensive drones in a regulatory environment that is evolving to allow for their increased commercial use. They are also developing satellites small and inexpensive enough to send constellations aloft, with each satellite performing one or a few functions and the constellation combining those functions into a comprehensive system.

In a circular effect, the defence industry is leveraging the advancing commercial technology to reap the additional benefit of new sensors — including those satellites — and workflow enablers to derive solutions to problems that can no longer be mapped on a single country, or region or even continent.

Workflow to the world
The US National Geospatial-Intelligence Agency, the largest single information consumer on the globe, is building a ‘Map of the World’ on which geospatial data from a myriad of sensors and bases can be layered to offer a common operating picture from which disparate customers can work. Its goal is to offer such fidelity and such a complete presentation that it gives the operator a sensory experience with the data, providing a fourth dimension, according to NGA Director Letitia Long.

“In the not-too-distant future, I hope that analysts are able to live within the data … immersed in a multi-sensory, fully integrated environment,” she said. “They may be equipped with advanced visual, auditory and tactile tools and technologies.”

Life or death
One does not have to wear a uniform to find the potential for slow or irregular workflows endangering lives. In their book, Age of Context, Robert Scoble and Shel Israel write of the “connected human”: an interaction between man and machines in a geospatial chain that can prolong life and make that life better lived.

In one scenario, a Redwood City, Calif., firm, Proteus Digital Health, has developed a sensor the size of a grain of sand that is embedded into a pill that is swallowed.

“When the chip mixes with stomach acids, the processor is powered by the body’s electricity and transmits data to a patch worn on the skin,” Scoble and Israel write. “That patch, in turn, transmits the data via Bluetooth to a mobile app, which then transmits the data to a central database, where a health technician can verify if a patient has taken her medications.”

They add that not taking medications as prescribed accounted for $258 billion in medical costs in 2012. An average of 130,000 Americans, and hundreds of thousands of others around the world, die each year because they did not follow prescription regimens properly.

It is not difficult to see the importance of geospatial workflows in lowering those costs. Nor is it difficult to see where silos that inhibit data sharing can shorten lives.

The ‘Map of the World’ uses a data amalgamation from sources as simple as human geography interviews and observations, to social media inputs such as those used to map the Arab Spring as it evolved, to constellations of satellites and, increasingly, mini-satellites that are the products of technology spawned by decreasing sensor costs. That data is coming from members of the US intelligence community, plus service components of the US Department of Defense on a workflow bridge called the Defense Intelligence Information Environment.

Data from all parties is frequently updated in a dynamic mode that addresses an ever-changing world. The quality of data in this context is directly connected to its real-time nature. In fact, temporal filters are used in judging data value.
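A temporal filter of this kind can be sketched as a simple recency weighting. The function names, the exponential-decay form and the 24-hour half-life below are illustrative assumptions, not a description of NGA's actual method:

```python
from datetime import datetime, timedelta, timezone

def temporal_value(observed_at, half_life_hours=24.0):
    """Score an observation by recency: 1.0 when brand new, halving
    every `half_life_hours`. The half-life is an illustrative choice."""
    age_h = (datetime.now(timezone.utc) - observed_at).total_seconds() / 3600.0
    return 0.5 ** (age_h / half_life_hours)

def filter_fresh(observations, min_value=0.25):
    """Keep only observations whose recency score clears a threshold,
    so stale inputs drop out of the common operating picture."""
    return [o for o in observations if temporal_value(o["observed_at"]) >= min_value]
```

With a 24-hour half-life and a 0.25 threshold, anything older than two days is filtered out; a real system would tune both per data source.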

That data will be presented in classified and unclassified forms to address security needs. Products from the ‘Map of the World’ were used in relief efforts after Typhoon Haiyan struck the Philippines late last year.

The dynamic nature of data is being borne out in geospatial maps continually used in disaster relief, with efforts directed by quickly established workflows that streamline data, often walk a security tightrope, and are fed by social media requiring rapid evaluation of validity.

Such maps are also being used to prepare for disaster by determining the potential impacts of sea-level change on shoreline areas around the world. Events like Hurricane Katrina, which devastated New Orleans and the Gulf of Mexico coastline in 2005 at a cost of 1,600 lives; Superstorm Sandy, which caused billions of dollars in damage in New York and New Jersey; and Typhoon Haiyan, which cost more than 6,000 lives in the Philippines in November, sent cartographers and engineers scurrying to their computer models.

By taking advantage of geospatial data and layering inputs such as wetlands data, weather history and projections onto maps, officials from shoreline cities learned just how vulnerable their constituents were. Projecting the impact of specific storm categories onto those maps foretold a potentially catastrophic future. Officials learned that land they thought was high and dry was endangered: a Category 3 hurricane would flood half or more of a city, and Katrina was a Category 5.


Clockwise from top left: Planet Labs’ nano satellites are all set to revolutionise the earth imaging industry; drones offer a cheap and efficient way to collect information; and NGA’s ‘Map of the World’ aims to provide easy access to the agency’s spatially accurate geointelligence data

Those maps supported plans, and pleas for help, to fund schemes to mitigate damage. Geospatial intelligence remains a security staple, with workflows dealing with an increasing amount of data. The Port of Long Beach, Calif., is building a system called ‘Virtual Port’ to fuse data from 20 different sources, 12 of them geospatial, into a workflow that can generate a single-screen common operating picture. That picture includes not only the capability of tracking ship traffic through the vessels’ Automatic Identification Systems, but also warnings of potentially cataclysmic events, such as a terrorist attack on the port’s petroleum tank farm near Los Angeles.

Business cases
The industry has simpler, but no less useful, maps to identify its customers and to place its stores and even its products next to them. Layers of information anticipate — even foster — the next customer need.

The study of economics is increasingly dependent on geospatial data to chart monetary trends and their impact on society around the world. The 2013 Nobel Prize in Economics was awarded to Eugene Fama, Lars Peter Hansen and Robert Shiller for their empirical analysis of asset prices, investigating detailed data on the prices of stocks, bonds and other assets to spot trends. The data included temporal elements, as well as layered events that could impact the trends along the geospatial timeline.

Trends in commerce are perhaps less complex, but no less telling and impactful. Macy’s credits a large part of a 10% increase in its business last year to geospatial data. Kohl’s Department Stores have embarked on a test programme of spatial and temporal data in which customers receive bargains based on where they are standing in the store at the time. A woman will get a message on her mobile phone giving her a 10% discount on a pair of shoes simply because she is standing in the shoe department.
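The Kohl's-style trigger above amounts to mapping an indoor position fix to a department and looking up that department's promotion. The sketch below is purely hypothetical: the floor-plan rectangles, department names and offer text are invented for illustration, not Kohl's actual system:

```python
# Hypothetical store floor plan: department -> (x_min, y_min, x_max, y_max)
# in metres. A real deployment would derive (x, y) from indoor positioning.
DEPARTMENTS = {
    "shoes":   (0.0, 0.0, 10.0, 8.0),
    "apparel": (10.0, 0.0, 25.0, 8.0),
}
OFFERS = {"shoes": "10% off any pair of shoes today"}

def department_at(x, y):
    """Return the department whose floor-plan rectangle contains (x, y)."""
    for name, (x0, y0, x1, y1) in DEPARTMENTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def offer_for_position(x, y):
    """Return the promotion, if any, for the shopper's current department."""
    return OFFERS.get(department_at(x, y))
```

A shopper standing at (5, 4) is in the shoe department and receives the discount; one in apparel, where no offer is defined, receives nothing.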

Google Glass, the latest product from the company that changed the entire geospatial industry with its mapping capability, can allow a customer to view advertisements as he shops in a market.

A utility company can use a sensor mounted on a drone, precisely piloted from the ground, to inspect insulators on a high-voltage line. Information from the drone is downloaded to a system on the ground and catalogued, because the insulators and lines are geo-referenced, and an expert can view the insulators without taking the risk of scaling the poles. Information is everywhere, and the use of processed data is proceeding in lockstep with that information in an expanding geospatial ecosystem.
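The cataloguing step above can be sketched as a nearest-neighbour match between an image's GPS tag and a geo-referenced asset register. The asset IDs and coordinates below are invented for illustration:

```python
import math

# Hypothetical asset register: insulator ID -> (lat, lon) in WGS84.
ASSETS = {
    "insulator-041": (51.5010, -0.1410),
    "insulator-042": (51.5025, -0.1392),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def catalogue_image(lat, lon):
    """Tag an inspection image with the nearest asset's ID."""
    return min(ASSETS, key=lambda a: haversine_m(lat, lon, *ASSETS[a]))
```

With the image geo-tagged and the assets geo-referenced, filing each frame against the right insulator needs no human sorting at all.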

The geospatial industry is leading, if only occasionally, in helping develop and use new geospatial genres and in knocking down the silos that inhibit those genres. More agile workflows are required as the genres develop and combine, and systems with more capacity are needed as the numbers of users, and the information they generate and demand, expand.

Power to the solution
The democratisation of geography must be accommodated to include increased input from social media, which can offer real-time data, often more readily than other sensors. This democratisation was exemplified in a demonstration in Denmark in which citizens were urged to use their iPhones to take pictures of potholes on the nation’s roads. The geo-referenced photos were sent to a government system as inputs used to establish plans, priorities and funding for road repair.
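Turning such crowdsourced reports into repair priorities can be as simple as binning the geo-referenced photos into coarse grid cells and ranking the cells by report count. The cell size and ranking rule below are illustrative assumptions, not the Danish system's actual design:

```python
from collections import Counter

def grid_cell(lat, lon, cell_deg=0.001):
    """Snap a geo-referenced report to a coarse grid cell (~100 m)."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def repair_priorities(reports, top=3):
    """Rank grid cells by how many citizen pothole reports they contain,
    returning the `top` worst cells as (cell, count) pairs."""
    counts = Counter(grid_cell(lat, lon) for lat, lon in reports)
    return counts.most_common(top)
```

A road authority would then translate the worst cells back into street segments; the point is that geo-referencing lets the prioritisation fall out of the data itself.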

Socialisation of geography should come from a fusion of the Web, mobile, the cloud and crowdsourcing. As governments and industries find new uses for geospatial intelligence and demand increased capabilities, the geospatial industry has to move beyond a “here’s the sensor and here’s the software to process it” stance. We have to understand that customers are concerned with the end results: getting the data and advice they need to make decisions. To get that advice, others along the decision chain have to get much of the same data.

The power, then, comes from the solution itself and not the technology. To the decision maker, and therefore to us, the solution should drive the technology to create it, rather than the other way around. The days of the one-size-fits-all workflow are over, if they ever really existed at all.


Clockwise from top: Smartphones and tablets are driving the demand for location data; Google Glass enables a customer to view advertisements as and when he shops in a market; and Hexagon Geospatial 2014 emphasises how each part of the workflow integrates to offer end-to-end solutions

Our industry is focused on gathering that data and getting it to the customer in a usable form, but few at the end of the chain are as concerned with the process. That is why decision-makers employ an increasing number of geospatial specialists, and why those specialists are turning to workflows trending from the horizontal to the vertically focused, to get data to more people in the decision chain. The use of the adage that knowledge is power as a reason to withhold information is being replaced by an understanding that knowledge is power only when the people who need it to make their decisions actually have it.

The geospatial industry is developing templates to use with existing and developing technology to adapt to this vertical trend. Hexagon Geospatial, for example, is fusing sensors and software needed for specific vertical solutions in industries such as mining, agriculture and urban planning, with more applicable industries — such as government property tax assessment — on the horizon. Other companies have gathered capabilities to bring together various steps along the workflow path, but there remains opportunity for ideas from entrepreneurial innovators working in open-source systems to become part of what is estimated to be a $100-billion business.

Indeed, small and medium-sized companies are generating half of this business. Major geospatial companies would do well to combine all their products and capabilities to form a technology umbrella that enables workflow from end to end, from sensor to finished product. For instance, Hexagon’s Geospatial 2014 emphasises how each part of the workflow integrates and interacts with every other part to offer dynamic solutions to dynamic problems.

These capabilities include the continual development of sensors, from high-range video, photogrammetry and non-traditional optical sensors, including LiDAR, to hyperspectral sensors that let satellites see at night and through clouds. Sensor evolution is going to push more people into cloud computing. Desktop processing remains, but it is coupled with mobile capacity driven by an increasing number of smartphones and tablet computers.

The geospatial industry is recognising that opportunity comes through need. It is heeding the challenge thrown down by Rollén at the Geospatial World Forum two years ago: “We must find new ways to promote our technologies and enable people … enable and empower billions of people. Empower the people to create a do-it-yourself system, where you pick and choose.” And it must educate them to realise the options they have in picking and choosing, now and in the future.

Brad Skelton,
Chief Technology Strategist,
Hexagon Geospatial