| Prof. Arup Dasgupta
I always like to ask my students, “What is the most important element of a geospatial project – powerful computers, versatile software, accurate data or intelligent users?” It takes a while before they realise that unless you have an intelligent user, the best computer, software and data are simply not enough to get the work done. An intelligent user can cope with shortcomings in computing power, software and data and still produce useful results. If I ask what the next most important element is, the answer has to be: data.
Data is the theme of this issue. Geospatial data has long been of great importance, but the real fillip came with the integration of computer technology with surveying and mapping. From those early days of computerised cartography to today, the explosion in the variety of data sets and data sources is mind-boggling. Chuck Killpack, our Editor-North America, covers this theme and looks at the new kinds of data driving the industry. Even through the trying recession period, data has shown healthy growth, which in turn is driving technology and applications. There are issues of cost and regulation, but nevertheless data demand is driving the industry to great heights.
During Geospatial World Forum 2011, I had a stimulating talk with Jeff Jonas, IBM Distinguished Engineer and Chief Scientist at IBM Entity Analytics. His focus is on big data and the need for new physics to make sense of it. Geospatial data is nothing if not big, so it is not surprising that he calls geospatial data ‘analytic super food’: it provides a space-time context which, when overlaid with other data, can help make much more sense of that data. To make sense of it we need analytics of a very high and complex order – analytics that will crunch masses of data automatically and direct our attention to the relevant information.
Consider the recent devastating earthquake in Japan. Researchers in Japan studied the Kobe quake and discovered that the destruction was caused by a ‘bubble’ anomaly, a finding they could use to predict another quake. This is an excellent example of the application of analytics to geospatial data. However, the power of this analysis fell short of predicting the recent earthquake. Clearly, there is some way to go in this area. We need more data, better analytics and intelligent users to make sense of the results.