
Mladen Stojic explains the role of location in Big Data analytics

Mladen Stojic, President, Hexagon Geospatial, firmly believes that open data helps support downstream economies that can use data to facilitate better and smarter decision making.

What are the key drivers boosting the global Cloud market?

As organizations accumulate and work with more data, they need more cost-effective solutions to manage it. This does not necessarily mean expanding existing IT infrastructure; it means moving that data to the Cloud and then leveraging streaming and data access technologies alongside other new Cloud capabilities in the market. The total cost of ownership effectively goes down because the content does not have to be managed in-house; it can be managed as a hosted service and ultimately connected to the applications and solutions that use it.

What role do you see for geospatial in Big Data analytics?

In terms of geospatial, everything has a location. If we look at the perfect union of Cloud and IoT in geospatial, we now have the ability to understand how things change and to connect dynamic sensor feeds, whether they are traffic video surveillance or weather sensors, or sensors capturing information about noise pollution or air quality. All of these sensors are now collecting massive amounts of data, and this can then be connected to a geoprocessing service so that analytical insight can be gathered from the data and delivered through dashboards or different BI tools. So, it is the perfect union of Cloud with IoT and analytics, all coming together once again under the assumption that everything has a location.
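The pipeline described above, sensor readings flowing into a geoprocessing step that produces per-location insight for a dashboard, can be sketched in a few lines. This is a minimal illustration with hypothetical field names (`lat`, `lon`, `value`), not any Hexagon product API:

```python
# Sketch: group incoming sensor readings by a coarse spatial cell and
# compute a per-location statistic, the kind a dashboard would display.
# The data shape is hypothetical; real feeds would arrive as streams.
from collections import defaultdict

def summarize_by_cell(readings):
    """Bucket readings into 0.1-degree grid cells keyed by location,
    then return the average value per cell."""
    cells = defaultdict(list)
    for r in readings:
        key = (round(r["lat"], 1), round(r["lon"], 1))  # spatial key
        cells[key].append(r["value"])
    return {cell: sum(vals) / len(vals) for cell, vals in cells.items()}

readings = [
    {"lat": 51.501, "lon": -0.142, "value": 42.0},  # e.g. air-quality index
    {"lat": 51.502, "lon": -0.141, "value": 46.0},
    {"lat": 48.858, "lon": 2.294, "value": 30.0},
]
summary = summarize_by_cell(readings)
```

The point of the sketch is that location is the join key: once every reading carries coordinates, aggregation and downstream analytics become straightforward.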

If you allow opportunity, not fear, to drive your decision making, good things will happen

What role do you think social media is playing in geospatial?

Social media is seeing a lot of activity in the areas of public safety and security. In some of our solutions, we have the ability to connect to social media feeds as they relate to incoming information about an event. Having access to that information, or piping it into a system, gives us the ability to understand what is happening in real time. Being able to sift through feeds in an intelligent way enables you to extract the pieces of information that are critical to feeding other downstream processes, such as running analysis or determining where to deploy emergency responders in times of need. All of that really comes together on social media because crowds are collecting data and ultimately feeding it through social media.
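Sifting a feed "in an intelligent way" can be as simple as keyword matching plus a location check, so only geotagged, event-relevant posts reach downstream analysis. This is a toy sketch with an assumed post shape (`text`, `geo`), not a real social media API:

```python
# Sketch: filter a stream of posts for event-related keywords and keep
# only geotagged matches, so responders can be directed to a location.
EVENT_KEYWORDS = {"flood", "fire", "evacuate"}

def sift(posts):
    relevant = []
    for p in posts:
        words = set(p["text"].lower().split())
        # Keep the post only if it mentions the event AND has coordinates
        if words & EVENT_KEYWORDS and p.get("geo"):
            relevant.append(p)
    return relevant

posts = [
    {"text": "Flood on Main Street rising fast", "geo": (29.76, -95.37)},
    {"text": "Great coffee this morning", "geo": (29.70, -95.40)},
    {"text": "Please evacuate now", "geo": None},  # relevant but untagged
]
matches = sift(posts)
```

A production system would use richer text classification, but the shape is the same: relevance filter first, location filter second, analytics downstream.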

How can such unstructured data be curated to yield stable patterns?

All this data needs to be tied to a location, and I think that is the precondition to success. Without location, it is ultimately very difficult to do downstream analytics or any geoprocessing analysis on the data. So, the first step is to geocode or tag this unstructured data and then make it available so that you can catalog all these inputs and ultimately gain access to them in a geoprocessing or analytical service. You can then fuse this content in some sort of analysis model and ultimately deliver additional insights.
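The geocode-then-catalog step described above can be sketched with a tiny place-name lookup. The gazetteer and record format here are invented for illustration; real systems would use a proper geocoding service:

```python
# Sketch: geocode unstructured text records by matching place names
# against a small gazetteer, then catalog the tagged records so an
# analytical service can later query them by location.
GAZETTEER = {
    "london": (51.507, -0.128),
    "paris": (48.857, 2.352),
}

def geotag(records):
    catalog = []
    for text in records:
        for place, coords in GAZETTEER.items():
            if place in text.lower():
                catalog.append({"text": text, "place": place, "coords": coords})
                break  # first match wins in this simple sketch
    return catalog

records = ["Power outage reported in London", "Nothing to report"]
catalog = geotag(records)
```

Records with no recognizable location fall out of the catalog, which is exactly the interviewee's point: without a location tag, the data cannot feed downstream geoprocessing.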

Intergraph Planning & Response is an application for emergency operations centers and mobile command staff to manage major events.

Data privacy has been a very burning issue of late, especially with regards to location. How do we keep a balance between privacy and security?

It is a very important topic and one that varies by region. In Europe I have seen a trend and a call for open data. But then you get the same agencies or countries calling for open data getting upset about privacy issues when, say, Google is driving around with a mobile device capturing street-level imagery of a neighborhood of a city. Data is just data… it could be image data, mapping data, 3D data… it could be used for great benefit, and there is a huge opportunity in being able to leverage data. I am a firm believer that open data helps support downstream economies that can use data to facilitate better and smarter decision making.

Closed data is certainly being driven from the perspective that if that data gets into the wrong hands, it could be used for a whole variety of unpleasant purposes. If individuals want access to the data, they will find ways to get it. But there are much greater opportunities when you have open data as opposed to not having it open. If you allow opportunity to drive your decision making, good things will happen. If you allow fear to drive your decision-making processes, it limits not only the geospatial economy but also the opportunity of what can be done with open data in society, particularly at the state or local levels. I am of the belief that open data is really good, supports citizen engagement, and opens up the industry to much broader use of not only geospatial services but also applications.

Hexagon’s M.App Chest stores and manages high volume geospatial data.

Every case is unique. You have to look at the interests of each country and then the interests and safety of its citizens, and decide what your priority is. Do you compromise privacy? It is a tough topic. Even organizations like Apple have their terms and conditions. When they collect data about a given individual's location, Apple has every right not to disclose the information to the government. But, again, the government can go through its legal processes to compel Apple to hand over that data. It is a sensitive issue. When it comes to the overall security of the general public, I am a fan of protecting the general public.

With AI affecting geospatial and vice versa, how do you think geospatial can encourage and empower some of these areas?

Geospatial is still unique in that the data is coming from satellite sensors, radars, and mobile sensors. In order to fully leverage the power of that content, you have to know about the sensor and about the metadata of that sensor. Without that knowledge you can still extract information from the sensor, but it is not as powerful; especially in predictive modelling, where you look at trends and changes over time, which is where deep learning and AI can have a very profound impact.

We have to leverage the sensor metadata and understand the physics associated with how the data is collected. Having that knowledge facilitates even more and better information extraction through AI, and also more predictive analytics with deep learning. The opportunity is immense, but we have just started scratching the surface. We are beginning to see a number of start-ups that are focusing on very surgical solutions, solving specific problems through the synthesis and fusion of multi-source content: imagery, point clouds, GIS databases, CAD databases, or BIM. When you start fusing these sensors together as data inputs, we have the opportunity to use very smart algorithms to get even more insight or information from all the inputs. That is where AI and deep learning can have an insightful impact on the industry.