‘Geospatial applications consume, require and produce big data. Cloud can address this environment’

Kurnia Wahyudi discusses various issues related to cloud computing and its deployment in the geospatial arena

Kurnia Wahyudi
Country Manager – Cloud Computing Solutions
PT IBM Indonesia

Cloud computing is the buzzword in the geospatial industry and IBM has been providing cloud computing services for a long time. What are the capabilities and benefits of cloud computing for geospatial applications and data?
Cloud computing refers to both the applications delivered as services over the Internet and the hardware and systems software in the datacenters that provide those services. The datacenter hardware and software is what we will call a cloud.

We can approach the use of cloud in geospatial applications from different viewpoints. First is that of the provider of geospatial applications and data. IT organisations – including software providers – are under increasing pressure to deal with the rising costs of delivering and maintaining IT services. When they make an investment, they hope to reduce the risk by implementing the solution at a small scale first. Most IT providers are realising that by leveraging cloud computing they can significantly change the economics of delivering anytime, anywhere, world-class computing services in a flexible and scalable way.

Second is the viewpoint of the user or consumer of geospatial applications and data. When they utilise an IT resource (including an application), their expectations are “transformed” by their personal experience with the Internet. They expect, if not demand, that IT-delivered services incorporate the same rich experience and anywhere-anytime access to applications and data that they have become accustomed to. To respond to rapidly changing market dynamics and address increasing competition and escalating customer and shareholder expectations, organisations need to ensure faster delivery of higher-value products and services. Organisations see cloud computing (in this case, geospatial applications and data provided in the cloud) as THE way to accomplish that.

What are the various workloads that cloud computing can address?
Not all workloads are necessarily ready for a cloud computing deployment model. Also, workloads in different organisations tend to be at different stages of readiness, and geospatial applications are no exception.

For example, different workloads have very different architectural characteristics – think about how Google is optimised for search or massive reading, Amazon for serving web traffic, and Salesforce for a heavy multi-tenant environment. Those areas that have a high degree of affinity with the cloud model – technically and from a risk/reward perspective – include infrastructure-as-a-service and software-as-a-service solutions. Workloads that clients are adopting now include test and development, desktop, collaboration, storage, compute and analytics. All of these are highly standardised – the more standard the environments, the better the economics are going to be.

There are other workloads that, at this time, cannot move to the cloud due to government regulations, criticality or security concerns. New workloads are being made possible due to the benefits of clouds – massive scalability and self-service, economies of scale – such as medical images, fraud detection, or energy management.

IBM takes workloads into account when building its cloud solutions.

Do you agree that the success of cloud capable software is, to a very large extent, dependent on a solid foundation of hardware underneath?
In general, to a certain extent the view is relevant. But then again, one of the key characteristics of cloud computing is resource pooling. A traditional enterprise tends to acquire resources and deploy them in support of a business-function workload one project at a time, or in silos. The resources are dedicated to that workload and are unable to support other workloads where they could be leveraged as added support.

Cloud computing, on the other hand, leverages a pooled-resource environment that uses virtualisation to enable physical assets to support multiple workloads. Driving efficient delivery and enabling the self-service, self-management of cloud computing requires standardisation of the assets (including hardware, software and delivery) as well as automation. This is what delivers a responsive end-user experience. From the end user’s point of view, it is elastic in scalability, accessible from any device, anywhere, any time, and, if charged, payable only for what users use during the time they are using it. From a provider’s perspective, it is about an environment of highly virtualised resources that are location independent and have automated service management to handle provisioning, de-provisioning, change management, security and overall environment controls.

If there is a special requirement for a specific type of hardware, it is usually triggered by the customers’ experience, the optimum environment where the workload can run, and how they value the performance compared to the price. IBM addresses those kinds of needs and levels of customer comfort by providing a choice of platform models.

What is the contribution of the geospatial industry to your cloud computing business?
We don’t have a quantified value for the contribution of the geospatial industry to our cloud computing business. But as part of our global “Smarter Planet” initiative, the geospatial industry has an important role in developing smarter solutions – smarter cities, smarter water grids, smarter traffic, etc. So we believe that the geospatial industry will play an increasingly larger role in our business, driven by the need for the smarter, intelligent and real-time solutions that geospatial application and data providers deliver.

In your view, how is the uptake of implementation of cloud in the geospatial industry in Indonesia?
In my personal view, a major constraint on the implementation of cloud in the geospatial industry in Indonesia is a low level of customer awareness of the technology itself as well as its benefits. Cloud computing – if we refer to Geoffrey Moore’s Crossing the Chasm curve – is still in the “early adopters” phase.

So we still need to put effort into educating the market – both users and IT service/application providers. We do understand that the economic factors driving cloud computing are not new technologies. Rather, it is the combination of existing technologies with a focus on the end user and enabling the end-user experience.

Virtualisation drives higher utilisation, which lowers capital expenses. Standardisation also reduces capital and labour costs, while automation reduces management costs, drives an enhanced user experience and automates many manual tasks, reducing errors and the costs associated with managing an environment.

You can have cloud without these elements, but it will not be a cost effective cloud that delivers with stability and an enhanced end-user experience of self-enablement and self-management.

How can geospatial technology users in the Asia Pacific region benefit from cloud?
Cloud may be applied within the private data centres of these users so that they can take advantage of the best practices that public clouds have established, namely scalability, agility, automation and resource sharing. Creating these private clouds enables these users to focus on innovation for the business, reducing both capital and operational costs and automating the management of complex technologies.

Moreover, the core applications within technology-driven enterprises create the most strategic competitive advantages. Because each enterprise using geospatial technology has its own unique challenges, these are typically large-scale, custom-developed applications whose development, deployment and management can greatly benefit from a cloud-enabled platform. The financial services industry is one of the leading industries to venture down the cloud computing path to utilise these benefits.

The nature of geospatial applications is such that they consume, require and produce big data. The data and the number of customers will increase over time, as will the map layers required. Fluctuation in data as well as in the number of customers will impact various infrastructure requirements. Cloud will address this kind of environment by offering scalability via dynamic (“on-demand”) provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads. Performance is monitored, and loosely coupled architectures are constructed using Web services as the system interface. One of the most important new methods for overcoming performance bottlenecks for a large class of applications is data-parallel programming on a distributed data grid.
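The data-parallel idea mentioned here can be sketched in a few lines: partition the dataset into independent chunks (map tiles, in a geospatial setting), fan them out to workers, and gather the results. The sketch below is a minimal single-machine illustration using Python's standard library; the `process_tile` function is a hypothetical stand-in for a real per-tile computation, and a true distributed data grid would scatter tiles across many machines rather than threads.

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    """Hypothetical per-tile computation; here it just sums the cell values."""
    return sum(tile)

def parallel_map_tiles(tiles, workers=4):
    """Fan the tiles out to a pool of workers and gather results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_tile, tiles))

if __name__ == "__main__":
    # Four "tiles" of raster cell values, processed in parallel.
    tiles = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
    print(parallel_map_tiles(tiles))  # [6, 15, 24, 33]
```

Because each tile is processed independently, capacity scales by adding workers (or machines) rather than redesigning the application, which is exactly the property that lets cloud platforms absorb peak loads on demand.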

Scalability can be measured in various dimensions, such as:

Load scalability: The ability of a distributed system to easily expand and contract its resource pool to accommodate heavier or lighter loads. Alternatively, the ease with which a system or component can be modified, added, or removed, to accommodate changing load.

Geographic scalability: The ability to maintain performance, usefulness, or usability regardless of expansion from concentration in a local area to a more distributed geographic pattern.

Administrative scalability: The ability of an increasing number of organisations to easily share a single distributed system.

Functional scalability: The ability to enhance the system by adding new functionality at minimal effort.

For example, a scalable online transaction processing system or database management system is one that can be upgraded to process more transactions by adding new processors, devices and storage and which can be upgraded easily and transparently without shutting it down.
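Load scalability in particular can be made concrete with a toy autoscaling rule: size the resource pool to the current load, within fixed bounds, so the pool expands under heavy traffic and contracts when demand falls. All the names and numbers below are illustrative assumptions, not any vendor's actual autoscaling API.

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance,
                     min_instances=1, max_instances=20):
    """Illustrative autoscaling rule: provision just enough instances
    for the offered load, clamped between a floor and a ceiling."""
    needed = math.ceil(requests_per_sec / capacity_per_instance)
    return max(min_instances, min(needed, max_instances))

# The pool grows and shrinks as the load rises and falls.
for load in (50, 400, 1200, 80):
    print(load, "req/s ->", instances_needed(load, capacity_per_instance=100))
```

Running this prints 1, 4, 12 and then 1 instance respectively: the same expand-and-contract behaviour the load-scalability definition above describes, applied automatically instead of through manual capacity planning.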

What are the specific offerings from IBM for the geospatial industry?
We have specific products for GIS-based solutions, encapsulated as part of IBM’s Smarter Planet strategy. In a Smarter Planet, almost anything can become digitally aware and interconnected. Smart airports, smart banks, smart roadways, smart cities – with so much technology available at such a low cost, the list of possibilities is endless.

Smart systems are emerging that can improve our lives and have a profound impact on the environmental challenges facing society and the planet. These smart systems are “instrumented” into the fabric of our world and provide new, actionable information in real time that can be used to gain insights, optimise systems and change behaviour in areas such as electricity grids, transportation, upstream petroleum, distribution, and water management. So, GIS plays a part in presenting those Smarter Planet results.

Interoperability and security are concerns associated with cloud computing. How can these issues be addressed?
Having said all this, we know that there are hurdles or obstacles to be overcome for any enterprise looking at a cloud environment. Based on feedback from clients and research from firms like IDC, we know that security remains a top-of-mind concern when discussing cloud adoption. Multiple surveys have also shown that the top inhibitor to moving forward with cloud computing is security.

Just think about the security implications in the consumer space, with the challenges that Facebook and Google have been facing – now apply that to the enterprise. We need standards, security architectures, end points, topologies, etc. So, part of the answer to the security concern lies with the customers themselves. Have they applied a security standard? If they have a security standard applied to their IT systems, what they have to do is cross-check whether the security standard run by the cloud provider follows or meets their own.

It is clear that cloud presents challenges that traditional IT environments do not have. There is obviously a lot to think about in terms of security when moving to a cloud environment. IBM approaches its security strategy in a similar, but adapted, way to our strategic outsourcing business. We primarily look at five angles that I think can be applied universally.

But those security concerns should not become a barrier to an enterprise using the cloud. There is still an alternative: if an enterprise is not comfortable utilising public cloud, it can choose a more appropriate deployment model. There are three deployment models of cloud: private cloud, public cloud and hybrid cloud.

Besides that, an enterprise may use general guidelines as a framework that allows an IT organisation to pick and choose which workloads and data can be moved to the cloud. The advice is:

  • Critical and confidential data or information needs to be processed and held inside secure private systems; this could be held on a private cloud on customer premises or in a service provider’s data centre;
  • Less critical and confidential data or applications could be held in a shared community cloud where the cloud service is shared with a controlled group of other cloud users;
  • Public domain or non-critical information that does not place the company at risk can be held outside the organisation in the public cloud.
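The three guidelines above amount to a simple routing rule: classify data by sensitivity, then place it in the matching deployment model. A minimal sketch of that rule follows; the tier names and the mapping itself are illustrative assumptions based on the bullets above, not a formal IBM classification scheme.

```python
# Illustrative mapping of data-sensitivity tiers to cloud deployment models,
# mirroring the three guidelines above. Tier names are hypothetical.
PLACEMENT = {
    "critical": "private cloud",     # confidential data stays on secure private systems
    "sensitive": "community cloud",  # shared with a controlled group of cloud users
    "public": "public cloud",        # non-critical, public-domain information
}

def place_workload(sensitivity):
    """Choose a deployment model for a workload from its sensitivity tier,
    defaulting to the most restrictive model when the tier is unknown."""
    return PLACEMENT.get(sensitivity, "private cloud")

print(place_workload("public"))    # public cloud
print(place_workload("unlabeled")) # private cloud (safe default)
```

Defaulting unknown data to the private cloud reflects the conservative stance of the guidelines: when in doubt, keep the information inside the secure boundary.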