
Standards in policy: Maximise value of data

Steven Ramage
Executive Director, Marketing and Communications
Open Geospatial Consortium (OGC)
[email protected]

Governments and large enterprises routinely create and implement policies that help them maximise the value of their information systems and data. They naturally want to leverage their past investments and plan their future investments in ways that maximise the value to their organisations. They want to reduce technology risk as a software lifecycle management issue. This article explains why OGC and ISO standards must be taken into account in designing such policies.

Standards maximise value
Before making policy recommendations, let's first look at what it means to "maximise the value of geoprocessing systems and geospatial data". That obvious but general need breaks down into several specific classes of user need:

The need to share and reuse data in order to decrease costs (avoid redundant data collection), get more or better information, and increase the value of data holdings.

Organisations and departments within organisations want to access each other's spatial information without copying and converting whole data sets.

This involves passing data and instructions between different vendors' systems; integrating data models, formats and coordinate systems; integrating map displays (symbology) from different data servers; and finding and evaluating data and services held in other locations.

The need to choose the best tool for the job, and the related need to reduce technology and procurement risk (i.e., the need for flexibility in choosing vendors).

Pieces of a solution must work together. Organisations want to be able to add to or augment their current system with a standards-based component from any provider, with minimal integration costs, and have it work seamlessly.

They want to understand the interoperability requirements of stakeholders and help them define architectures that enable data sharing. And they want to make geospatial or location based services usable with mainstream services within their organisations.

The need for more people with less training to benefit from using geospatial data in more applications: that is, the need to leverage investments in software and data.

Once geoprocessing systems work together and work with other systems on the Web, new opportunities arise: users want to organise geospatial data stored in text, video, audio, and other media.

They want to access and process online sensor data from multiple sources, and they want location based services that are portable across devices, networks and providers. 'Semantic translation' between data models becomes a possibility, as does grid computing for geoprocessing applications.

These needs cannot be met unless organisations' geoprocessing systems and components interoperate through standard interfaces and encodings based on the workings of the Internet and the Web.

Yesterday's policies don't meet today's needs
Before the Internet and the Web, a well-run organisation would use in-house experts or consultants to provide interoperability among the geoprocessing systems used in various offices of the organisation. Policies focussed on standardising on specific vendor solutions, standardising on certain vendor-specific or domain-specific proprietary formats and encodings and providing custom or third party tools for connectivity between geospatial systems and between geospatial and non-geospatial systems.

Now, enterprise information and communication technology (ICT) has evolved dramatically and the model described above is outdated. Today, a well-run organisation can best maximise the value of geoprocessing systems and geospatial data by using the expanding framework of Internet and Web standards to enable interoperability both internally and universally.

Standardisation is the reason for the success of the Internet, the World Wide Web, e-commerce, and the emerging wireless revolution. 'Communication' means 'transmitting or exchanging through a common system of symbols, signs or behaviour', and 'standardisation' means 'agreeing on a common system'. Thus we have much more communication today because we have much wider use of common ICT interfaces and encodings.

Yesterday, interoperability depended on tight coupling of systems under centralised control. Today interoperability depends on wide use of standard interfaces and encodings that enable loose coupling of countless systems that can exchange much more information, at much less cost, than was possible yesterday. Today, because open standards enable 'plug and play', much less expertise is required in integrating systems. Also, because Web standards such as HTML and XML make user interface design easier, and because widespread implementation of a small number of user interface design patterns has made millions of people familiar with those patterns, less expertise is required in using the systems.

Policy makers should also keep in mind that advances in technology inexorably change organisational structures, workflows and business models. Today's SDI is different from yesterday's SDI. SDI 2.0 allows distributed or centralised approaches to fit the needs of users. It is built on Web services and online catalogues and registries, not file transfers and manual clearinghouses. It is more adaptable for place-based decision-making. The pace of change requires new thinking about national SDI roles and investments, and a commitment to interoperability based on open standards is essential in dealing with this transition.

Today's procurement policy imperative: Buy products that comply with OGC and ISO standards.
Open, consensus-based standards organisations such as the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF) are responsible for the underlying standards of the Web and the Internet, including the Hypertext Markup Language (HTML), eXtensible Markup Language (XML), and Transmission Control Protocol/Internet Protocol (TCP/IP).

Building on those standards, the OGC, an open consensus standards organisation, sets standards for geoprocessing, which includes capabilities found in geographic information systems (GIS) and digital systems for Earth imaging, Web mapping, location based services, surveying and mapping, CAD-based facilities management, sensor webs, navigation, and cartography. OGC standards are consensus-derived specifications for open interfaces, protocols, schemas and encodings that enable different vendors' systems to exchange data and instructions, and that enable full integration of these capabilities into all kinds of information systems. Several important OGC standards have become ISO standards, and ISO standards for geospatial metadata are important in the use of some OGC standards.
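To make "standard interfaces" concrete, consider the OGC Web Map Service (WMS) standard, under which every conforming server accepts the same key-value request parameters regardless of vendor. The sketch below builds a standard GetCapabilities request URL; the server host name is illustrative, not a real service:

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint -- the host is illustrative. Any server
# implementing the OGC WMS standard accepts these same parameters,
# whichever vendor built it.
BASE_URL = "https://maps.example.gov/wms"

def build_get_capabilities_url(base_url: str, version: str = "1.3.0") -> str:
    """Build a WMS GetCapabilities request: the standard way a client
    asks any WMS server to describe the layers and formats it offers."""
    params = {
        "SERVICE": "WMS",             # which OGC service is addressed
        "REQUEST": "GetCapabilities", # the standard operation name
        "VERSION": version,           # the standard version requested
    }
    return f"{base_url}?{urlencode(params)}"

url = build_get_capabilities_url(BASE_URL)
print(url)
# → https://maps.example.gov/wms?SERVICE=WMS&REQUEST=GetCapabilities&VERSION=1.3.0
```

Because the request syntax is fixed by the standard rather than by any one product, a client written against one vendor's WMS can issue the identical request to another vendor's server, which is the vendor neutrality this article describes.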

Private and government enterprises once based their geospatial procurement policies on the 'yesterday' scenario described above. Today, to maximise the value of investments as described in the first section of this article, organisations should use procurement language that requires vendors to offer standards-based solutions. Tests exist that check for correct implementation of many (but not all) OGC standards. A vendor's product is "Certified OGC-Compliant" for a particular standard if it passes such a compliance test under OGC supervision. When OGC compliance tests do not yet exist for a standard, procurement language should require vendors to offer products that implement OGC standards. In this way, procurements take advantage of best industry practice and of standards developed by consensus among a large body of industry experts. Procurements that mandate standards save time and money as described above.

The OGC Reference Model (ORM), which describes the set of adopted OGC standards and their associated documents, is a useful resource for defining architectures for specific applications.

Such architectures show the relationships between different information resources and specify the particular interface and encoding standards to include in requests for quotes (RFQs), requests for information (RFIs), and prequalification questionnaires (PQQs). (The PQQ helps buyers assess the capabilities of potential providers prior to any RFx being issued.)

What exactly is OGC compliance?
A developer runs an OGC compliance test (available free and online) for a particular OGC standard to determine whether the product correctly implements the standard. After the product passes the OGC compliance test, the developer can seek formal OGC certification of the product and the right to use the Certified OGC Compliant service mark (Figure 1). An OGC service mark on a product gives buyers confidence that the product will work with another product carrying the same OGC service mark, regardless of which company developed either product. For purchasers of geospatial technology products and services, reducing technology risk is an important goal, and the Certified OGC Compliant service mark helps achieve that goal. That is, compliant products are very likely to interoperate with all of the following:

  • Legacy systems that have been upgraded with interfaces that implement OGC standards
  • Components and solutions purchased in the future from different vendors and integrators who all implement OGC standards
  • Systems used by data sharing partners, now and in the future

OGC compliance tests are available for many but not all OGC standards, and currently they are available for server-side products but not client-side products. OGC is constantly increasing the number and percentage of standards for which tests are available.

A product's advertising or packaging may legally carry the Certified OGC Compliant service mark for compliance if the product has, under OGC supervision, passed an OGC compliance test.

This mark is valid for a particular version of a particular standard, and it may only be used if the product developer has paid for the OGC-supervised compliance test and the right to use the mark. A buyer can confirm whether a product is an "implementing product" or is "certified as OGC compliant" by looking for the product name on the OGC website at ogc.org/resource/products.

The OGC service mark in Figure 1 must be used in conjunction with one or more individual OGC standard service marks such as those shown in Figure 2. Version numbers are important. OGC standards are available free online, so any software developer can access and implement them. Products whose encodings and interfaces implement OGC standards, but which have not yet been tested and accepted by OGC as 'Certified OGC Compliant', are referred to as 'implementing products'. The vendor of a product that implements an OGC standard but is not Certified OGC Compliant may not legally claim that the product is compliant with that OGC standard. The vendor of such a product may, however, legally claim that the product 'implements' the standard.

Implementing products are listed on the OGC website (ogc.org/resource/products) along with OGC compliant products. Implementing products might interoperate with other implementing products and with OGC compliant products, but interoperability between two products is more likely when both products have passed OGC-supervised compliance tests and carry the Certified OGC Compliant service mark.
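The point about version numbers can be checked programmatically: a WMS capabilities response declares the standard version it implements as an attribute of its root element. The fragment below is illustrative, not output from a real server, and the parsing sketch assumes a WMS 1.3.0-style response:

```python
import xml.etree.ElementTree as ET

# Illustrative (not real) capabilities fragment. A WMS 1.3.0 response
# uses a WMS_Capabilities root element with a version attribute.
CAPABILITIES_XML = """\
<WMS_Capabilities version="1.3.0"
                  xmlns="http://www.opengis.net/wms">
  <Service>
    <Name>WMS</Name>
    <Title>Example Map Server</Title>
  </Service>
</WMS_Capabilities>
"""

def declared_wms_version(capabilities_xml: str) -> str:
    """Return the WMS standard version a server claims to implement."""
    root = ET.fromstring(capabilities_xml)
    return root.attrib["version"]

print(declared_wms_version(CAPABILITIES_XML))  # prints 1.3.0
```

A procurement team can use a check like this to confirm that a delivered product speaks the specific standard version named in the RFQ, rather than some other version of the same standard.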

This is because the Certified OGC Compliant service mark certifies that the standard has been implemented correctly.

A lot of time and effort goes into planning policies and procurements, and more time and effort goes into implementing and adapting them over time. At the outset and over time, stakeholder organisations can save time and effort by building on a foundation of international standards that are developed and maintained in an open, international process that considers the needs of these stakeholders.

Vendors in the OGC committed significant resources to developing the OGC standards, with input from users, because they subscribe to openness and interoperability in their provision of solutions. They have implemented these standards in products, but they depend on their customers and potential customers to understand the benefits of using such standards and to ask for them.

Public and private sector organisations owe it to themselves, their customers, shareholders, stakeholders, data sharing partners and constituencies to make open standards a central part of their procurement policies, so they can reuse both the massive investment that a global community has made in open standards and the investments the organisations themselves have made.