OGC using data standards for urban planning

OGC members from the geospatial and built environment communities help inform, develop, and apply the standards that can knit together the successful cities of tomorrow.


We hear the term “smart cities” a lot these days, but asking people to define what an ideal smart city actually is often leaves them at a loss for words. It’s a hazy concept that has come to represent the future of urbanism, and often the explanation comes down to something like this: “[mumble mumble mumble] BIG DATA!”

As a standards organization, the Open Geospatial Consortium (OGC) operates with a specific idea of what a smart city should look like: one where data from disparate sources and isolated systems can be thoughtfully combined in new ways to build solutions to city challenges and benefit citizens and other stakeholders. Datasets that were never “made for one another” are nonetheless able to enrich each other because they literally describe a common ground in the urban environment. By integrating data from various systems throughout a city, we gain improved understanding of a city and its health.

Creating Urban Models

A helpful concept for smart cities is that of the Digital Twin. In a functioning Digital Twin, data describing the real world is continuously and seamlessly ingested, modeled, and analyzed. The resulting virtual world evolves synchronously and nearly simultaneously with its real-world counterpart, opening a quantitative window into how the real world behaves. It can also show the potential impact of real-world changes, such as sudden flooding events or gradual population shifts, and what might be done to prepare for, or respond to, these events.

While a 1:1 real-time Digital Twin will remain unachievable for the foreseeable future, the concept illustrates the value of creating an “urban model” that represents just the essential elements of a city using data pulled from different sources. Unifying the required data doesn’t have to be difficult: many of the mechanical problems of joining disparate data sources can be solved by using current technologies coupled with common formats and interfaces — standards.
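
To make this concrete, here is a minimal sketch of how two datasets that were never “made for one another” can be combined simply because they describe the same ground. The air-quality readings, neighborhood boundaries, and field names below are invented for illustration, and the geometry test relies on the Shapely library; a real city would substitute its own data, but the join itself needs nothing more than a shared coordinate reference and a common format such as GeoJSON.

```python
# A minimal sketch: joining two unrelated city datasets purely by shared geography.
# The neighborhood polygon and sensor readings are illustrative placeholders.
from shapely.geometry import shape, Point

# Neighborhood boundaries, e.g. exported from a city GIS as GeoJSON-style features.
neighborhoods = [
    {"name": "Riverside",
     "geometry": {"type": "Polygon",
                  "coordinates": [[[4.46, 51.90], [4.50, 51.90], [4.50, 51.93],
                                   [4.46, 51.93], [4.46, 51.90]]]}},
]

# Air-quality readings from an unrelated sensor system, also carrying coordinates.
readings = [
    {"sensor_id": "aq-017", "pm25": 12.4, "lon": 4.48, "lat": 51.91},
    {"sensor_id": "aq-021", "pm25": 30.1, "lon": 4.55, "lat": 51.95},
]

# Spatial join: attach each reading to the neighborhood polygon that contains it.
polygons = [(n["name"], shape(n["geometry"])) for n in neighborhoods]
for r in readings:
    point = Point(r["lon"], r["lat"])
    r["neighborhood"] = next((name for name, poly in polygons if poly.contains(point)), None)

print(readings)
```

Nothing in either dataset refers to the other; a shared coordinate reference is enough to let them enrich one another.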

OGC has a strong portfolio of standards relevant to urban models and to helping cities become “smartly functional”. However, a successful urban model also requires bridging two previously disparate digital perspectives: the integrative, layered geospatial perspective used to understand how different elements of a city or landscape fit together; and the constructive, detail-oriented Building Information Model (BIM) perspective used to build and maintain the individual structures that fill a city.

This isn’t just a question of competing scales. What makes the urban model powerful is also what makes it tricky to integrate. Contained within BIM is a wealth of detailed semantic information describing structural components and how they fit together into functioning buildings or other elements of the built environment. Geospatial models map features and how they relate to one another by type and position. The question is which parts of a BIM’s conglomerate of brick and steel correspond to the map symbol laid out with many others across a neighborhood, and which parts are worth maintaining in a city-wide model that represents human interests.

Aligning Paradigms

This alignment of the two paradigms is tough, but by no means insurmountable: OGC is actively collaborating with buildingSMART International through their Integrated Digital Built Environment (IDBE) Subcommittee to jointly develop standards that allow the details — both semantic and spatial — of BIM models to take their place within the greater context of city-wide geospatial models. The IDBE Subcommittee will soon release its first technical documents.

The challenge of assembling city data into meaningful and actionable urban models has many aspects: bringing data together from diverse systems; reconciling the very different perspectives of the BIM builder and the geospatial analyst; and collecting, aggregating, processing, and integrating information from single-purpose information systems, sensors, and other sources, then making that data easily available to the decision-making applications that city stakeholders depend on. Ideally, each component and stage of such a system can contribute to new models and applications through conformance with data access standards. The reality today, however, is that many such systems have already been installed, but without the standards support needed to share their data. What is needed is a sensible and effective approach for a city to become “smarter” in place.
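
As a sketch of what conformance with a data access standard buys, the snippet below retrieves features from a hypothetical service implementing OGC API - Features; the resource paths (/collections and /collections/{collectionId}/items) and the bbox and limit parameters come from the standard's core, while the server URL and collection name are invented for illustration.

```python
# A minimal sketch of standards-based data access via OGC API - Features (Part 1: Core).
# The server URL and collection id are hypothetical; the resource paths come from the standard.
import requests

BASE = "https://city.example.org/ogcapi"   # hypothetical endpoint
COLLECTION = "street-trees"                # hypothetical collection

# List the collections the service offers.
resp = requests.get(f"{BASE}/collections", headers={"Accept": "application/json"})
print([c["id"] for c in resp.json()["collections"]])

# Fetch items from one collection as GeoJSON, limited and filtered by bounding box.
items = requests.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={"limit": 50, "bbox": "4.3,51.85,4.6,52.0"},
    headers={"Accept": "application/geo+json"},
).json()
print(len(items["features"]), "features returned")
```

Any client that speaks the standard can consume the same endpoint, which is exactly what allows components installed for one purpose to feed models and applications that did not exist when they were deployed.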

Realizing this, in 2016 OGC headed the European Innovation Partnership on Smart Cities and Communities (EIP-SCC) ESPRESSO program. ESPRESSO was created to develop a conceptual “Smart City Urban Platform” based on open standards. Two major pilot programs were established in the Dutch city of Rotterdam and in Tartu, Estonia. The Rotterdam pilot updated its existing 3D city model using CityGML and began integrating new forms of sensor information, as well as data previously collected manually. The Tartu program moved to integrate systems that measure energy use, with the aim of becoming Europe’s most energy-efficient city.
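
By way of illustration, a city model encoded in CityGML can be read with entirely generic tools. The sketch below pulls building identifiers and measured heights out of a CityGML 2.0 file using Python's standard XML parser; the file name is a placeholder, and it assumes the buildings carry the optional bldg:measuredHeight property, as many city models do.

```python
# A minimal sketch: extracting building ids and heights from a CityGML 2.0 city model.
# "city_model.gml" is a placeholder file name; measuredHeight is an optional CityGML property.
import xml.etree.ElementTree as ET

NS = {
    "bldg": "http://www.opengis.net/citygml/building/2.0",
    "gml": "http://www.opengis.net/gml",
}

root = ET.parse("city_model.gml").getroot()

for building in root.iter(f"{{{NS['bldg']}}}Building"):
    gml_id = building.get(f"{{{NS['gml']}}}id")
    height = building.findtext("bldg:measuredHeight", default="n/a", namespaces=NS)
    print(gml_id, height)
```

Because the encoding is an open standard rather than a vendor format, the same model can be handed to visualization tools, simulation engines, or the kind of sensor-enriched analyses the Rotterdam pilot pursued.
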
In studying the results of ESPRESSO, OGC recognized two major concepts critical to cities moving their existing systems towards open standards:

Minimum Interoperability Mechanisms (MIM), a concept developed by Open & Agile Smart Cities (OASC), refers to specifications and architectures that do not attempt to encompass every conceivable aspect of a city but instead allow systems to be connected only when and where needed. They enable future innovation and adaptation without requiring wholesale reconstruction of existing functioning systems.
Pivotal Points of Interoperability (PPI), a concept developed by the National Institute of Standards and Technology (NIST), are component standards and methodologies so widespread and useful that, even in an isolated system, vendors don’t think twice about using them. In other words, a PPI is an existing system-integration practice that also happens to facilitate data access and sharing with other systems.
The shortest path from status quo to smart data integration and urban models is where PPI opportunities coincide with MIM needs.

Imagine that a city has a system that analyzes the flow of traffic on its streets using video data taken from cameras installed for the purpose. In a typical system, the traffic cameras feed their data through a custom network into a closed, single-purpose system for storage, retrieval, and analysis. Suppose that the results are so successful that the city’s transportation department wants to analyze pedestrian traffic in the same way. They then learn that the vendor’s pedestrian package has recently been deprecated and are advised by a reseller that a new system – cameras and all – will have to be purchased for the task.

It turns out, however, that such a closed vehicle-monitoring system often feeds its video data into a commoditized data storage system, a potential PPI supporting standards like SQL and JDBC. While refitting hundreds of cameras to support standard interfaces would be a major hurdle, commodity data storage systems can often be (or already have been) assembled into “Data Lakes”. A Data Lake is a MIM that lets just enough light into closed systems to allow the standards-based mixing and matching of data from different sources needed to power versatile urban models – or, in this case, to feed an application that analyzes pedestrian traffic using video data.
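
As a hedged illustration of that last step, the sketch below shows a new pedestrian-analysis application reaching into such a Data Lake through nothing more exotic than standard SQL, here using Python's built-in SQLite driver as a stand-in for whatever commodity store is actually deployed; the table and column names are invented for the example.

```python
# A minimal sketch: a new application reusing video metadata already sitting in a Data Lake,
# reached through a standard SQL interface. The table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect("city_data_lake.db")   # stand-in for a commodity SQL store

# The pedestrian application only needs footage recorded near crossings at peak hours;
# it needs to know nothing about the vendor's original traffic-analysis system.
rows = conn.execute(
    """
    SELECT camera_id, segment_uri, recorded_at
    FROM video_segments
    WHERE location_type = 'pedestrian_crossing'
      AND strftime('%H', recorded_at) BETWEEN '07' AND '09'
    ORDER BY recorded_at
    """
).fetchall()

for camera_id, segment_uri, recorded_at in rows:
    print(camera_id, segment_uri, recorded_at)   # hand each clip to the analysis pipeline

conn.close()
```

The point is not the query itself but that SQL, as a pivotal point of interoperability, was already there; the Data Lake simply exposes it to more than one consumer.
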
Data standards underpin MIM and PPI, as well as opportunities such as Data Lakes, and thus support the organic growth of urban digital models. This “plug and play” approach sustains capabilities through generations of vendors and technologies and allows players of all sizes to innovate and participate in the smart city market. Standards can also prevent cities from having to commission overly complex, expensive, all-in-one solutions to what are often just new sequences of distinct but common challenges.

Creating truly smart cities, where bringing together data is a means to bringing together communities, is both a moral and financial imperative. As the world becomes more urbanized, cities need to meet the challenge of upholding quality of life in the face of growing populations, dwindling resources and changing climate. OGC members, whether from the geospatial or built environment communities, help to inform, develop, and apply the standards that will knit together the successful cities of tomorrow.
