Requisites for smooth functioning of utilities

Stephen Brockwell, President, Brockwell IT

In an up-close interview with Stephen Brockwell, President, Brockwell IT, Geoff Zeiss, Editor – Energy and Building, finds out how electric utilities are taking stock of challenges and adopting new technologies by integrating spatial data

What do you think are the biggest challenges the electric power utility sector is facing, especially with respect to smart grid?

Smart grid is interesting as well as necessary, but from my observations, the real challenges fall into a handful of related categories: inspection, maintenance and reconstruction; labour force changes; and supply/demand management. The crucial ones are, of course, inspection, maintenance and reconstruction, since the infrastructure — not only the equipment, but the structures — is ageing, and an enormous amount of capital is being pumped in to keep up. This is exacerbated by changing weather patterns. Much of the evidence for this is anecdotal, but I know of two large customers (one of them a regular of ours) that have run massive inspection programmes to mitigate weather-related infrastructure damage. They have seen significant unpredicted outages due to ice and salt loading on conductors. The weather problem isn’t strictly related to the age of the infrastructure, and inspections have to adapt to weather patterns and weather events, and proactively target the equipment at highest risk.

The labour force issue has also been affecting utilities for a long time now, and much system knowledge is retiring. Risk used to be mitigated by people who knew the system, the lines and the transformers, and who had 30 years’ experience in keeping it all operating. With the loss of that knowledge, it’s not clear that there are systems in place to replace them. Spatial data has a role in this, but it’s not sufficient. Big data and learning systems can mitigate some of it, but those are massive investments that are complex to implement. I don’t think most utilities have been completely successful in implementing predictive maintenance as part of asset management, though many have programmes to put it in place. Supply and demand management are made vastly more complex by the diversity of energy sources and the opening of the grid to different generation and consumption patterns. From what I can see, the real-time and big-data systems that manage the data for these patterns are not fully evolved.

What are the key IT technologies, especially location-aware ones, that are contributing to the transformation of the electric power industry?

When it comes to the assessment of field conditions in emergency and maintenance situations, good old-fashioned GPS, with its ever-increasing positional accuracy, is ubiquitous. But reality capture using LIDAR and other technologies has enormous promise as well: massively detailed information about a site has become very affordable. The integration of capture devices with fully functional, tablet-like, network-connected field computers, in combination with the cloud for storing and managing the massive data sets, has enormous promise for accelerating the entire site survey and design process. There’s another important aspect to that — designers can design with enormous precision and with realistic 3D models that are structurally and electrically balanced.

In your experience working in the electric power utility industry, where has geospatial data and technology had the biggest impact? Where do you expect it to have the biggest impact in the future?

Geospatial data and technology are regarded as indispensable; I don’t know of a utility that doesn’t understand that at this point. Consider the city planners for a medium-sized municipality that operates electric, gas, water and waste water infrastructure with over 60,000 service connections for each discipline: they have a massive backlog problem. Their location data for as-builts is becoming increasingly out of date because they don’t have the resources in place, and they don’t have the regulatory framework to hold engineering firms to account for delivering accurate as-built data. Accurate here means a survey-quality design with elevation and relative location details. They knew that they simply had no choice but to address the gap. The problem is defining a sustainable data programme that addresses the need for timeliness and accuracy, and finding new ways to approach the problem of data capture.

I think it is not impossible that, in the not-too-distant future, a municipality will use one of the numerous partners out there to skip 2D data capture altogether — we’re almost at the point where it would be cheaper to scan all the facilities than it would be to embark on an old-fashioned off-shore 2D scanning and conversion effort.

The metrics for old-school migration no longer make sense: the ROI is too long-term, the quality management programmes are too intensive, and the duration of such projects simply puts too much risk on the organisation. This scanning approach presents problems for underground infrastructure, but equipment for scanning that is also improving.

We need to completely redefine storage management, data management and other scenarios. 2D data used for the connected network will be a projection of the 3D data, augmented with SCADA information about device states. That’s the future, but it ties big data and cloud computing in with many of the services that are production-ready today.

Industry leaders are projecting a much greater role for analytics and especially spatial analytics as utilities transform themselves into data-driven enterprises. What has your experience been with analytics, especially spatial analytics, when working with utilities?

That’s a very exciting area, especially when it comes to correlating weather events, soil conditions and ice loading with the age, design parameters and condition of equipment. It should have been possible to predict the recent weather-related outages in Ontario, and I believe changes will be made to get moving on that.

Many don’t fully realise this, but most database technologies now have easy-to-use tools for managing analytics. That’s one change we have been championing for years. Once spatial data is simply a column in a database, it can be processed and managed through the architectural tiers as easily as any other data. Spatial analytics problems are now data analytics problems; more needs to be done to push spatial processing into the core of big-data tools, but it is not impossible to extend them today to perform that function, because the database fundamentals have been in place for a number of years.
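The point about spatial data becoming "just a column" can be illustrated with a minimal, hypothetical sketch: an asset table in an ordinary SQLite database where location is stored in plain numeric columns, so a spatial filter (here a bounding-box query) is simply ordinary SQL. The table, coordinates and condition scores are invented for illustration; a production system would use a spatially enabled database such as PostGIS rather than raw lon/lat columns.

```python
import sqlite3

# Hypothetical asset table: location is stored as plain lon/lat columns,
# so spatial filtering is just ordinary SQL over ordinary columns.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE asset (
        id INTEGER PRIMARY KEY,
        kind TEXT,
        lon REAL,
        lat REAL,
        condition_score REAL
    )
""")
conn.executemany(
    "INSERT INTO asset (kind, lon, lat, condition_score) VALUES (?, ?, ?, ?)",
    [
        ("transformer", -79.38, 43.65, 0.42),  # made-up example data
        ("pole",        -79.40, 43.66, 0.85),
        ("transformer", -79.55, 43.70, 0.30),
    ],
)

# A bounding-box query: the spatial analytics problem is now a plain
# data-analytics problem, answered with a WHERE clause.
bbox = (-79.45, 43.60, -79.30, 43.70)  # (min_lon, min_lat, max_lon, max_lat)
rows = conn.execute(
    """SELECT kind, condition_score FROM asset
       WHERE lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?
       ORDER BY condition_score""",
    (bbox[0], bbox[2], bbox[1], bbox[3]),
).fetchall()
print(rows)  # the third asset falls outside the box and is excluded
```

Because the geometry lives in regular columns, indexing, joins and condition-based ranking all come for free from the database engine.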

Design technology is moving from CAD to BIM (Building Information Modelling for infrastructure) motivated by improving efficiency. Are you seeing the adoption of this new technology by utilities for designing distribution networks, substations, transmission lines, generation facilities, and other infrastructure?

BIM is fine for the structural elements, but it doesn’t come close to covering the complexity of the design problems faced by electric power engineers. Investment in harmonised design processes has been inadequate: the GIS data is pulled in, the design is done, the construction is completed, and the as-built is redrafted. I strongly believe that utilities need their own modelling tools, specific to their business, that handle both site-specific modelling conditions and the long-distance nature of transmission and distribution networks.

Are you seeing demand from electric power utility customers to support standards that include geospatial standards such as Multispeak?

We’re supporting a number of customer efforts by providing Multispeak interfaces for some of the hosted services the customers use to capture real-time data. For smaller customers in Ontario, that means ensuring the network is compliant with Multispeak 3.0. That involves keeping the network’s representation in the SCADA/OMS hosted system up to date, and having web services available to receive the real-time information and store a representation of it appropriate for geospatial analysis.
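The ingest side of such a service can be sketched in a few lines. This is a hypothetical simplification, not the actual Multispeak schema or interface (Multispeak defines its own SOAP/XML message formats): a function a web-service endpoint could call for each real-time reading, persisting device state alongside coordinates so the data is ready for geospatial analysis later. All field names and values here are invented for illustration.

```python
import json
import sqlite3

def make_store(path=":memory:"):
    """Create the store for incoming readings (SQLite for the sketch)."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS reading (
            device_id TEXT, ts TEXT, state TEXT, lon REAL, lat REAL
        )
    """)
    return conn

def ingest(conn, payload: str) -> None:
    """Parse one JSON reading and persist it with its coordinates."""
    r = json.loads(payload)
    conn.execute(
        "INSERT INTO reading VALUES (?, ?, ?, ?, ?)",
        (r["device_id"], r["ts"], r["state"], r["lon"], r["lat"]),
    )

# Simulate one real-time message arriving at the endpoint (made-up values).
conn = make_store()
ingest(conn, '{"device_id": "TX-104", "ts": "2014-02-11T09:30:00Z",'
             ' "state": "open", "lon": -79.42, "lat": 43.68}')
count = conn.execute("SELECT COUNT(*) FROM reading").fetchone()[0]
print(count)
```

Storing the device state together with its location is what makes the downstream step, joining SCADA/OMS states onto the network geometry, a straightforward query.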
