Quality time: smart grids
11 June 2015
The increasing prevalence of smart grid technology has multiplied the opportunities for monitoring power quality. Used intelligently, such information can help utilities provide the reliable, stable power supply a thriving economy demands, but tackling the huge volumes of data available can be a problematic task. Bill Howe, power-quality programme manager at the Electric Power Research Institute, speaks to Sarah Williams about how to gather and analyse grid data for maximum effect.
It may be the second-largest economy in Africa, but frequent rolling blackouts in recent years have played a key role in stalling South Africa's economic growth.
An acute lack of investment has led to ailing power stations and infrastructure. So grievous is the problem that Eskom, the state-owned company behind 95% of the country's electricity supply, actually publishes a 'load-shedding schedule' to warn of planned outages.
Meanwhile, in Brazil, transmission troubles and increased demand amid January's drought saw the national grid operator ordering electricity supply to be cut to several states.
A rainfall shortage might have been expected to cause some disruption in a nation where hydropower accounts for around two thirds of installed power generation capacity, but Brazil's problems extend to wider issues within the grid. In 2014, more than 16% of its total generation was lost in transmission and distribution, and in some parts of the country, around 20% of all electricity generated is lost to theft, according to figures from GlobalData.
A secure and high-quality power supply is about as crucial as it gets when it comes to sustaining industry, commerce and investment in a growing economy, and the effects of failure in this regard can be expensive and damaging. The ability to keep track of supply, and to detect, predict and repair problems with minimum disruption to users is therefore vital.
As the role of the 'smart grid' becomes increasingly significant in achieving this aim, utilities are faced with a huge volume and variety of data from across their networks.
Knowing how best to process and analyse this is no easy task, but capturing the right information in the first place is also a major concern. As obvious as this might seem, Bill Howe of the Electric Power Research Institute (EPRI) is often surprised by utilities' frequent assumption that data is reliable and accurate without first taking steps to verify the means by which it is measured.
As manager of the power-quality programme at the US-based non-profit institute, Howe is quick to emphasise the need for a careful and thorough approach to data gathering.
"If you think dealing with large amounts of data is challenging, try dealing with large amounts of incorrect data," he says. "If we want to get good value from our investment in data infrastructure, we need that data to be accurate, and we know from our vast experience in the power-quality world that this doesn't happen by default. It has to be designed from the outset into the entire data architecture.
"The first major step is the device itself. It needs to be capable of measuring power-quality parameters with a good degree of accuracy and precision, and to be enabled and configured for making those measurements. We often find that even though the device may be suitable, utilities are not enabling the specific capabilities required.
"Devices need to be calibrated and installed properly, and have the appropriate data communications and storage infrastructures to make that data available for subsequent analysis," adds Howe.
In an EPRI survey of US-based utilities, Howe's team found that fewer than one in ten utilities had "most of these elements adequately covered" when it came to setting up smart meters.
"Our biggest concern is that most utilities do not consult their power-quality teams when they're selecting smart meters," he says.
"The assumption is, 'Well, it says it will measure power quality, that must be correct', but the power-quality teams need to be part of that decision-making process if the utility wants to get good power-quality data upon which it can base management decisions."
The root of power monitoring
As part of its power-quality programme, EPRI tests the various measurement devices available, from smart meters to relays and switches, that purport to measure power quality. It is also working on developing best practice for utilities to validate data as it is collected.
"The utility industry tends to accumulate data and then go back to it when there is a need for it, and try to use it to make an informed decision. That model does not serve us well, because you discover problems with the data only after the fact.
"So we're working on visual and analytical methods to validate data as it is acquired so that any problems can be identified quickly and remedied quickly, rather than months or years after low-quality data has been acquired and stored, when it's too late."
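As an illustration of the kind of acquisition-time check Howe describes, a minimal sketch in Python might flag implausible readings before they are stored. The nominal voltage and tolerance band here are hypothetical; a real deployment would use utility-specific limits:

```python
def validate_reading(v_rms, nominal=120.0, tol=0.20):
    """Flag RMS voltage readings outside a plausible band around nominal.

    Readings far outside +/-tol of nominal are more likely to indicate a
    mis-configured or failing meter than a genuine grid event, so they are
    flagged at acquisition time rather than discovered months later.
    """
    if v_rms <= 0:
        return "invalid: non-positive reading"
    low, high = nominal * (1 - tol), nominal * (1 + tol)
    if not (low <= v_rms <= high):
        return f"suspect: {v_rms:.1f} V outside [{low:.1f}, {high:.1f}] V"
    return "ok"
```

Checks like this are deliberately cheap, so they can run on every reading as it arrives rather than in a later batch pass.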
The most common type of data collected is root-mean-square (RMS) voltage data, which can be used to observe long-term changes in distribution voltage. With the right equipment, other readings can be taken too, such as RMS current data, and power measurements such as kVar (reactive power) and kVA (apparent power). The development of smart grid technology is also giving utilities access to large amounts of 'edge-of-grid' data for the first time - a huge step forward, as it allows them to evaluate performance at the point of use.
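The quantities above follow directly from sampled waveform data. A minimal, self-contained sketch (the sample values and 30-degree phase lag are illustrative, not real meter data):

```python
import math

def rms(samples):
    """Root-mean-square of a list of instantaneous samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle sampled at 32 points (illustrative values).
N = 32
v_peak, i_peak = 170.0, 14.1        # roughly 120 V RMS and 10 A RMS
phase = math.radians(30)            # current lags voltage by 30 degrees
voltage = [v_peak * math.sin(2 * math.pi * k / N) for k in range(N)]
current = [i_peak * math.sin(2 * math.pi * k / N - phase) for k in range(N)]

v_rms, i_rms = rms(voltage), rms(current)
kva = v_rms * i_rms / 1000          # apparent power (kVA)
kw = kva * math.cos(phase)          # real power (kW)
kvar = kva * math.sin(phase)        # reactive power (kVar)
```

For a pure sinusoid sampled evenly over a full cycle, the RMS value works out to exactly the peak divided by the square root of two, which is why the 170 V peak above corresponds to roughly 120 V RMS.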
Once a utility has established a reliable and accessible system for capturing all these data types, analysis opens up a huge range of capabilities. For Howe, the smart grid facilitates three major areas of improvement: grid stability (avoiding large regional outages); reliability (ensuring there is always voltage at the plug); and asset utilisation (making better use of the existing grid infrastructure).
Analyses undertaken could include observing load profiles and considering the impact of widely distributed renewable generation resources, or analysing asset use to extract more value from existing utility infrastructure. The ability to detect, predict and respond to errors is another clear advantage.
"Storms are obviously a big factor in the power-quality world, so to be able to directly correlate weather events with edge-of-grid power quality could really help," Howe explains. "It's something we've only been able to do anecdotally so far, but the smart grid offers an opportunity to do it with a much higher degree of precision.
"For example, you could have a sensitive facility that is very vulnerable to lightning-caused transients, and you could potentially tell the facility to switch to its back-up generator for ten minutes while the storm cell passes. That could save it from voltage transients that might otherwise cost $1 million in lost productivity."
Another area of analysis concerns using power-quality monitoring data to identify the 'signatures' of possible equipment failure - looking for patterns or markers within the data indicating that a transformer, cable or capacitor is showing early signs of a problem. A repair team can then be dispatched and the facility taken offline before a failure - and the resultant outage - actually occurs.
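One simple, hypothetical way to flag such signatures is to compare each new reading against a rolling baseline and raise a flag on large deviations. Real signature analysis uses far richer models than this z-score sketch, but it conveys the idea:

```python
from collections import deque
from statistics import mean, stdev

def make_signature_detector(window=20, threshold=3.0):
    """Flag readings that deviate sharply from a rolling baseline.

    A crude stand-in for the pattern-matching the article describes:
    once `window` readings have been seen, any reading more than
    `threshold` standard deviations from the rolling mean is flagged
    as a candidate early-failure signature.
    """
    history = deque(maxlen=window)

    def check(reading):
        flagged = False
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(reading - mu) > threshold * sigma:
                flagged = True
        history.append(reading)
        return flagged

    return check
```

A detector like this would run per asset (per transformer or capacitor bank), with window and threshold tuned to that asset's normal behaviour.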
"This capacity touches upon every aspect of what the smart grid is intended to accomplish, and it absolutely improves reliability, because instead of a piece of equipment failing and perhaps taking down a segment of the grid, we capture it beforehand.
"It obviously improves asset use too, because we are keeping equipment operating rather than having it fail.
"That's very exciting and, in terms of payback, it has a huge potential economic return for the utility industry, as well as a societal benefit."
Quality not quantity
While the return on investment is clear, Howe cautions that an objective and efficient approach to monitoring is important, and contests the presumption he often encounters in utility companies that 'the more data the better'.
"I'm fond of saying, 'The person with two watches never knows what time it is', and what the utility industry will need to focus on is prioritising what data streams it needs from the grid," Howe says.
"I think that over the next five to ten years, we may actually start to shut off data being acquired from a large percentage of smart meters and just strategically select the readings we need to enable key outcomes.
"We certainly don't want to pay for the data infrastructure and storage to keep everything that we are capable of measuring, so we'll need to be more selective."
As to where these select measurement points may be, Howe believes that substations are the natural starting place. He suggests that high-accuracy power-quality measurement at every substation, or even every bus within a substation, may become standard, if cost allows. Grid complexity and economic factors will also influence where these meters are placed.
"If I have neighbourhoods that are high adopters of distributed energy resources like rooftop PV, I'd want to have more power-quality measurement there," he says.
"Industry-intensive circuits - ones that are feeding high-value industrial customers as well as probably large commercial customers - are also obvious candidates.
"Another interesting economic driver could be theft detection. If we're making measurements along multiple points on most circuits, we can do very simple load balancing and discover if there's a circuit where the revenue meters report, say, 1.0MW worth of collective load but we're actually seeing 1.2MW of overall demand at more-central monitoring points. That may be an indicator that there's some illegal tapping into the line."
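Howe's back-of-the-envelope check can be sketched directly. The 5% loss factor below is a hypothetical allowance for normal technical losses between the customer meters and the central monitoring point:

```python
def unmetered_load(feeder_mw, revenue_meter_mw, loss_factor=0.05):
    """Estimate load on a circuit not accounted for by revenue meters.

    feeder_mw: demand measured at a more-central monitoring point (MW)
    revenue_meter_mw: individual customer-meter readings (MW)
    loss_factor: assumed technical losses (hypothetical 5% here)
    """
    expected = sum(revenue_meter_mw) * (1 + loss_factor)
    return max(0.0, feeder_mw - expected)

# The article's example: meters report 1.0 MW, the feeder shows 1.2 MW.
gap = unmetered_load(1.2, [0.4, 0.35, 0.25])
```

A persistent, sizeable gap on one circuit - rather than a one-off reading - is what would actually prompt an investigation, since momentary mismatches can come from switching events or metering lag.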
For some utilities, particularly in developing nations or in areas where theft is prevalent, this detection capability may even be the primary economic driver to adopt smart grid. Perhaps it's no surprise then that smart grid technology is one of three key sectors targeted by Brazil's new Inova Program, a national initiative designed to encourage innovation in the power industry.
Whatever each nation's specific goal, it's clear that smart grid technology will play a key role as utilities around the world seek to improve reliability, reduce cost, and increase efficiency and sustainability. Success in doing so, however, will hinge on their degree of willingness to undertake the checks and balances needed to procure reliable and accurate data.