The U.S. Department of Energy recently published an excellent report on utility ADMS implementation experiences. Entitled Voices of Experience – Insights into Advanced Distribution Management Systems, the report documents the experiences of several North American companies that have expended the considerable effort required to make ADMS functionality part of their day-to-day business practice.
A critical part of this report addresses the data sources that feed the ADMS, and in particular the importance of GIS data completeness and data quality. The report makes clear that, even if a company's GIS has been successful in supplying data to an outage management system (OMS), it may not be – and likely isn't – the case that the data quality is sufficient to support an ADMS.
While the report addressed many questions about ADMS in general and GIS-ADMS data management practices in particular, with some focus on GIS data quality, one question it did not address was, "how good is good enough?"
Let’s start with the proposition that no one’s GIS data is perfect. For example, no company knows with absolute certainty which transformer serves every customer all the time. A company may be certain of 99%, or 99.9%, or even 100% at certain points in time. But once a storm hits and customers are reconnected in the quickest possible way to minimize outage time, it’s nearly inevitable that connectivity errors creep in. Similarly, no company knows the ampacity of all conductor spans in its service territory, nor the load and no-load losses of all its distribution transformers.
On the other hand, most companies (at least the ones we’ve worked with) have GIS data that is of good quality. Islands and loops in the radial-fed part of the network are rare. Significant properties of conductors and devices are known and well maintained. Customer-to-transformer relationships are known with a good level of confidence for a very high percentage of customers, and processes are in place to maintain these relationships.
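The islands-and-loops check mentioned above is essentially a graph problem, and it’s straightforward to automate. The sketch below is a minimal, hypothetical illustration (not drawn from any particular GIS product): it treats the feeder model as an edge list, flags "islands" as nodes with no path back to a substation, and flags loops by exploiting the fact that a purely radial (tree-structured) network has exactly one fewer edge than nodes in each connected component. Node names and the edge-list representation are assumptions for the example.

```python
from collections import defaultdict, deque

def connectivity_checks(edges, sources):
    """Flag two common GIS connectivity defects in a radial feeder model:
    islands (nodes with no path to any source/substation) and loops
    (cycles, which should not exist in the radial-fed part of the network)."""
    graph = defaultdict(set)
    nodes = set(sources)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
        nodes.update((a, b))

    def bfs(starts):
        # Standard breadth-first traversal over the undirected graph.
        seen, queue = set(), deque(starts)
        while queue:
            n = queue.popleft()
            if n not in seen:
                seen.add(n)
                queue.extend(graph[n] - seen)
        return seen

    # Anything not reachable from a substation is an island.
    islands = nodes - bfs(sources)

    # Count connected components; a forest satisfies |E| = |V| - #components,
    # so any surplus edge closes a loop somewhere in the network.
    remaining, components = set(nodes), 0
    while remaining:
        remaining -= bfs([next(iter(remaining))])
        components += 1
    has_loop = len(edges) > len(nodes) - components

    return islands, has_loop
```

A check like this can run as a scheduled data-quality job against GIS extracts, so connectivity defects surface before the model ever reaches the ADMS.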
Further, it cannot be the case that a company’s GIS data has to be absolutely perfect for it to get value from ADMS technology. No ADMS would ever go into production otherwise. (I’m reminded of a saying from my old boss, “the perfect can be the enemy of the good,” or a corollary from an economist, “if you never miss a plane you’re spending too much time at the airport.”)
Finally, there are certainly diminishing returns to improving data quality past a certain point. If the quality of the model supplied from our GIS results in the ADMS predicting a voltage of 12.47112 kV at a given node in our network, and a sensor at that location reports a voltage of 12.47113 kV, is it worthwhile trying to explain the hundredth-of-a-volt difference?
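In practice this judgment usually takes the form of a tolerance check. The snippet below is purely illustrative: the function name and the 1-volt default tolerance are assumptions chosen for the example, not a standard; the right threshold is an engineering decision for each utility.

```python
def model_matches_sensor(predicted_kv, measured_kv, tolerance_volts=1.0):
    """Treat the ADMS model and the field sensor as agreeing when their
    difference falls below a tolerance set by engineering judgment.
    The 1 V default is an illustrative assumption, not a standard."""
    difference_volts = abs(predicted_kv - measured_kv) * 1000.0
    return difference_volts <= tolerance_volts
```

By the voltages above, a predicted 12.47112 kV against a measured 12.47113 kV differs by only 0.01 V, comfortably inside any reasonable tolerance, so chasing that residual would be effort better spent elsewhere.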
A Middle Ground?
So, there must be an acceptable middle ground: a place where the values for voltage, current and so forth from field sensors come sufficiently close to the values calculated by the ADMS load flow/state estimation models that operators have confidence in the system. Just as important is identifying the most effective tasks one can undertake within the GIS to reach that middle ground. And of course, as with any such question, professional judgment is involved.
How do we arrive at a place where we know our GIS data quality is good enough to meet the needs of our ADMS? This question is one we’ll be pursuing over the next weeks and months, hoping to talk with as many utilities and subject matter experts as possible. If you or someone you know has thoughts on the subject, please chime in.