Moving power around the grid comes with a unique set of problems. Everything from energy theft to grid failure can interfere with the process, and every time something goes wrong, it costs time and money to correct.
Machine learning has become an increasingly powerful tool for reducing the risks associated with energy transmission. It can help you:
- Anticipate and prevent power grid failure
- Prevent brownouts with real-time monitoring and AI prediction
- Detect power grid faults
- Balance the grid
- Differentiate power system disturbances from cyber attacks
- Detect energy theft
1. Anticipate and Prevent Power Grid Failure
Massive power outages cause chaos for the general public, and they cost utility providers roughly $49 billion a year.
This wouldn’t be much of a problem if massive power outages were rare, but outages affecting more than 50,000 people have increased dramatically in recent years. This means utility companies need to find new ways of anticipating and managing these outages.
These days, smart grids produce massive amounts of data, which makes predicting and managing outages easier than ever. Unlike traditional power grids, which only carry power in one direction, smart grids are two-directional: they can capture data from every point in the grid at the same time as they deliver electricity. They collect and monitor data from sources like smart meters, IoT devices, and power generation stations, providing a clear, real-time picture of power usage.
Machine learning can use this data to anticipate and prevent massive power outages. It identifies non-obvious patterns in the data that often precede grid failure, giving maintenance teams the chance to intervene before an outage occurs.
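As a simplified sketch of what this looks like in practice, the snippet below trains a classifier to flag grid conditions that often precede an outage. The feature names, synthetic data, and labels are assumptions for illustration only; a real model would be trained on historical telemetry labeled with actual outage events.

```python
# Hedged sketch: classify grid states as "likely to precede an outage" or not.
# All data here is synthetic; feature names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

# Stand-in for historical grid telemetry:
# [load_mw, voltage_deviation_pct, transformer_temp_c, line_current_a]
X = rng.normal(loc=[500, 0.5, 60, 400], scale=[120, 0.4, 12, 90], size=(5000, 4))
# Hypothetical label: 1 = an outage followed within 24 hours, 0 = normal operation
y = ((X[:, 1] > 1.0) & (X[:, 2] > 70)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```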
2. Prevent Brownouts With Real-time Monitoring and AI Prediction
Power grids have a lot of obstacles to overcome in delivering continuous energy to customers. Weather patterns, usage spikes, internal failures, and even wildcard incidents like lightning strikes or interference from wild animals can all affect power delivery.
Machine learning is increasingly being used to help predict potential brownout conditions. By feeding historical data into a model and running Monte Carlo simulations of possible outcomes, grid operators can identify conditions that could lead to grid failure and act accordingly.
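Here's a minimal sketch of the Monte Carlo step, assuming a few illustrative supply and demand distributions; in practice these would be estimated from historical grid data.

```python
# Monte Carlo sketch: sample many plausible demand/generation scenarios and
# count how often demand exceeds available supply. All distribution
# parameters below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_scenarios = 100_000

# Assumed statistics (in MW) for a single operating hour
peak_demand = rng.normal(loc=950, scale=80, size=n_scenarios)
conventional_supply = rng.normal(loc=800, scale=30, size=n_scenarios)
wind_supply = rng.gamma(shape=2.0, scale=60.0, size=n_scenarios)  # skewed, weather-driven

shortfall = peak_demand - (conventional_supply + wind_supply)
brownout_risk = np.mean(shortfall > 0)

print(f"Estimated probability of a supply shortfall: {brownout_risk:.2%}")
```

An operator could re-run a simulation like this as forecasts update and trigger demand-response measures when the estimated risk crosses a chosen threshold.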
Sensors like phasor measurement units (PMUs) and smart meters provide usage information in real time. When we combine these readings with historical and simulation data, AI can help mitigate potential grid failure using techniques like grid balancing and demand-response optimization. Incidents that would otherwise have affected millions of people can be contained to a smaller area and fixed faster for less money.
3. Detect Power Grid Faults
The methods most utility providers currently use to detect faults in the grid consume a lot of unnecessary time and resources: power transmission is interrupted, and customers are left without electricity, while faults are first located and then fixed.
Machine learning can find faults more quickly and accurately, helping you minimize service interruption for your customers. For example, we can combine Support Vector Machines (SVMs) with the Discrete Wavelet Transform (DWT) to locate faults in the lines.
When we apply the DWT (a form of numerical analysis that captures both frequency and location information) to the transient voltage recorded on the transmission line, we can determine the location of the fault by calculating aerial-mode voltage wavelets (for above-ground transmission lines) and ground-mode voltage wavelets (for underground transmission lines). So far, this method has successfully located faults across a range of fault inception angles, fault locations, and loading levels, including non-linear high-impedance faults, for both aerial and underground transmission lines.
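To make the idea concrete, here's a rough sketch of a DWT-plus-SVM pipeline using PyWavelets and scikit-learn. The synthetic transients, wavelet choice (db4), and labels are placeholders rather than the published configuration.

```python
# Sketch: extract wavelet sub-band energies from a voltage transient and
# classify it with an SVM. Signals and labels are synthetic placeholders.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def wavelet_energy_features(signal, wavelet="db4", level=4):
    """Energy of each DWT sub-band: a compact signature of the transient."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def make_window(faulted):
    """Toy 'recorded transient voltage' window (assumed 50 Hz system)."""
    t = np.linspace(0, 0.1, 1024)
    v = np.sin(2 * np.pi * 50 * t)  # fundamental
    if faulted:
        v += 0.5 * np.exp(-80 * t) * np.sin(2 * np.pi * 900 * t)  # fault transient
    return v + 0.05 * rng.standard_normal(t.size)

labels = rng.random(400) < 0.5  # True = faulted window, False = normal
X = np.array([wavelet_energy_features(make_window(f)) for f in labels])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)
print("Training accuracy:", clf.score(X, labels))
```

A full fault locator would also estimate the distance to the fault from the timing of the wavelet coefficients, which this sketch leaves out.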
4. Balance the Grid
Balancing the grid — making sure energy supply matches energy demand — is one of the most important jobs a transmission operator has. But renewable energy sources depend heavily on the weather, making them harder to predict.
Transmission operators spend millions each year fixing planning mistakes that lead to producing too much or too little power. In hybrid systems — which rely on both renewable energy sources and fossil fuels to generate electricity — these mistakes have to be corrected at the last minute by buying more energy or compensating power plants for the excess.
Knowing precisely when demand levels will peak allows utility providers to connect to secondary power sources (like conventionally generated electricity) to bolster the available resources and ensure constant service provision.
Machine learning is among the most accurate methods available for forecasting renewable energy output. When we apply advanced architectures like Long Short-Term Memory networks (LSTMs), AI can weigh the many factors involved (wind, temperature, sunlight, and humidity forecasts) and make the best predictions. This saves money for operators and preserves resources for power plants.
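As a minimal sketch, assuming hourly weather and output readings as inputs, an LSTM forecaster in Keras might look like this; the window length, features, and synthetic data are illustrative assumptions.

```python
# Hedged sketch: predict next-hour renewable output from the past 24 hours of
# weather/output features. Data and shapes are synthetic assumptions.
import numpy as np
import tensorflow as tf

timesteps, n_features = 24, 4  # e.g. wind speed, temperature, sunlight, humidity
rng = np.random.default_rng(1)
X = rng.random((1000, timesteps, n_features)).astype("float32")
y = 0.8 * X[:, -1, 0] + 0.1 * rng.random(1000).astype("float32")  # toy target (MW)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, n_features)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print("Next-hour forecast (toy units):", float(model.predict(X[:1], verbose=0)[0, 0]))
```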
We can also feed historical data into machine learning algorithms such as Support Vector Machines (SVMs) to accurately forecast energy usage and ensure sufficient levels and a constant supply.
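A comparable sketch using an SVM regressor on synthetic hourly demand with simple lag features (all assumptions for illustration):

```python
# Sketch: forecast the next hour's demand from the hour of day and the three
# previous readings. The demand series is synthetic.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
hours = np.arange(24 * 365)
demand = 600 + 150 * np.sin(2 * np.pi * hours / 24) + 20 * rng.standard_normal(hours.size)

# Features: hour of day plus the previous three hourly readings
X = np.column_stack([hours[3:] % 24, demand[2:-1], demand[1:-2], demand[:-3]])
y = demand[3:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:-24], y[:-24])  # hold out the final day
mae = np.mean(np.abs(model.predict(X[-24:]) - y[-24:]))
print(f"Held-out mean absolute error: {mae:.1f} MW")
```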
5. Differentiate Power System Disturbances from Cyber Attacks
Cyber attacks are increasingly used to target important infrastructure, such as hijacking a utility's systems and demanding a ransom, and they can take a long time and a lot of money to resolve. One Colorado utility, for example, needed three weeks to unlock its data after a cyber attack and had to bring in extra personnel to comb through records and confirm that customers' personal information hadn't been compromised. It was the second such attack on that utility in two years.
Detecting these attacks is critical.
Developers are using machine learning to distinguish an ordinary fault (a short circuit, for example) or a grid disturbance (such as line maintenance) from an intelligent cyber attack (like a data injection).
Since deception is a huge component of these attacks, the model needs to be trained to look for suspicious activity – things like malicious code or bots that get left behind after the deception has occurred. Once the model is trained, it can monitor the system and catch suspicious activity faster.
One method we can apply uses feature extraction with Symbolic Dynamic Filtering (an information theory-based pattern recognition tool) to discover causal interactions between a grid's subsystems without placing a heavy load on computing resources. In testing, it accurately detected 99% of cyber attacks, with a true-positive rate of 98% and a false-positive rate of less than 2%. That low false-positive rate matters because false alarms are one of the biggest concerns in cyber attack detection.
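The snippet below is a deliberately simplified, illustrative take on the symbolization idea only, not the published method: it discretizes each measurement stream into symbols, uses the symbol-transition probabilities as features, and trains an off-the-shelf classifier to separate ordinary disturbances from attack-like behavior. The signals, labels, and bin count are assumptions.

```python
# Simplified symbolic-dynamics sketch (not the published SDF method):
# discretize a signal into symbols, build a transition-probability matrix,
# and use it as a feature vector for classification. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

def transition_features(signal, n_symbols=6):
    """Flattened symbol-transition probability matrix of a 1-D signal."""
    bins = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
    symbols = np.digitize(signal, bins)
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return (counts / max(counts.sum(), 1)).ravel()

def make_signal(attack):
    """Toy measurement stream; 'attack' adds a slow injected bias ramp."""
    t = np.linspace(0, 1, 512)
    base = np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(t.size)
    if attack:
        base += np.linspace(0, 0.8, t.size)  # illustrative data-injection signature
    return base

labels = rng.random(300) < 0.5  # True = attack-like, False = ordinary disturbance
X = np.array([transition_features(make_signal(a)) for a in labels])

clf = RandomForestClassifier(random_state=0).fit(X, labels)
print("Training accuracy:", clf.score(X, labels))
```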
6. Detect Energy Theft
In the energy world, “non-technical losses” means energy theft or fraud.
There are two common types of non-technical losses. The first is when a customer uses more energy than the meter reports. The second involves rogue connections stealing energy from paying customers. To pull this off, bad actors can bypass smart meters completely or insert chips into the system that change how meters track energy use. Meter readers can also be bribed to report lower numbers (though thanks to smart meters, this is increasingly hard to do).
Because these non-technical losses cost $96 billion annually, utility providers are turning to machine learning to combat the problem.
We can help you mine historical customer data to discover irregularities that indicate theft or fraud. These can be things like unusual spikes in usage, differences between reported and actual usage, or even evidence of equipment tampering.
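As one illustrative approach, an unsupervised anomaly detector can surface accounts worth investigating; the features and synthetic data below are assumptions.

```python
# Sketch: flag customers whose consumption patterns deviate sharply from the
# norm. Feature names and data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(11)

# Synthetic per-customer features:
# [mean_monthly_kwh, std_monthly_kwh, reported_vs_substation_ratio]
normal = rng.normal(loc=[900, 120, 1.0], scale=[200, 40, 0.03], size=(2000, 3))
suspicious = rng.normal(loc=[300, 250, 0.6], scale=[100, 60, 0.1], size=(20, 3))
X = np.vstack([normal, suspicious])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = detector.predict(X)  # -1 = anomalous, 1 = normal
print("Accounts flagged for review:", int((flags == -1).sum()))
```

Flagged accounts wouldn't be treated as proof of theft; they'd simply be queued for a closer look by an investigator.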