The past few years have brought a number of technological changes the power grid must contend with. In particular, a proliferation of dual-purpose (heterogeneous) sensing and actuating devices, the IoT, edge computing, and distributed systems in general now characterize the technological landscape.

The result has been a shift from centralized, inflexible power sources feeding a unidirectional grid toward decentralized smart grids and microgrids. Looking at solar alone, research shows average year-over-year growth of 48% over the past decade.

The next generation of grids can integrate intermittent renewable energy, facilitate demand-side management and real-time pricing, and operate bidirectionally with a growing number of “prosumers,” systems that both produce and consume energy. These grids are fault-tolerant and self-healing, and they support more advanced metering infrastructure.

But as the grid has become more inclusive, handling billions of requests from energy prosumers now requires the smart grid to depend on cloud data centers. The cloud’s monumental resources have been an invaluable partner to the smart grid. That said, the distance between end users, smart homes, and other endpoints on one side and the cloud on the other introduces latency and potential bandwidth bottlenecks. For speed-sensitive applications, this is unacceptable. Smart meters also record power usage and personal data from users and send it to the cloud for reporting. Beyond the difficulty of processing that volume of personal data, storing it there raises privacy concerns.

Fog and Cloud Coordination

In answer, the data center is distributed through edge or fog computing to shift control closer to endpoints. To some researchers, edge computing covers base stations, routers, gateways, and device-level control, while fog spans the “things” themselves, the cloud, the core and metro networks, clients, and LAN-level control. I will use the two terms interchangeably, as many researchers do. The combination of IoT, edge, cloud, and smart grid is sometimes referred to as the Internet of Energy (IoE).

The multifaceted nature of the IoE has many benefits. For starters, response time and latency are reduced. In addition, energy efficiency, scalability, resource utilization, security, resilience, heterogeneity, location awareness, regulatory compliance, and coordination and interoperability can all be improved through an edge/fog architecture. However, realizing the potential benefits of an edge architecture requires new strategies for optimal implementation.

Optimizing the Edge

Adding more device endpoints and computing at the edge can produce new security vulnerabilities. To address this and minimize the energy consumed by smart meter data transmission, Lyu et al. have proposed a Gaussian mechanism to gather smart-meter aggregate statistics at the fog level. The fog-level aggregates are then handled at the cloud level, and privacy is maintained via a simple two-layer encryption.
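The details of their scheme are beyond this article, but the core idea of a Gaussian mechanism is simple: each fog node adds calibrated Gaussian noise to the aggregate it computes before anything leaves the edge. The sketch below is a minimal illustration of that idea, not Lyu et al.’s protocol; the sensitivity, epsilon, and delta values and the `aggregate_at_fog` helper are assumptions for the example, and the encryption layers are omitted.

```python
import numpy as np

def gaussian_noise_scale(sensitivity: float, epsilon: float, delta: float) -> float:
    """Standard deviation for the Gaussian mechanism used in
    (epsilon, delta)-differential privacy."""
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def aggregate_at_fog(meter_readings_kwh, sensitivity=5.0, epsilon=1.0, delta=1e-5):
    """Sum the readings from the meters attached to one fog node and add
    calibrated Gaussian noise before reporting upstream (illustrative values)."""
    true_sum = float(np.sum(meter_readings_kwh))
    sigma = gaussian_noise_scale(sensitivity, epsilon, delta)
    return true_sum + np.random.normal(0.0, sigma)

# Each fog node reports only its noisy aggregate; the cloud sums the
# partial aggregates for grid-level statistics.
fog_reports = [aggregate_at_fog(np.random.uniform(0.1, 5.0, size=200)) for _ in range(10)]
cloud_total = sum(fog_reports)
```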

While interoperability has steadily improved since the National Institute of Standards and Technology (NIST) established the Smart Grid Interoperability Panel, there are still issues when devices and protocols from multiple vendors commingle along the grid’s network. Protocol conflicts can cause interoperability issues that are not easily resolved. While standards are certainly necessary and invaluable, another way to address continuity is the adoption of software-defined networking (SDN). In addition to providing protocol independence, SDN is programmable and granular.
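To make “programmable and granular” concrete, the sketch below models a simplified match-action flow table of the kind an SDN controller pushes down to switches. The class names, match fields, and rules are hypothetical and not tied to any particular controller or southbound protocol; the point is only that forwarding behavior becomes data the controller can rewrite, independent of the vendor protocols beneath it.

```python
from dataclasses import dataclass

@dataclass
class FlowRule:
    """A simplified match-action rule, independent of the underlying device protocol."""
    match: dict          # e.g. {"traffic_class": "metering"}
    action: str          # e.g. "forward:fog-gw-1", "drop", "mirror:ids"
    priority: int = 0

class MiniController:
    """Toy controller: keeps a rule table per switch and answers lookups."""
    def __init__(self):
        self.tables: dict[str, list[FlowRule]] = {}

    def install(self, switch_id: str, rule: FlowRule) -> None:
        self.tables.setdefault(switch_id, []).append(rule)
        self.tables[switch_id].sort(key=lambda r: r.priority, reverse=True)

    def lookup(self, switch_id: str, packet_meta: dict) -> str:
        for rule in self.tables.get(switch_id, []):
            if all(packet_meta.get(k) == v for k, v in rule.match.items()):
                return rule.action
        return "send_to_controller"   # table miss: punt to the controller

# Steer smart-meter traffic to a fog gateway regardless of the vendor gear below.
ctl = MiniController()
ctl.install("substation-sw1", FlowRule({"traffic_class": "metering"}, "forward:fog-gw-1", 10))
print(ctl.lookup("substation-sw1", {"traffic_class": "metering", "src": "meter-042"}))
```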

As such, interoperability conflicts must be addressed at multiple levels, such as the application layer and the communication layer, and SDN may be useful here. Another idea is converting to an IP network, which then allows different kinds of networks and devices to be integrated. Interoperability will be an ongoing challenge, and it remains to be seen how smart grid systems, subsystems, and vertical IoT apps will be blended to resolve it going forward.

Another area of contention is the algorithmic resource allocation strategy for the IoE. Among the algorithms put forth are round-robin (RR), particle swarm optimization (PSO), cuckoo optimization, PSO with simulated annealing (PSOSA), first fit, throttled, first come first serve, and ant colony optimization. Different algorithms have shown varied success across smart homes, smart buildings, and architectures in general, so a consensus choice for resource allocation has yet to emerge.
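To ground the comparison, here is a minimal sketch of the simplest of these strategies, round-robin, assigning incoming tasks to fog nodes in a fixed rotation. The node names and task list are placeholders; real schedulers would weight decisions by load, latency, and energy.

```python
from itertools import cycle

def round_robin_assign(tasks, nodes):
    """Assign each incoming task to the next fog node in a fixed rotation."""
    rotation = cycle(nodes)
    return {task: next(rotation) for task in tasks}

tasks = [f"meter-batch-{i}" for i in range(7)]
nodes = ["fog-node-a", "fog-node-b", "fog-node-c"]
for task, node in round_robin_assign(tasks, nodes).items():
    print(task, "->", node)
```

In a more tangible sense, there has already been some compelling work done in the realm of forecasting.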

Forecasting Algorithm Variations

One interesting application of machine learning is smart grid forecasting. Day-ahead load forecasting helps the smart grid make better decisions about fuel and energy transactions, generating-capacity scheduling, and even security evaluations. In particular, the intermittency of renewable generation makes it harder for the smart grid to regulate load voltage (or simply load, for short); if regulation fails, reverse power flow, or backfeeding, results.
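A rough way to see the problem: backfeeding occurs when local generation exceeds local demand, so net load at a feeder goes negative. The check below uses made-up hourly values purely for illustration and is not a power-flow calculation.

```python
# Hourly demand and rooftop-solar generation for one feeder (kW, illustrative values).
demand_kw    = [320, 300, 280, 350, 410, 390]
solar_gen_kw = [  0,  80, 260, 420, 380, 150]

for hour, (load, gen) in enumerate(zip(demand_kw, solar_gen_kw)):
    net = load - gen
    status = "backfeeding (reverse power flow)" if net < 0 else "normal"
    print(f"hour {hour}: net load {net:+} kW -> {status}")
```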

To address this issue, Luo et al. proposed varying the data center’s load in step with renewable energy fluctuations as a counterbalance. By forecasting renewable energy dynamics and predicting delays with the machine learning framework Keras, they were able to reduce operational costs and improve efficiency across weather conditions while stabilizing the grid. Another team, Jurado et al., found forecasting success with a different machine learning method, Flexible Fuzzy Inductive Reasoning (implemented with K Nearest Neighbor Optimal Selection).
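Their model is not reproduced here, but a minimal Keras sketch of the general pattern, forecasting the next hour from a sliding window of past measurements, might look like the following. The synthetic series, window length, and layer sizes are assumptions for illustration only.

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in for hourly renewable output; a real model would use
# measured generation data plus weather features.
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * np.random.randn(2000)

WINDOW = 24  # use the previous 24 hours to predict the next hour
X = np.array([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),  # next-hour forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

next_hour = model.predict(series[-WINDOW:].reshape(1, WINDOW), verbose=0)
```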

A modular approach can also be taken for short-term load forecasting. To elaborate, a team at the University of Newcastle used a forecasting model with “(i) a pre-processing module; (ii) a forecast module; and (iii) an optimization module.” The first removes extraneous inputs, the second trains and runs the neural network, and the third uses heuristics to improve accuracy. They report 98.76% load prediction accuracy with this approach, better than other nonlinear models such as Markov chain-based and stochastic distribution-based models.
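That three-module structure can be sketched as simple function composition. The cleaning rule, the naive forecaster standing in for the neural network, and the bias-correction heuristic below are placeholders rather than the Newcastle team’s implementation.

```python
import numpy as np

def preprocess(raw_loads):
    """Module (i): drop obviously bad samples (here: negative or missing readings)."""
    data = np.asarray(raw_loads, dtype=float)
    return data[(data >= 0) & ~np.isnan(data)]

def forecast(history, horizon=24):
    """Module (ii): stand-in for the trained neural network -- a naive
    'same as the previous day' prediction."""
    return history[-horizon:].copy()

def optimize(prediction, bias_correction=0.0):
    """Module (iii): heuristic post-correction, e.g. subtracting a recent mean bias."""
    return prediction - bias_correction

history = np.random.uniform(200, 400, size=24 * 7)   # one week of hourly load (kW)
clean = preprocess(history)
yesterday_pred = forecast(clean[:-24])                # forecast the last known day...
bias = float(np.mean(yesterday_pred - clean[-24:]))   # ...and measure its error
day_ahead = optimize(forecast(clean), bias_correction=bias)
```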

For the difficult task of empirically evaluating the performance of algorithmic strategies in general, as well as virtual machine configurations, modeling tools such as CloudSim, CloudAnalyst, MATPOWER, and SimIoT have been used with great success. Unfortunately, simulation options specific to the IoT remain limited.

Demands on the smart grid will only climb as IoT devices and data usage grow. Resource constraints, scalability issues, lack of standardization, data bottlenecks, packet loss, immature IoT application development, outdated networking architectures, and a number of other issues still need to be addressed, especially as new generations of devices come onto the grid, so it is crucial to stay current.