Debriefing from Davos: Thank you for your questions!

Jos Schuurmans 26-01-2021 Energy Virtual Power Plants
Photo by Damian Markutt on Unsplash. Edited by Palash Mukhopadhyay.


Our CEO Henri Kivioja, Head of Product Pekka Immonen and I spent a good chunk of last week watching presentations, listening to panel discussions, and conversing with attendees at Davos Energy Week.

Our goal was to learn more about the transformation towards distributed energy production that has been set in motion, especially with renewable energy, and how this will move Energy Management Systems (EMSs), smart grids and microgrids, Distributed Energy Resources (DER), and Virtual Power Plants (VPPs) towards more flexible and real-time balancing, production and consumption.

We wanted to learn how, with our RAIN platform and our services business, we can best enable system providers, operators and aggregators of distributed energy production to put the necessary distributed 5G edge-to-cloud AI in place. What do they need to figure out in order to make the right investments? What information do they need? What questions do they have?

(I tried to give some background to those questions in a blog post at the start of the conference.)

And questions we received! Dozens of them, in fact. So here’s a big Thank You to all those representatives of governments, consulting firms, technology companies, green energy lobby groups, industry associations, oil, gas, nuclear, solar, wind, hydro and what-have-you energy producers, as well as consumer groups, for chatting with us!

Energy goes dynamic

For one thing, we learned that Virtual Power Plants are not on everybody’s radar yet, but many of the challenges put forward by the transformation towards distributed energy certainly are. There was a lot of talk about how to upgrade electricity grids with fast data capabilities, how to integrate the supply of renewable energy sources into existing infrastructure, and how to make consumption more predictable and flexible.

The current paradigm of power grids and power plants was designed and built after the Second World War. It is efficient and reliable. It works fine for power on demand, as long as our societies find it acceptable to meet that demand with a small number of large-scale nuclear and fossil energy sources.

But energy markets are becoming ever more dynamic. On the supply side, a diverse range of renewable energy sources is entering the playing field. Wind and solar cannot be dispatched on demand, which means that some of their output needs to be stored in batteries. This brings new challenges around low inertia in the grid, which in turn requires fast-response reserve capabilities - and those require more data intelligence in the network.

On the demand side, smart buildings, smart cities, smart transportation, and communities of prosumers want to be able to purchase whatever type of electricity they want, whenever they want it, from whichever provider they choose - at specific price points.

The OT-IT-telco angle

This transition requires more fine-grained control, which requires more intelligence throughout the system, which requires digital transformation.

For some years now, Operational Technology (OT) and Information Technology (IT) providers have contributed to digital transformation in various industries by connecting equipment to centralized servers and cloud services so that data from their machines can be processed centrally.

This means that intelligence has been steadily rolling out across industrial processes (think Industrial Internet of Things - IIoT), supply and logistics chains, and lately even within hospitality and retail.

But we at Lempea, the team behind RAIN, come in from a different angle - at least in part. Yes, we know as much as the next gal about cloud computing and data processing. We actually know more than most about embedded industrial software and how to connect embedded systems to the cloud.

What really sets us apart is our IT and cloud expertise in combination with our strong background in telecommunications technology.

Another smart energy grid

Mobile networks have always been distributed, in more ways than one.

The data is generated, computed and sent off from edge devices - some of them being known as mobile phones. :-)

It is computed, directed and passed through via base stations. Some data is sent on to cloud computers for central processing, but much of it is not. All kinds of metadata and environmental data are produced by hardware throughout the network to keep things working.

And guess what? The telco grid is also a distributed energy grid. Every mobile phone and every base station has its own power supply, be it rechargeable batteries or a direct connection to the electricity network.

Every piece of hardware needs to know how much power it has available, and it needs to send alerts to its users or operators whenever energy becomes an issue. That could be a “low battery” warning on your phone, a power shortage alert appearing on a cloud dashboard at “air traffic control” of a mobile operator, or a signal within the computing unit of a base station that triggers a switch so that the station’s battery starts being recharged by the power grid.
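The threshold-and-alert logic described above can be sketched in a few lines. This is a hypothetical illustration - the `PowerStatus` structure, device IDs, and threshold values are invented for the example, not taken from any real telco stack:

```python
from dataclasses import dataclass

# Assumed thresholds, for illustration only.
LOW_BATTERY_PCT = 20.0   # when to warn the user or operator
RECHARGE_PCT = 30.0      # when a base station should switch to grid recharge

@dataclass
class PowerStatus:
    device_id: str
    battery_pct: float
    on_grid_power: bool

def check_power(status: PowerStatus) -> list[str]:
    """Return the alerts or actions a device would raise for its operator."""
    actions = []
    if status.battery_pct < LOW_BATTERY_PCT:
        actions.append(f"{status.device_id}: low-battery warning")
    if not status.on_grid_power and status.battery_pct < RECHARGE_PCT:
        actions.append(f"{status.device_id}: switch to grid recharge")
    return actions
```

A device at 15% battery running off-grid would raise both the warning and the recharge switch; the same logic, aggregated across thousands of devices, is what ends up on that "air traffic control" dashboard.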

The future is edge-to-cloud

Sorry for wandering off a bit here. The point I’m trying to make is that this telco paradigm of intelligent data processing and routing at the edges AND the cloud - which we sometimes refer to as the “edge-to-cloud continuum” - is the future of the energy system. It requires a horizontally distributed software platform on top of which AI algorithms can compute and reduce the vast amount of data that is being generated, and 5G data speeds to make it all available and actionable in real time.

So, with that rant out of the way (phew!), let’s unpack some of the questions we heard most often at Davos. It could be the start of an FAQ or glossary. Who knows, in conversations from here on, it might help us get onto the same page more quickly.

What is a Virtual Power Plant?

A Virtual Power Plant (VPP) is cloud software that aggregates many distributed energy assets and operates them as if they were a single power plant. It represents every asset in the real, physical infrastructure and enables operational control of the whole network. It provides a way to combine different sources of energy production, including nuclear, fossil, and renewables.

Is RAIN a Virtual Power Plant?

No. Our software can add essential capabilities to a VPP cloud solution, namely fast actuation of energy supply based on real-time data from the edges (supply AND consumption) as well as the cloud.

What problem does RAIN solve?

Our basic value proposition for power infrastructure is that we enable fast actuation of frequency reserves in response to low-inertia situations. In practice this means providing data computing capabilities close to distributed energy sources, as well as data connectivity throughout the edge-to-cloud continuum.

What is the issue with low inertia and fast frequency reserves?

Electricity grids need to maintain a frequency of around 50 Hz. In traditional grids, power plants need time to fire up or shut down, and the rotating mass of their generators provides inertia, which dampens any sudden fluctuations in frequency.

When consumption exceeds supply, the grid's frequency drops. This is compensated by mobilizing fast frequency reserves - traditionally by burning more oil, gas, or coal.

However, renewable energy such as wind and solar relies in part on battery storage, and batteries have no inertia. In the future, solar and wind will mean lower inertia and more fluctuation, while at the same time fast frequency reserves will increasingly come from battery power.
Balancing the frequency will require real-time demand response, which is not possible without edge computing and 5G connectivity.
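As a rough illustration of what fast, local balancing logic looks like, here is a generic droop-style sketch - not RAIN's actual algorithm; the deadband, droop gain, and reserve size are invented numbers:

```python
NOMINAL_HZ = 50.0
DEADBAND_HZ = 0.05       # assumed: no action within +/- 0.05 Hz
MAX_RESERVE_MW = 10.0    # assumed battery reserve capacity
DROOP_MW_PER_HZ = 50.0   # assumed proportional response gain

def reserve_response_mw(measured_hz: float) -> float:
    """Power a battery reserve injects (+) or absorbs (-) to push
    the grid frequency back towards nominal."""
    deviation = NOMINAL_HZ - measured_hz
    if abs(deviation) <= DEADBAND_HZ:
        return 0.0
    response = deviation * DROOP_MW_PER_HZ
    # Clamp to the available reserve capacity.
    return max(-MAX_RESERVE_MW, min(MAX_RESERVE_MW, response))
```

The point is not the arithmetic, which is trivial, but where and how fast it runs: in a low-inertia grid this decision has to be made close to the battery, on fresh measurements, which is exactly what edge computing and 5G connectivity enable.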

When do I benefit most from computing at the edge?

There are many use cases for edge computing, and the most compelling ones are combined with artificial intelligence and 5G connectivity. To name a few:

  • When low latency, or fast response time, is a requirement. Say a sensor notices that a motor is starting to overheat; latency is the time that passes before the data is processed and the motor is stopped.
  • When large amounts of data are generated, those data can be computed and reduced near the source to save costs in data transfer bandwidth, storage, and processing in the cloud.
  • When privacy-sensitive data are generated, for example by cameras and microphones, AI software can compute anonymized conclusions. Those conclusions can be sent to a cloud destination while the original data are never stored.
  • Where availability is critical, local operations can remain available even when a connection to the cloud is interrupted. Cloud data transfers can be queued and re-initiated when connectivity is re-established.
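Two of these patterns - local data reduction and store-and-forward queueing - can be sketched together. Everything here (the `EdgeNode` class, the summary format) is a made-up minimal example, not a real edge SDK:

```python
from collections import deque
from statistics import mean

class EdgeNode:
    """Minimal edge node: reduces raw samples locally and queues
    uploads while the cloud connection is down."""

    def __init__(self, cloud_up: bool = True):
        self.cloud_up = cloud_up
        self.queue: deque = deque()  # pending uploads while offline
        self.sent: list = []         # stand-in for a real cloud client

    def ingest(self, samples: list[float]) -> None:
        # Reduce raw samples to a compact summary before any transfer,
        # saving bandwidth, storage, and cloud processing.
        summary = {"n": len(samples), "mean": mean(samples), "max": max(samples)}
        if self.cloud_up:
            self.sent.append(summary)
        else:
            self.queue.append(summary)

    def reconnect(self) -> None:
        # Flush queued summaries once connectivity is re-established.
        self.cloud_up = True
        while self.queue:
            self.sent.append(self.queue.popleft())
```

While offline, the node keeps ingesting and summarizing; on reconnect, the queued summaries are delivered in order, so local operations never had to stop.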

I have a unique data challenge. How can RAIN help me address it?

With our platform as a base, together with our customers we can quickly build case-specific data applications, for example for renewable energy integration, carbon emission reduction, demand response of fast energy reserves, fast frequency balancing, and smart buildings’ energy consumption.

As always, feel absolutely free to reach back with comments or questions. We love questions! :-)