Things, IoT & Digital Twins for the Energy & Utilities sector

Internet of things

Written by Frederik Baert

7 April 2021

Things & Digital Twins?

What is the Internet of Things, often referred to as IoT? According to Wikipedia, the Internet of Things (IoT) describes the network of physical objects—“things”—that are embedded with or connected to sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the Internet.

According to Gartner, an IoT Platform performs the following conceptual roles:

  • Sense (often resulting in timeseries data)
  • Communicate the data (transport to the cloud)
  • Analyze the data (ingest, persist, integrate, aggregate, visualize, augment)
  • Act on the data (event-driven actions).

What is a Digital Twin? A digital twin is a virtual representation that serves as the real-time digital counterpart of a physical object or process. You can think of it as a bridge between the digital and physical world. A digital twin is a “virtual copy” stored in the cloud that is enriched with data collected in real time or near real time.
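As a minimal sketch of this idea (the class and field names are hypothetical, not any specific product's model), a digital twin can be thought of as an object that mirrors the last known state of its physical counterpart and is refreshed as new sensor readings arrive:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DigitalTwin:
    """Virtual counterpart of a physical asset, kept in sync with sensor data."""
    asset_id: str
    state: dict = field(default_factory=dict)   # last known value per metric
    last_updated: Optional[datetime] = None

    def apply_reading(self, metric: str, value: float) -> None:
        # Each incoming sensor reading refreshes the twin's view of the asset.
        self.state[metric] = value
        self.last_updated = datetime.now(timezone.utc)

twin = DigitalTwin(asset_id="solar-inverter-01")
twin.apply_reading("power_kw", 3.2)
twin.apply_reading("temperature_c", 41.5)
print(twin.state)  # {'power_kw': 3.2, 'temperature_c': 41.5}
```

A real twin would of course carry far more (documentation, relations, rules), but the synchronization pattern is the core of it.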

Both Things and Digital Twins are brought together by data from many sources, such as sensors, which are synchronized to offer real-time status and insights.

IoT & Digital Twins applied to Energy & Utilities

Let us apply these concepts specifically to Energy Assets. Within the Energy landscape, we can roughly divide Energy Assets into two types of “resources”:

  • Distributed Energy Resources (DER): resources that mainly “produce” energy in the form of electricity. Some examples are solar installations, wind turbines, CHPs, battery storage, and hydro-based turbines.
  • Demand-Side Resources (DSR): resources that are energy-intensive, but can be steered, optimized, or used for balancing purposes from a flexibility perspective. Some examples are lighting, HVAC, buildings, and EV chargers.

If we look at these resources, a digital twin representation in combination with IoT technology can deliver some benefits for companies by:

  • Having real-time analysis and insight into critical efficiency parameters, preventing downtime, and applying predictive analytics and efficiency optimization
  • Maximizing asset performance, reducing service costs
  • Managing and activating distributed energy resources (DER) based on different external inputs like pricing signals, demand-supply data, or weather forecast data.

Why now?

A more diverse portfolio of renewable energy assets is required to enable the transition to sustainable, affordable, and reliable energy.

This transition brings a multitude of challenges around the reliability, resilience, and security of energy assets. It requires new solutions for management and operations to replace current traditional asset management processes and tools (often still in separate systems).

The increasing deployment of renewable energy assets (DER) and growing complexity of these assets require a new type of architecture to cope with a large diversity of sensors, large amounts of data, and machine learning (ML) to utilize this data for analysis, predictions, and learning from historical data.

In the research for this article, I found a very useful reference model: the Digital Utility Asset Management Model (by Deloitte: Digital Utility Asset Management):

Digital utility asset management

Deloitte defines it as the physical-digital-physical (PDP) loop, which consists of nine phases:

  • Create: generate data that capture conditions in the physical world from sensors monitoring energy assets.
  • Communicate: transmit the data to a central processing system.
  • Integrate/Aggregate: combine data in a common repository, enable user access and system/application integration, and add external data and standardize as required.
  • Analyze: deploy the appropriate tools and processes to harness available data and provide actionable insights that can drive asset strategies to balance risk, cost, and compliance.
  • Visualize: use data and analytics to create integrated, real-time dashboards to inform decisions. This can support situational awareness, predict asset risks, and inform proactive action. The digital twin, a digital profile of the asset, can provide insights into its historical, current, and expected behavior and help forecast its health and performance over its lifetime.
  • Augment: use cognitive capabilities such as machine learning to analyze current and historical data and predict an asset’s future condition and behavior.
  • Automate: augmented-behavior technologies move the process into the digital-to-physical side of the model, where they can trigger automated action by other devices.
  • Act: humans use insights provided by the PDP loop to inform decisions about where and what to inspect, monitor, repair, replace, or retire.
  • Assess: measure benefits and ROI, and decide whether to finish a project or revise and refresh the strategy.

Now that the “journey” is described from a high-level perspective, how do you translate this into a Roadmap and Solution Architecture for a real implementation?

What should an IoT Architecture look like?

The following IoT Architecture illustrates what an IoT Business Solution should look like and which minimal set of functional capabilities is required. You can use this as a guide in your search for IoT solution(s).

Things & IoT

Things are anything related to Distributed Energy Resources and Demand-Side Resources as explained before.

Edge Compute or Fog Compute

The IoT Edge is deployed near things, and processes data locally where applicable to support local decisions, even when the connection to the cloud is temporarily unavailable.


The IoT layer is deployed centrally, either in the cloud or in the datacenter. Within the IoT layer, the heavy-lifting decisions are taken that require more contextual data and more compute time.

Functional Capabilities of an Integrated IoT Solution

Device Control

IoT Device Control covers the provisioning, authentication, configuration, maintenance, monitoring, and diagnosis of connected devices. It underpins the operation of all functional capabilities of the solution.

Effective device control is critical to keep the assets under management connected, up to date, and secure.


Within an IoT architecture, it is important that the solution supports multiple protocols. This is important on a couple of levels:

  • Protocols and communication between gateway and sensors, some examples: M-Bus, Modbus, KNX, or BACnet.
  • Protocols between gateway and IoT platform or other solution(s): MQTT, AMQP, WebSockets, REST. Network support – for example, LoRaWAN.
  • Protocols between the IoT platform and other solution(s)

Data Handling

Data handling in an IoT platform is performed at different levels, starting from data ingestion (often through streaming) to the persistence and organization of data. Due to the massive amount of data generated by a huge number of Things, attention needs to be paid to scalability (large volumes of data), latency (real-time response), and the distribution of data handling between IoT and IoT Edge.

Integration / Transformation

Because of the diversity of Things to communicate with, it is important to ensure that data can be transformed in the proper formats for further processing and interpretation in the IoT platform or on the Edge. It should be possible to easily develop connectors for integration and transformation purposes.
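To make the idea of such a connector concrete, here is an illustrative sketch (the vendor field names and canonical format are invented for the example) that maps one vendor-specific payload into a uniform asset/metric/value shape:

```python
# Sketch of a connector: maps a vendor-specific payload into a canonical
# format the rest of the platform understands. Field names are hypothetical.

CANONICAL_METRICS = {"E_act": "electricity_kwh", "G_vol": "gas_m3"}

def transform(vendor_payload: dict) -> list:
    """Flatten a vendor reading into canonical (asset, metric, value) records."""
    records = []
    for key, value in vendor_payload["readings"].items():
        metric = CANONICAL_METRICS.get(key)
        if metric is None:
            continue  # unknown field: skip rather than guess
        records.append({
            "asset": vendor_payload["device"],
            "metric": metric,
            "value": float(value),
        })
    return records

print(transform({"device": "meter-7", "readings": {"E_act": "12.5", "G_vol": "3.1"}}))
```

In practice each device family would get its own small mapping like `CANONICAL_METRICS`, which is what keeps connectors cheap to develop.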


Analytics

Analytics must provide for the processing of data streams, enriching the IoT-generated data with contextual data (digital twin) and/or business process-related data, using rules and algorithms to provide real-time insights into the monitored assets and equipment. Terms to look for: rule engines, event stream processing, machine learning, and data visualization.
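As a minimal sketch of the rule-engine idea (rules, metric names, and thresholds are all made up for illustration), each rule can be a small predicate over an incoming reading, and evaluation yields the alerts it triggers:

```python
# Minimal rule-engine sketch: each rule is a threshold on one metric, and
# evaluating a reading yields the alerts it triggers. Values are illustrative.

def make_rule(metric, limit, message):
    def rule(reading):
        if reading.get("metric") == metric and reading["value"] > limit:
            return message
        return None
    return rule

rules = [
    make_rule("temperature_c", 80.0, "over-temperature"),
    make_rule("gas_m3_per_h", 5.0, "excessive gas consumption"),
]

def evaluate(reading):
    # Collect the messages of every rule that fires for this reading.
    return [msg for rule in rules if (msg := rule(reading)) is not None]

print(evaluate({"metric": "temperature_c", "value": 95.0}))  # ['over-temperature']
```

Production rule engines add priorities, hysteresis, and rules over multiple metrics, but the evaluate-per-event shape stays the same.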

Application Enablement

IoT in itself is just part of an end-to-end solution. An IoT platform should be easily configurable to enable business applications to work with the generated information. From a data perspective, the modeling of digital twins is an important aspect; next to that, it should be easy to develop new solutions and integrate using APIs.

How byNeuron ENERGY implements IoT and enables end-to-end solutions for Energy & Utilities

Data is the cornerstone of energy asset management. It drives the efficiency of operations and grids and impacts strategies and business models. The traditional structure of energy management involves data silos across different system software, which makes it difficult to connect platforms and share resources.

Energy Data Integration Platform

To avoid data silos, byNeuron integrates all relevant data management capabilities in one integrated solution. The byNeuron ENERGY platform spans multiple areas: event & stream processing, persistence using a fast, horizontally scalable underlying database, and organization through the Energy Information Model.

From an organization perspective, the Energy Information Model covers the relevant energy data concepts, ranging from parties, sites, and assets to products and contracts. It provides a configurable (data-driven) and extensible data model to use for platform integration purposes or functional solution implementation.


Next to the Energy Information Model, the byNeuron Platform includes a fast timeseries database with a calculation engine that allows customers to model and deploy their own (complex) calculation models on top of the timeseries data, including very complex, rule-driven calculations that combine timeseries and contextual master data. The calculation modeler is a low-code solution, accessible and understandable for the knowledge worker.
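To illustrate the kind of derived calculation such an engine runs (this sketch is not byNeuron's actual API; the data shapes and the specific-yield example are assumptions), a timeseries can be combined with contextual master data to produce a new timeseries:

```python
# Sketch of a derived calculation over a timeseries plus master data:
# specific yield of a solar installation (kWh produced per kWp installed).
# The data model and field names are illustrative.

production_kwh = [(0, 10.0), (1, 12.0), (2, 8.0)]  # (interval index, value)
master_data = {"installed_capacity_kwp": 5.0}

def specific_yield(series, capacity_kwp):
    """Combine a timeseries with master data into a new derived timeseries."""
    return [(t, v / capacity_kwp) for t, v in series]

print(specific_yield(production_kwh, master_data["installed_capacity_kwp"]))
# [(0, 2.0), (1, 2.4), (2, 1.6)]
```

In a low-code modeler, the analyst would define this mapping declaratively instead of writing the loop by hand.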

Events & Streams processing is another important component of the Energy Data Integration Platform. It takes care of the processing of incoming data and events, including filtering, validation, and transformation, and ensures that events are processed and/or stored. Streams processing also includes enrichment, calculations, and aggregations on one or multiple events and timeseries data, such as average values over a certain time period, or complex event processing.
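A windowed average, mentioned above, can be sketched as follows (purely illustrative; a streaming engine would do this incrementally rather than over a list in memory):

```python
# Stream-aggregation sketch: average incoming event values over a fixed time
# window before they move further down the pipeline.

from collections import defaultdict

def window_averages(events, window_s=120):
    """Group (timestamp_s, value) events into fixed windows and average each."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[ts // window_s].append(value)
    # Key each result by the window's start timestamp.
    return {w * window_s: sum(v) / len(v) for w, v in sorted(buckets.items())}

events = [(0, 2.0), (30, 4.0), (130, 6.0), (150, 10.0)]
print(window_averages(events))  # {0: 3.0, 120: 8.0}
```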

What does this look like in practice?

In order to understand how this works, let us dive into a use case:

Take a customer who has one or more DER assets. These could be solar installations, CHPs, micro CHPs (based on fuel cell technology), wind turbines, hydro turbines, or any other sort of renewable energy resource.


For this example, we use a FuelCell, as this technology consumes gas and generates electricity and heat from it. In order to validate the proper operation of such a FuelCell, a set of parameters needs to be monitored and validated; the following is a minimal set:

  • Gas Consumption
  • Electricity Production
  • Heat Production

This data is captured using “submeters” or “sensors”, if you will. Here we are confronted with a first “issue”: which protocol(s) do the “sensors” speak? Our recommendation is to use submeter technology that is not proprietary, for example meters that support a protocol like KNX, M-Bus, or Modbus, as these are properly documented formats that can be interpreted if implemented correctly.

Edge Compute

A gateway is required to understand these protocols and translate them into a more generic format to exchange data on the “asset”, “metrics”, and “values”. The gateway sends the data at configurable intervals to the central IoT platform using MQTT, a low-bandwidth-optimized protocol that supports real-time or near-real-time data communication.
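The generic message the gateway publishes could look like the sketch below (the topic scheme and payload fields are assumptions for illustration, not a standardized format):

```python
# Sketch of the generic "asset / metrics / values" message a gateway could
# publish over MQTT after translating the local sensor protocols.

import json
from datetime import datetime, timezone

def build_message(asset_id: str, metrics: dict):
    topic = f"site/gateway-01/{asset_id}/telemetry"  # hypothetical topic scheme
    payload = json.dumps({
        "asset": asset_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "metrics": metrics,  # e.g. {"gas_m3": 0.42, "power_kw": 1.5}
    })
    return topic, payload

topic, payload = build_message("fuelcell-01", {"gas_m3": 0.42, "power_kw": 1.5})
print(topic)  # site/gateway-01/fuelcell-01/telemetry
# An MQTT client library (e.g. paho-mqtt) would then publish(topic, payload)
# at the configured interval.
```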


In case you use any kind of proprietary technology, ensure that the platform that gathers the data is able to communicate it to other platforms, either by means of APIs (with the disadvantage that this is not real-time or near-real-time) or via MQTT (for real-time purposes).

Digital Twin & Analytics

In order to provide context to the user of the platform, the FuelCell needs to be modeled as a “digital twin” so that all parameters can be visualized and interpreted in context. The FuelCell is modeled as an asset with a hierarchical relation to its sensors/submeters, and all timeseries (metrics/measurements) are linked with a certain role type to provide additional context.
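That hierarchy can be sketched roughly as follows (class names, role types, and the timeseries-identifier scheme are invented for the example):

```python
# Sketch of a FuelCell digital twin: an asset with child sensors, where each
# timeseries is linked under a role type that gives it meaning in context.

class Asset:
    def __init__(self, name, asset_type):
        self.name = name
        self.asset_type = asset_type
        self.children = []      # hierarchical relation to sensors/submeters
        self.timeseries = {}    # role type -> timeseries identifier

    def add_sensor(self, sensor, role):
        self.children.append(sensor)
        self.timeseries[role] = f"{self.name}/{sensor}"

fuelcell = Asset("fuelcell-01", "FuelCell")
fuelcell.add_sensor("gas-submeter", "gas_consumption")
fuelcell.add_sensor("e-submeter", "electricity_production")
fuelcell.add_sensor("heat-submeter", "heat_production")
print(fuelcell.timeseries["gas_consumption"])  # fuelcell-01/gas-submeter
```

The role types are what later let a dashboard or rule ask for "the gas consumption of this asset" without knowing which physical submeter delivers it.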

As you define the “digital twin”, additional specifications, relevant documentation, and parameters can be configured in the platform. At the same time, a set of rules can be defined on how assets should behave in certain circumstances. These rule definitions are important, as they instruct how events and streams are handled.

Events / Streams & Timeseries

Based on the real-time or near-real-time measurement of the three parameters we identified for this example, this data needs to be ingested into the IoT platform. Events & Streams are used to identify the metrics and enrich the data with additional context. One of these enrichments could be the decision whether or not to store the data for further processing.

Why is this important?

Imagine your platform sends metrics data continuously, for example every second; you don’t necessarily want this event data to be stored every second. Although storage and processing costs in the cloud have dropped significantly, huge amounts of data can still lead to high costs, so evaluate whether it is necessary to store every raw event on disk.

Using event streams and enrichment, we are perfectly capable of defining what needs to be done with the data: store it, buffer it, calculate on it, or perform any other kind of operation. A specific example of a more complex event could be an alert raised when different metrics exceed certain thresholds (based on predefined rules).
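An enrichment step making that store/drop/alert decision could be sketched like this (the interval and threshold values are made up for the example):

```python
# Sketch of an enrichment step deciding what to do with each raw event:
# persist at most once per interval, but always forward threshold breaches.

def decide(event, last_stored_ts, store_interval_s=120, alert_limit=100.0):
    """Return 'alert', 'store', or 'drop' for an incoming (ts, value) event."""
    ts, value = event
    if value > alert_limit:
        return "alert"                      # breaches always pass through
    if ts - last_stored_ts >= store_interval_s:
        return "store"                      # first event of a new interval
    return "drop"                           # raw second-by-second data: skip

print(decide((121, 40.0), last_stored_ts=0))   # store
print(decide((60, 40.0), last_stored_ts=0))    # drop
print(decide((60, 150.0), last_stored_ts=0))   # alert
```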

Back to our example: using the measurement values, we decide to store these as timeseries (every 2 minutes) in the database. This data will then be used for derived calculations like thermal and electrical efficiency, resulting in new timeseries.
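The efficiency calculation itself is straightforward once the three timeseries are in the same energy unit. In this sketch, 1 m³ of natural gas is assumed to hold roughly 10 kWh of energy (the real calorific value depends on gas quality and region), and the interval figures are invented:

```python
# Illustrative efficiency calculation derived from the three stored timeseries.
# GAS_KWH_PER_M3 is an assumed calorific value, not a universal constant.

GAS_KWH_PER_M3 = 10.0

def efficiencies(gas_m3, electricity_kwh, heat_kwh):
    gas_kwh = gas_m3 * GAS_KWH_PER_M3   # convert gas volume to energy input
    return {
        "electrical": electricity_kwh / gas_kwh,
        "thermal": heat_kwh / gas_kwh,
        "total": (electricity_kwh + heat_kwh) / gas_kwh,
    }

# One 2-minute interval: 0.05 m3 gas in, 0.2 kWh electricity and 0.25 kWh heat out
print(efficiencies(0.05, 0.2, 0.25))
```

Running this per stored interval yields the new efficiency timeseries that the dashboards and rules can then work with.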

Real-Time Insights

All these timeseries are then visualized using different kinds of dashboards, charts, and widgets, depending on the use case, to visually assist the user of the platform in correctly interpreting the data of the energy asset(s).


The technician is now capable of understanding whether a FuelCell is behaving correctly and is alerted when the defined rules are triggered.

At the same time, using the same timeseries but visualized in a less technical way, the end customer sees how much heat and electricity the FuelCell has produced, to validate their own business case.

Innovate & Operate Efficiently

Decarbonization trends are leading to developments in clean technologies, highlighting the importance of connected assets, as well as the need for flexibility to consume the power produced by renewable energy assets (article in Dutch).

Buildings and Energy applications will be a major driver in the Digital Twin evolution as the rise of (renewable generated) power consumption and rapid urbanization will fuel the technology demand.

Modern and agile energy players will be able to unlock growth at a lower cost by implementing an end-to-end IoT Business solution specifically targeted at the Energy Market.

Therefore zelospark has developed byNeuron ENERGY, a unique vertically integrated and modular cloud platform that combines IoT – specifically designed for energy assets – with your commercial and operational business processes into one scalable cloud solution, which integrates seamlessly in your existing landscape.

