Quotes:
The energy system is therefore progressively evolving from a centrally operated and optimized system into a federated system … As a consequence, the associated digital infrastructures need to accelerate their transformation from a few central monolithic control room environments … towards new orchestrated platform architectures taking advantage of IoT, edge computing, and hybrid private and public cloud architectures, where real-time data exchange, interoperability, and open Application Programming Interfaces have become critical technology building blocks enabling plug-and-play data interfaces.
Enterprise service buses (ESBs) and other integration solutions were used to deploy the first Service Oriented Architectures (SOA) and decouple systems. However, the growing number of connectors, together with the requirement to feed a variety of heterogeneous data-consuming interfaces in real time, has left applications, system components, and enterprise services closely intertwined, as legacy SCADA platform vendors did not invest sufficient re-engineering effort to establish interoperability across data interfaces despite the emergence of the IEC CIM standards. This situation has limited the ability to integrate platforms and applications from different vendors, as well as to extend control room platforms with modern technology stacks developed by the open-source community.
Event streaming platforms:
This recent transformation is progressively leading to the migration of traditional control room platforms onto event streaming platforms, which use events as the core integration principle and orchestrate most energy management and control business processes around real-time data streams. This new architecture is designed around event data flows, with processing applied to data while it is in motion. It has the following key benefits:
- Event-based data flows are a foundation for both (near) real-time and batch processing, as required in most flexibility management processes. In previous SOA architectures, applications were built on data stores (data at rest), which made it impossible to build flexible, agile services that act on data close to real time.
- Scalable architectures for all events, shared across any number of source and sink processes. As opposed to centralised monolithic applications, the architecture is built by design on scalable, distributed infrastructure for zero downtime, tolerating the failure of nodes and networks while allowing upgrades to be rolled out online. Different versions of the infrastructure (such as Kafka) and of the applications (business services) can be deployed and managed in an agile, dynamic way. This approach minimises dependencies across applications, which were particularly complex to manage in historical SOA architectures.
- Openness to any kind of application, system component, or microservice. The technology stack does not matter: the streaming environment connects anything, including programming languages, APIs such as REST or MQTT, open standards, proprietary tools, and legacy applications. This reduces the need to redesign existing legacy applications while benefitting from highly scalable, containerized cloud environments that enable rapid prototyping of new applications. It also helps minimize processing-speed constraints for real-time control applications.
- Distributed storage for decoupling applications. The data streaming environment of the platform can store the state of a microservice itself, removing the need for a separate database.
- Stateless services and stateful business processes. Business processes are typically stateful: they often need to be implemented with events and state changes, for which the remote procedure calls and request-response patterns of SOA architectures are not optimal.
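The last two benefits can be illustrated with a minimal sketch in plain Python. It is an in-memory analogy only (a real deployment would use a Kafka topic and a client library): the `EventLog` class, the `apply_meter_event` function, and the event fields `meter` and `kwh` are all hypothetical names chosen for illustration. The point is that the service logic itself stays stateless, while the full state of the business process can be rebuilt at any time by replaying the event stream, without a separate database.

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for a Kafka topic: an append-only
# log of events that any number of consumers can read independently.
class EventLog:
    def __init__(self):
        self._events = []

    def append(self, event):
        self._events.append(event)

    def read_from(self, offset):
        # Each consumer tracks its own offset, keeping services decoupled.
        return self._events[offset:]

# A stateless service function: state lives in the log, not the service.
def apply_meter_event(state, event):
    # Hypothetical event shape: {"meter": id, "kwh": delta}
    state[event["meter"]] += event["kwh"]
    return state

log = EventLog()
log.append({"meter": "A", "kwh": 5.0})
log.append({"meter": "B", "kwh": 2.5})
log.append({"meter": "A", "kwh": 1.5})

# Any service instance can materialize its state by replaying the log
# from offset 0, instead of querying a shared database.
state = defaultdict(float)
for ev in log.read_from(0):
    state = apply_meter_event(state, ev)

print(dict(state))  # {'A': 6.5, 'B': 2.5}
```

Because the log is the source of truth, a replacement service instance (or a new service added later) can derive the same state from the same events, which is the decoupling property the bullet points describe.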
The solid Open Source approach deployed around Apache Kafka makes it the preferred choice…
