Author: adk

  • The Importance of a Unified Namespace

    The Unified Namespace concept differs significantly from traditional industrial data architectures, which are often siloed. Companies today are grappling with how to effectively leverage the massive amounts of data generated across their factories, industries, and assets. For example, one VP at Rockwell Automation mentioned that their company produces the same amount of data in a year as what was used to train the GPT-3.5 language model.

    What is the challenge?

    The challenge is that this data is often locked away in proprietary ecosystems, making it inaccessible and unusable. Research by MIT and Microsoft found that over 50% of executives cite data inaccessibility, data unusability, and data governance issues as major problems.

    The Unified Namespace concept addresses these challenges by creating a data architecture that democratizes access to data across the organization – from the shop floor to the executive suite. The idea is to have all operational and contextual data available in real-time to all relevant stakeholders.

    Implementing this requires bridging the IT-OT divide, as industrial environments often run a mix of legacy and modern systems speaking over 150 different communication protocols. A Unified Namespace uses technologies like Z-Mesh to create a common data transport layer, allowing data to flow from device to edge to cloud.

    This shift enables several key transformations for enterprises:

    1. Data democratization – Making data accessible and usable for all stakeholders, not just machines.
    2. Moving away from point-to-point integrations towards a flexible data architecture with no central point of failure.
    3. Providing real-time insights and enabling new use cases like AI-powered decision support.
    4. Bridging the IT-OT gap and unifying data flows across the organization.

    Ultimately, a Unified Namespace is helping enterprises rethink their approach to industrial data management, data ops, and digital transformation by providing a scalable way to unlock the value of their data assets.

    Why is a Unified Namespace important?

    A Unified Namespace is important for achieving the Network Effect, as described by Metcalfe’s law, because it allows for seamless interaction and communication between different users or nodes within a network.

    In a network with a unified namespace, all users or nodes share a common naming convention or addressing system, making it easier for them to find, connect, and interact with content or with each other. This, in turn, increases the value of the network to each individual user, as they can easily communicate and exchange information with a larger number of people.

    Metcalfe’s law states that the value of a network is proportional to the square of the number of connected users (n²). A unified namespace helps to achieve this by reducing the barriers to entry and making it easier for new users to join and participate in the network.

    Without a unified namespace, users may face difficulties in finding and connecting with others, which can limit the growth and value of the network. For example, if users have different naming conventions or addressing systems, it may be harder for them to find and communicate with each other, reducing the overall value of the network.

    In summary, a unified namespace is essential for achieving the Network Effect because it enables seamless interaction and communication between users, making it easier for them to connect and exchange information, and thereby increasing the value of the network.
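
    The arithmetic behind this is worth making explicit. As a small Python illustration (the value constant k is arbitrary), merging two siloed networks into one namespace creates the 2·n·m cross-connections the silos never had:

        def metcalfe_value(n, k=1.0):
            # Metcalfe's law: network value grows with the square of the
            # number of connected users/nodes.
            return k * n ** 2

        # Two siloed networks of 100 nodes each...
        siloed = metcalfe_value(100) + metcalfe_value(100)   # 20,000
        # ...versus one unified namespace joining all 200 nodes.
        unified = metcalfe_value(200)                        # 40,000
        # The difference is exactly the 2*n*m cross-connections.
        print(unified - siloed)                              # 20000.0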

    Example: The Telephone system

    The concept of a Unified Namespace can be compared to the telephone system, where every phone has a unique phone number that allows it to be easily reached by other phones.

    In the early days of telephony, there were multiple, separate telephone systems, each with its own naming convention and switching equipment. This made it difficult for people on different systems to call each other, limiting the value of the network.

    The introduction of a unified namespace, in the form of a standardized phone number system (e.g., the North American Numbering Plan), allowed different telephone systems to interconnect and enabled seamless communication between users across different networks.

    With a unified namespace, anyone can dial a unique phone number to reach another person, regardless of their location or the specific telephone system they are using. This has enabled the telephone network to grow and become incredibly valuable, as people can easily communicate with each other across networks and geographies.

    Similarly, in other networks, such as the internet, a unified namespace (e.g., IP addresses, domain names) allows different devices and users to communicate with each other, enabling the network to grow and become more valuable.

    In both cases, the unified namespace provides a common language and addressing system, making it easier for users to find and connect with each other, and thereby increasing the value of the network, as described by Metcalfe’s law (n²).

  • US NIST: Interoperability is the problem

    A report prepared for the U.S. National Institute of Standards and Technology (NIST) reaches conclusions similar to what the Eclipse IoT Foundation finds in its developer survey.

    Foundational gaps that hinder the scaling of IoT:

    • Interoperability: The lack of interoperability hinders different devices and systems from integrating, communicating and sharing information with each other. Barriers to achieving interoperability include the limited focus of standards initiatives, resistance to open and industry consensus standards, regional standards, and standards implementation errors and deviations.
    • Cybersecurity
    • Privacy
    • Connectivity: Connectivity challenges are multi-dimensional in nature and challenging to solve due to various factors, including the need for substantial infrastructure investment, the lack of a “one size fits all” approach along with issues in market economics, funding and incentives, “last acre” coverage and spectrum.
    • Data management challenges: As IoT scales, so does data management complexity. The IoT data collected comes in a variety of types, formats and sizes. Some data are time-sensitive and must be processed immediately while others are stored for future actions. Data may be required to comply with industrial, state and national regulations. Robust data management is foundational for artificial intelligence systems. Data management challenges are complicated by exponential growth in data volume and velocity, privacy considerations, cybersecurity factors, interoperability concerns and regulatory compliance requirements.
    • IoT data ecosystem: The future IoT data ecosystem is envisioned to be a highly interconnected network where data generated by IoT devices and systems is seamlessly shared, monetized and utilized across various sectors. There are, however, a number of technical challenges to building such an ecosystem, including data quality, interoperability, privacy and security, data sovereignty, scaling and standards for data management.
    • Communications and network infrastructure: Industry analysts estimate that there will be 55.9 billion IoT devices generating 79.4 zettabytes (ZB) of data by 2025. Current communication networks and architectures are not designed to manage the needs of IoT at this scale. New processes and technologies for configuring, managing, operating and maintaining the hyperconnected network will be necessary, and infrastructure innovation is needed to support real-time autonomy and complex IoT applications, be fault tolerant and resilient and defend and heal against threats.

    Good quotes:

    P. 28: There are millions of legacy and OT systems in use today, from manufacturing machinery to Programmable Logic Controllers (PLCs) and SCADA (Supervisory Control and Data Acquisition) systems. While some of these legacy and OT systems may offer data collection and control capabilities, they were not designed to connect to and communicate across the Internet.
    It is neither feasible nor practical to replace all these legacy and OT systems with new connected “smart systems”. Some of the legacy systems must be retrofitted with IoT technologies to enable them to connect, communicate and be interoperable with existing systems and modern smart systems. An ecosystem of solutions providers who build “bridging” solutions is required.

    p. 29 (Edge computing / Problem with device-to-cloud): In a traditional IoT architecture, data is routed from the device to a remote cloud data center for processing and storage. Not all data collected needs to be or should be sent to the cloud for processing. In mission critical applications or in those where connectivity is intermittent or unreliable, processing is performed at the gateway or by the device itself. A 2022 survey of 910 IoT developers, conducted by the Eclipse IoT Foundation, found that the top computing workloads performed at the edge were artificial intelligence (38% of respondents), control logic (34%), data exchange across multiple nodes (22%) and data analytics (20%).

    P. 30: While this approach leads to lower development and ownership costs, continuous innovation, improved code functionality, performance and resilience and mitigation of vendor lock-in concerns associated with proprietary software, there are implications for long-term maintenance and cybersecurity updates.

    P. 35: Interoperability

    • In transportation and logistics, the lack of universal standards for freight systems hampers data exchange, causing supply chain delays and increased costs.
    • Smart cities, filled with IoT devices and systems owned by different entities, struggle with interoperability, locking cities into specific vendor solutions and preventing efficient data exchange between systems.
    • Healthcare also suffers from fragmented, incompatible systems that make it difficult for medical devices to communicate, slowing down care and reducing operational efficiency.
    • In renewable energy, interoperability is crucial for grid reliability, but inconsistent standards and policies hinder the integration of distributed energy resources (DERs) and energy management systems into the grid.

    Ultimately, across these sectors, the lack of universally adopted standards and reliance on proprietary systems prevents seamless communication and data sharing, leading to inefficiencies, higher costs and reduced innovation. Without a concerted effort to adopt open standards and improve interoperability, these industries will continue to face challenges in maximizing the potential of their technologies.

    As IoT scales, so does data management complexity.

    Robust data management capabilities simplify these challenges and help unlock the value of IoT by enabling massive amounts of data to be collected, processed, stored, discovered, queried and analyzed.

    P. 37: However, data management faces a variety of challenges:

    • In the construction industry, the integration of IoT with Building Information Modeling (BIM) systems is challenged by siloed and fragmented data from various contractors, their reluctance to share data and a general lack of trust in a fragmented value chain.
    • In transportation and logistics, managing the vast data generated by IoT systems in the global supply chain is essential for smooth operations. With data flowing through multiple touchpoints such as manufacturing, transportation and warehousing, the challenge for supply chain and technology managers lies in handling decentralized and diverse data formats, while adhering to various regulatory standards. The exponentially growing volume of data requires businesses to have a scalable infrastructure capable of processing and storing information effectively.

    P. 38 Industrial: In manufacturing, factories are faced with a challenging wireless environment, where machinery, metallic surfaces and concrete structures interfere with signal propagation and create latency. Many manufacturing operations require real-time data monitoring, making on-device or local gateway processing essential. Additionally, many factories lack the necessary network infrastructure to support connected operations, further hindering their deployment of edge computing solutions.

    The high cost of implementing IoT presents a major barrier to their adoption across industries like manufacturing and retail. A large manufacturer may operate multiple factories containing thousands to hundreds of thousands of machines and production equipment of varying types. The cost to fit all production equipment in these factories with various devices can be cost prohibitive. The total IoT investment required is more than just devices, as hardware is usually around 30% of the initial implementation cost. Additionally, the total cost of implementing IoT includes not just the devices, but also infrastructure upgrades, professional services and recurring costs for cloud services. Small and medium-sized manufacturers face even greater challenges due to limited capital and outdated infrastructure.

    P 39: For smart cities, edge computing is crucial for managing the growing number of IoT applications that demand real-time data analysis and processing at the device, gateway, or local server level. However, technical advances are needed in areas like scalable architectures, energy efficiency, interoperability and cybersecurity to fully realize the potential of smart city technologies. Without these improvements, cities will struggle to support the vast amount of data generated by IoT devices, limiting the effectiveness and expansion of smart urban environments.

    P. 77: Interoperability: 6.1.1

    P 91: Connectivity – spectrum 6.1.4.3.

    6.2.1. Intelligence: Data management

    The problem with Interoperability has been solved by the royalty-free layer-3 protocol Z-Mesh: https://z-mesh.org/

    For IoT to succeed, it must be:

    1. Scalable (asynchronous, to support offline devices): have a look at why Apache Kafka is a success.
    2. Interoperable (Network Effect): any IoT event (e.g. a temperature reading) must be accessible to any other client.
    3. Support for both constrained and TCP/IP devices.
    4. Royalty-free: not patented by anyone. It must be an equal market in which any vendor can contribute sensors, apps or networking equipment; look at the success of the Internet.
    5. Direct device-to-device communication: to support edge and low-latency AI applications, sensor-to-cloud technologies are not good enough.

    Download the report here: https://strategyofthings.io/nist

  • IoT architecture is 10 years behind

    The software world is 10 years ahead of the IoT world when it comes to architecture.

    In the ’80s and ’90s most companies had just one IT system, and software was developed in-house. This changed in the 2000s, when companies realized they could lower their IT costs by purchasing standard software for ERP, billing, planning, etc. To move data between these standard software solutions, integrations were built between them. Of course these integrations were vendor-specific, and communication was synchronous: if the remote system was down, the procedure call (like an HTTP request) would fail, and if you wanted to replace the remote system, a new integration was needed (vendor lock-in). In software terms, this is called tight coupling.

    Service Oriented Architecture and MicroServices

    To solve the vendor-dependency problem, the software world invented the concept of loosely coupled systems and microservices: a Service Oriented Architecture in which the interface and responsibility of each microservice (system) is well defined and hides the implementation details. For example, to integrate your ERP system with your task-management system, you design a generic interface between them, so that you can easily replace the task-management system with one from another vendor. Another benefit is that all the other systems that have integrations to the old task-management system don't have to create new integrations to the new one. In other words, clearly defined interfaces: one new system, one new integration.
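
    A minimal Python sketch of the idea (the interface and vendor classes are invented for illustration): the ERP integration is written once against a generic interface, so swapping vendors touches one adapter rather than every integration:

        from abc import ABC, abstractmethod

        class TaskService(ABC):
            # Generic, vendor-neutral interface: callers never see the
            # implementation details of the vendor behind it.
            @abstractmethod
            def create_task(self, title: str, assignee: str) -> str: ...

        class VendorATasks(TaskService):
            def create_task(self, title, assignee):
                # Vendor-specific API calls stay hidden in this adapter.
                return f"vendorA-id:{title}"

        class VendorBTasks(TaskService):
            def create_task(self, title, assignee):
                return f"vendorB-id:{title}"

        def erp_order_received(tasks: TaskService):
            # The ERP side is written once, against the interface only.
            return tasks.create_task("Pick and pack order", "warehouse")

        # Swapping vendors is one line; no other integration changes.
        print(erp_order_received(VendorATasks()))
        print(erp_order_received(VendorBTasks()))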

    While this solved the vendor-dependency (lock-in) problem and simplified the integration landscape, communication was still synchronous: if a system was down, a retry policy and/or error handling needed to be put in place. Also, updates to a system needed to be coordinated with every dependent system in order to avoid downtime.

    Message Queues — asynchronous communication

    So the software world introduced queues. Now, every time a system sends a request/message to another system, it puts it into a queue. This decouples the systems completely and allows for asynchronous communication: the sending and receiving systems do not need to be online at the same time. Suddenly, a system could be taken down for maintenance at any time without interrupting service or losing data. When the system came back online, it would continue emptying the queue from where it left off.
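
    A toy Python sketch of this decoupling (an in-memory queue standing in for a durable broker): the producer keeps publishing while the consumer is down, and nothing is lost:

        from collections import deque

        queue = deque()          # stands in for a durable message queue

        def produce(msg):
            queue.append(msg)    # the sender never waits for the receiver

        # Messages arrive while the consumer is offline for maintenance.
        for i in range(3):
            produce(f"reading-{i}")

        # The consumer comes back online and drains the queue from where
        # it left off; no data was lost and no sender was blocked.
        while queue:
            print("processed", queue.popleft())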

    IoT-architecture today

    Most IoT solutions today are vertical: They have their own proprietary:

    • Sensors
    • Connectivity solution
    • Analytics system or application
    • API

    They are integrated directly into their customers' systems; that is, their APIs are invoked directly. In software terms, this is called tight coupling.

    You can probably see where this is going…

    The IoT architecture of tomorrow

    Building owners are realizing that vertical IoT solutions do not scale and have too many drawbacks, and surveys show the same. Think about it: it does not make sense to have multiple motion sensors in each room (alarm system, lighting, room-usage%, adaptive cleaning), or multiple temperature/CO2 sensors. On top of that, building owners must run multiple connectivity solutions, one for each vertical IoT solution.

    Obviously, having ONE SINGLE motion sensor (or temperature/CO2 sensor etc.) and using its output for all data-consuming solutions (alarm system, lighting, room-usage%, adaptive cleaning) has a long list of benefits:

    • One-time installation of sensors
    • Simpler integrations (only the interface/integration to data consumers changes)
    • Saves batteries (no duplicate motion/temp/CO2 sensors)
    • No need to replace the entire sensor system if a vendor goes bust
    • Sensors are commodities; replace with any vendor's
    • Covering your building with wireless IoT connectivity only needs to be done once
    • You own and control all the data generated by the sensors
    • Enables innovation: it is extremely simple to build a find-me-a-meeting-room-now app (the motion sensors are already there)
    • No vendor lock-in; just tell your adaptive-cleaning vendor to use your existing sensors
    • Environmentally friendly: re-use or re-purpose old sensors (today you can't take a proprietary motion sensor from one vendor and use it in another vendor's system)

    What about queuing — asynchronous communication?

    Scholars (and, for you history nerds, Xerox PARC) agree that the future of IoT networking belongs to Content-Centric Networking (CCN). In a CCN network, endpoints communicate via named data instead of IP addresses, and the object of encryption is the data, not the connection. Routers in a CCN network (called Content Stores) cache the data (message queuing) so that data consumers can retrieve it at any time. This enables wireless offline devices to retrieve commands/data when they wake up; this is true asynchronous communication. CCN networks, like IP networks, have no single point of failure.
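
    A rough Python sketch of the caching idea (the data structures and names are invented for illustration, not the CCN wire format): a Content Store answers requests by name, so a consumer can fetch data even if the producer is asleep by the time it asks:

        content_store = {}   # router cache: content name -> data

        def publish(name, data):
            # Data is published under a name; the router keeps a copy.
            content_store[name] = data

        def request(name):
            # Consumers ask for names, not IP addresses; the cache can
            # answer even when the producer is unreachable.
            return content_store.get(name)

        publish("/building1/room3/motion/42", b"occupied")
        # ...the battery-driven producer goes back to sleep...
        print(request("/building1/room3/motion/42"))   # b'occupied'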

    Z-Mesh, an Open Source, CCN-inspired IoT networking protocol, solves the problems discussed in this article today. Z-Mesh combines the naming scheme and publish/subscribe feature of MQTT with the CCN features described above, and is thus familiar to use. Anyone can build devices, routers and device management systems. If you want to help build on this vision and ecosystem, please contact me.

    Aer Networks is a company that delivers Z-Mesh IoT sensors, wireless forwarders and device management software and is sponsoring the Z-Mesh initiative. Please contact me if you are interested.

  • Interoperability: Multicasting on heterogeneous networks

    Scaling is the greatest problem with IoT, and lack of interoperability is the cause.

    Before describing how to achieve interoperability on heterogeneous networks, let's start by defining a few requirements that an IoT network must support:

    • Multicasting: Events must be sent One-to-Many in a pub/sub fashion
    • Heterogeneous networks: Must support TCP/IP, Wireless, Cellular
    • Content naming: Consumers must be able to access any content (Network effect)

    Multicasting and Multihoming

    The Z-Mesh architecture seamlessly handles the dissemination of content to a single consumer or to a group of consumers. Z-Mesh manages all the interfaces over which data are sent or received with components named Faces. Each Face can be connected to a higher-layer entity, such as an application, a physical network interface or even a virtual link, such as a TCP/IP connection. As a result, Z-Mesh supports multicast communications out of the box.

    Furthermore, Z-Mesh overcomes the well-known problems that IP architectures present with multihoming. Since Interest routing (requests for data) is based on the Content Name, Z-Mesh works out of the box in heterogeneous network environments with multiple channel technologies (wireless, wired, TCP/IP, cellular) and is capable of aggregating Interests received from different Faces, so that only one Interest per content is sent over a shared communication link. Nodes with multiple network interfaces, called Content Stores, are also capable of caching: when a Data packet is sent back to the consumer, Z-Mesh leaves a copy of the message in the Content Stores of all nodes along the reverse Interest path. Popular content is thus automatically made available close to its consumers.
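
    A simplified Python sketch of the aggregation logic (a pending-interest table; names are invented for illustration): duplicate requests for the same Content Name are merged so only one Interest crosses the shared link, and the returning Data packet fans out to every requesting Face:

        pending = {}   # content name -> set of Faces waiting for it

        def on_interest(name, face, send_upstream):
            first = name not in pending
            pending.setdefault(name, set()).add(face)
            if first:
                send_upstream(name)   # only one Interest per content name

        def on_data(name, data, send_to_face):
            # Fan the Data packet out to every Face that asked for it.
            for face in pending.pop(name, ()):
                send_to_face(face, data)

        # Two consumers request the same reading over a shared link:
        on_interest("/plant/line2/temp/7", "faceA", lambda n: print("upstream:", n))
        on_interest("/plant/line2/temp/7", "faceB", lambda n: print("upstream:", n))
        # "upstream:" is printed once; the reply reaches both Faces.
        on_data("/plant/line2/temp/7", b"21.5C", lambda f, d: print(f, d))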

    Heterogeneous networks

    IoT encompasses a wide range of devices, from small sensors to complex industrial equipment. Supporting heterogeneous networks allows IoT systems to connect and integrate these diverse devices, ensuring seamless communication and data exchange.

    IoT deployments often need to scale up or down, and adapt to changing requirements over time. Heterogeneous network support provides the flexibility to incorporate new technologies, protocols, and devices as they emerge, without having to completely overhaul the existing infrastructure.

    IoT systems often need to integrate with various legacy systems, enterprise applications, and cloud platforms. Heterogeneous network support enables seamless interoperability, allowing IoT devices and systems to communicate and exchange data across different platforms and technologies.

    By supporting multiple network technologies, IoT systems can leverage redundant communication paths and failover mechanisms, improving the overall reliability and resilience of the system. Heterogeneous network support allows IoT systems to select the most appropriate network technology for each use case, optimizing performance, energy efficiency, and cost-effectiveness.

    Content Naming

    In order to achieve interoperability and scale (the Network Effect), each piece of content must have its own unique Content Name.

    So how do we achieve interoperability?

    Metcalfe’s law, applied to IoT, says that the value of an IoT network (solution) is proportional to the square of the number of addressable Content Names. That is: if every Content Name can be retrieved by any Consumer, you have maximum value.

    Just to be clear: Consumers must be able to address (get) ANY piece of Content. This is called a Unified Namespace, and it is why technologies like MQTT have enjoyed so much success.
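
    MQTT's hierarchical topic names are a concrete example of such a namespace: every piece of content lives at a well-known name, and consumers can address any of it, including via wildcards. A minimal Python matcher for the two standard MQTT wildcards illustrates the idea:

        def topic_matches(pattern, topic):
            # MQTT wildcards: '+' matches one level, '#' matches the rest.
            p, t = pattern.split("/"), topic.split("/")
            for i, part in enumerate(p):
                if part == "#":
                    return True
                if i >= len(t) or (part != "+" and part != t[i]):
                    return False
            return len(p) == len(t)

        print(topic_matches("site1/+/temperature", "site1/room7/temperature"))  # True
        print(topic_matches("site1/#", "site1/room7/co2"))                      # True
        print(topic_matches("site1/+/temperature", "site2/room7/temperature"))  # False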

    Having a Unified Namespace allows Content Producers and Consumers to be properly routed and connected within the network, enabling communication and data exchange between them. Without a Unified Namespace, the network would not be able to effectively route and deliver Content, limiting the overall connectivity and value of the network.

    As the network grows, the number of connected devices and systems increases rapidly. Robust addressing (naming) schemes are necessary to accommodate this growth and ensure that the network can scale effectively, maintaining the network effect as the number of connected entities expands.

    A Unified Namespace enables interoperability between different devices and systems within the network. This interoperability is crucial for realizing the full potential of the network effect, as it allows diverse entities to seamlessly communicate and collaborate.

    A Unified Namespace facilitates the management and control of the network, allowing administrators to monitor, configure, and troubleshoot individual devices or systems. Effective network management is essential for maintaining the stability and reliability of the network, which is a key factor in achieving the network effect.

    Conclusion: Use Z-Mesh

    The Z-Mesh IoT protocol, being an Information-Centric Networking architecture, allows for interoperability and scale because all content is accessible through the Unified Namespace, while also supporting heterogeneous devices on different types of networks.

  • The IoT-enabled Economy

    Quotes from the 2024 paper: Internet of Things (IoT) Advisory Board (IoTAB) report

    The Internet facilitated the development of digital platform business models. A platform-based business model “creates value by facilitating exchanges between two or more interdependent groups, usually consumers and producers. To make these exchanges happen, platform-based solutions harness and create large, scalable networks of users and resources that can be accessed on demand. Platforms create communities and markets with network effects that allow users to interact and transact.”

    Platform-based IoT business ecosystems are comprised of complementary partners, resources, standards, and tools. These have long been advocated by business scholars for their proven ability to fuel economic value by leveraging scalable digital platforms as the foundation for dynamic and interconnected business networks. By fostering symbiotic relationships and co-opetition among participants, platform-based IoT business ecosystems drive innovation, monetization, agility, and scalability through open architecture, governance, and network effects, as proven by trillion-dollar platform giants.

    Orchestrated business partnerships

    Partnerships are critical to the development of the IoT-enabled economy. End-to-end IoT solutions across industry ecosystems are inherently complex, and involve multiple companies, technologies, and standards. By forging IoT business partnerships with complementary stakeholders, organizations can leverage each other’s strengths to develop integrated solutions and accelerate the creation of data ecosystems.

    Orchestrated platform-based business ecosystems bridge industries

    While existing large-scale platforms have excelled in various domains, there remains a noticeable void in multi-stakeholder collaboration platforms across industry ecosystems. One of the main strategies for achieving hyperconnected growth is to encourage platform-based business ecosystems that link IoT value chains.

    Findings:

    • IoT systems depend on chips sourced through vulnerable global supply chains.
    • Establishing trust in IoT requires a multi-dimensional ecosystem perspective, extending beyond cybersecurity and privacy.
    • Privacy concerns undermine trust in IoT and are a significant barrier to widescale adoption.
    • IoT cybersecurity concerns are a major barrier to widescale adoption.
    • IoT modules built by Chinese companies dominate our market, which poses a serious security and economic risk.
    • Interoperability is a key challenge for IoT across multiple industries.
    • A variety of connectivity challenges are hindering IoT adoption, operation, and scaling.
    • The IoT-enabled economy is unlocked and accelerated with platform-based business ecosystems, which require multi-stakeholder collaborative partnerships to be successful.
    • Equity in access, opportunities, benefits, and outcomes is necessary for the sustainable integration of IoT into all aspects of the national economy and civil society.

    Interoperability

    A significant barrier is the inability of devices to communicate with each other or with the broader enterprise, legacy systems, and operations technology systems. In some cases, the lack of interoperability is caused by a lack of standards and protocols. In other cases, there are multiple competing standards as each solution provider creates “walled ecosystems”.

    Complexity and Integration

    IoT consists of sets of disparate technologies offered by a fragmented ecosystem of hardware suppliers, software platforms, and connectivity service providers. It is not a “one size fits all” solution, and components must be assembled to create a solution that meets the specific requirements. In addition, IoT implementations often require integration with existing systems and infrastructure. Integrating IoT devices and platforms with legacy systems is a significant barrier, costly, and requires technical skills that are in short supply.

    These issues create “siloed” data trapped within a specific device or vendor’s ecosystem. As a result, integrating systems to enable communication and data exchange is complex and costly, requiring additional middleware and custom integration.

    This inability to integrate IoT with existing legacy and modern systems hinders innovation and the full benefits of interconnected, automated systems.

    Cost savings

    Buildings: The lack of interoperability in IoT systems prevents significant cost savings and revenue opportunities. For example, in [USA] healthcare, it could result in $35 billion in missed annual savings in the U.S. In renewable energy, achieving interoperability could save up to $10 billion by reducing transaction costs and increasing efficiency. Without it, there may be $59 billion in lost opportunities from innovative energy applications not being deployed in buildings.

    Transportation: In transportation and logistics [USA], improved interoperability and real-time data sharing could reduce global freight emissions by 22%.

    Vendor lock-in

    The lack of interoperability in IoT creates vendor lock-in and switching barriers, resulting in a fragmented market of “walled garden” solutions. These solutions only work with a limited set of compatible equipment, reducing choices and forcing buyers to stick with specific vendors. IoT technologies based on proprietary standards do not work with other systems, compelling buyers to continue using the same vendor and its partners, often leading to higher costs, fewer innovative features, and limited capabilities. Migrating from these systems to other lower cost or more innovative alternatives is difficult and may require significant switching costs.

    Learn from the history of the Internet

    The Internet connected people with people, businesses with businesses, and people with businesses. In doing so, the Internet facilitated the development of digital platforms and business models and services enabled by connectivity.

    A platform-based business model “creates value by facilitating exchanges between two or more interdependent groups, usually consumers, partners, and producers. To accelerate adoption, platform-based solutions harness and create large, scalable networks of users and resources that can be accessed on demand. Platforms create communities and markets with network effects that allow users to interact and transact.”

    IoT as foundation for the digital economy

    To advance the IoT digital economy, it is crucial to build a foundation of connectivity and IoT platforms that promote interoperability, digital transformation, and collaboration across business ecosystems. History shows that platform-based economies accelerate the evolution of such ecosystems.

    Multi-stakeholder IoT partnerships enabled by platforms are key to accelerating widespread adoption and contributing trillions to our GDP.

    Source: https://cloud.aernetworks.com/s/PEAmrSHqGi8oFYG

  • Internet of Things for Smart Cities: Interoperability and Open Data

    TLDR:

    1. Interoperability is key
    2. TCP/IP is not an option
    3. Municipalities MUST invest in truly open systems (no part can be patented)
    4. Constrained low-power battery-driven devices are hard to support
    5. Open systems allow third parties to innovate

    Link to paper: https://cloud.aernetworks.com/s/4STReMBdYET96rr

    Quote:

    Interoperability and Open Standard Development
    With the popularity of IoT devices, many IoT protocols and standards have been developed. In contrast to ordinary computers, IoT devices are normally constrained when it comes to memory space and processing capacity. In addition, IoT devices might be deployed where there’s limited or no access to continuous power supply, which means that they need to operate under power supplied from batteries or small solar panels. As a consequence, power-efficient communication protocols with small memory footprints and limited demands on processing have been developed to support IoT devices. Traditional TCP/IP protocols haven’t been designed with these requirements in mind.

    Standard protocols are important to guarantee interoperability of different IoT devices.
    However, using open standards doesn’t automatically result in open systems. In our context, an open system means an integrated open IoT infrastructure solution for smart cities, providing access to open data and APIs for cloud services. In many cities, that infrastructure will be paid for, at least in part, by the city authorities using public funding. To motivate this investment, and get the most benefit for society, we argue that any smart city IoT infrastructure needs to be a truly open system, where equipment from many vendors can be used, and where the generated data can be more or less freely used by anyone to develop new services, based on low-level as well as processed sensor and IoT data. This kind of system will maximize innovation in the IoT domain, much as the Internet has done for information and communication services. Many current IoT systems — for example, for air quality monitoring or the smart home — are either incomplete systems with limited functionalities (that is, in terms of sensing, storage, and analytics), or are closed, proprietary systems dedicated for a particular task. The latter are vertically integrated systems, sometimes called stove pipes or vertical silos, which can’t be combined or extended easily with third-party components or services. The result is that once invested in a particular system, you’re locked into that vendor’s system. Vertically integrated systems are particularly problematic for the public sector, because this prevents fair competition in public procurement and is less suitable for large-scale data sharing.
    Patrik Fältström (7) argues similarly that market forces work against open interoperability, specially in the IoT domain where, for example, a smart lighting system from one vendor only works with light bulbs from the same vendor. Systems are designed as end-to-cloud-to-end, where the cloud part is vendor-controlled with limited possibilities for third parties, and where the IoT devices often speak proprietary protocols to the cloud. Fältström argues that this lack of interoperability severely limits the market growth (for example, with smart light bulbs). Also, the dependence on a cloud service might render the device non-functional, should that cloud service for any reason, temporarily or permanently, disappear.
    Instead of these stove pipes, we need horizontally designed systems with well-defined interfaces and data formats that can unleash the potential of open data, and that enable third parties to independently develop new applications and services, possibly combining several data sources. Providing open data has huge potential for innovation in digital applications and services, resulting in very large economic values. These interfaces (APIs) through which the IoT data can be accessed at multiple levels of refinement — from raw data directly from sensors, to highly processed data — also need standardization. The challenge is to provide an open system that lets users access the open data and cloud services without being locked by a particular platform. The open system should also allow third-parties to innovate based on the open data and open APIs.

  • Kafka vs Z-Mesh

    Z-Mesh and Apache Kafka are both distributed event streaming platforms. Both can be used for real-time data processing, data integration and message queuing. But what are the differences?

    TLDR: Z-Mesh uses a request-response pattern and supports publish/subscribe. Kafka uses one-way message queues that support event streaming.

    Z-Mesh

    Z-Mesh is a distributed network of interconnected Forwarders, devices and systems.

    Message pattern: request-response

    Kafka

    Kafka is a message queue / broker.

    Message pattern: One-way

    Consumers are systems like space-, cleaning- or energy-optimization systems, facility-management systems, analytics systems or manufacturing-intelligence systems.

    Z-Mesh request-response pattern

    In Z-Mesh, the network is responsible for returning the requested content (event). There are two scenarios, depending on whether the network has stored (cached) the content or not (see the sketch after the list):

    • If stored, the network returns the content
    • If not stored, the network forwards the request to the producing node, which then responds with the content
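
    A toy Python sketch of these two paths (function and variable names are invented for illustration, not the actual Z-Mesh API): the Forwarder answers from its content store when it can, and otherwise pulls the content from the producer, caching a copy on the way back:

        cache = {}   # the Forwarder's content store

        def handle_request(name, seq, fetch_from_producer):
            key = (name, seq)
            if key in cache:                        # scenario 1: stored -> answer locally
                return cache[key]
            data = fetch_from_producer(name, seq)   # scenario 2: forward to the producer
            cache[key] = data                       # keep a copy for later consumers
            return data

        producer = lambda name, seq: f"value-{seq}".encode()
        print(handle_request("/plant/temp", 42, producer))  # fetched from the producer
        print(handle_request("/plant/temp", 42, producer))  # served from the cache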

    Z-Mesh data-frame

    Content (events) in a Z-Mesh network are pieces of data labeled with a Name and a sequence number. IoT devices and systems (nodes) can request content by its name and sequence number. Content produced by IoT devices and systems is stored in the Forwarding nodes (Forwarders) for a set period of time (e.g. 24 hours) and can be requested later. This enables battery-driven IoT devices to wake up from sleep and request or publish content.

    Event Streaming or Publish / Subscribe

    Event streaming or publish/subscribe is initiated by the requesting device, which sends a request to the network with a wildcard sequence number and a long timeout. When content is published, it is instantly forwarded to all requesting nodes.

    Direct data-exchange

    Content between two nodes can be exchanged directly; two wireless Sub-GHz nodes can exchange content without a Forwarder or network being present.

    Kafka one-way message queuing

    In Kafka, producers (devices or systems) send data/events to a queue (Kafka), which consumers then receive or retrieve at a later point in time. Content flows one way, and since the consumer's request is served by Kafka, it never reaches the producer.
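
    For concreteness, a minimal sketch with the kafka-python client (the broker address and topic name are assumptions): the producer fires events one way into a topic, and a consumer reads them later without the producer ever seeing the request:

        from kafka import KafkaProducer, KafkaConsumer

        # Producer side: fire-and-forget into a topic (a one-way queue).
        producer = KafkaProducer(bootstrap_servers="localhost:9092")
        producer.send("iot-temperature", b'{"room": 7, "celsius": 21.5}')
        producer.flush()

        # Consumer side, possibly much later and in another process:
        consumer = KafkaConsumer(
            "iot-temperature",
            bootstrap_servers="localhost:9092",
            auto_offset_reset="earliest",   # replay the queue from the start
            consumer_timeout_ms=5000,       # stop iterating when caught up
        )
        for record in consumer:
            print(record.value)   # the request is served by Kafka alone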

    Encryption and authorization

    In Z-Mesh, any node (that has access to the network) can request any piece of content at any time, but only the nodes with the encryption key can decode it. The object of encryption is the content (event/data), not the connection as in TCP/IP.

    Kafka is built on top of TCP/IP, so the connection between the sender/receiver and Kafka is encrypted (TLS). This means that data is stored unencrypted in Kafka and that access to data (authorization) is implemented at the connection level.
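
    The difference can be sketched with Python's cryptography package (an illustration of the two models, not the actual Z-Mesh key machinery): with payload encryption only key holders can read the event, no matter which nodes stored or forwarded it, whereas TLS alone protects the hop but leaves the broker holding plaintext:

        from cryptography.fernet import Fernet

        # Payload encryption (the Z-Mesh-style model): the event itself
        # is encrypted, so caches and forwarders only see ciphertext.
        key = Fernet.generate_key()        # shared with authorized consumers
        event = Fernet(key).encrypt(b'{"room": 7, "celsius": 21.5}')

        # 'event' can now be stored or relayed by untrusted nodes...
        print(Fernet(key).decrypt(event))  # ...and read only with the key.

        # With TLS-only protection (the Kafka model), the connection is
        # encrypted but the broker stores plaintext, so authorization has
        # to be enforced at the connection/broker level instead.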

    Application layer

    The Z-Mesh Device Management protocol defines data-formats for many types of events. This means that when consumers are given access to an event-stream (by providing them with the Content Name and Encryption Key) they are also provided with the data-format, so they know how to interpret the received data. The Z-Mesh network itself has no knowledge of the data-format.

    Like MQTT, and like the Z-Mesh transport itself, Kafka is agnostic to the data format. For Kafka and MQTT, however, there is no standard that defines how producers and consumers “speak together”.

    Device Management

    The Z-Mesh Device Management protocol authorizes devices and systems to participate in the network. Each Content Name has its own encryption key and data format, and the Z-Mesh Device Management protocol defines how they are managed.

    Kafka, like MQTT, is a transport layer and does not define Device Management.

    Addressing content

    Any device or system in a Z-Mesh network is able to request (address) any piece of content.

    The value of a telecommunications network is proportional to the square (N²) of the number of connected compatible communicating devices
    — Bob Metcalfe

    Kafka was not designed as a network; it was designed as a one-way event-streaming platform. Addressing content is done by connecting to a message queue (topic). You could argue, though, that it supports addressing any piece of content by using the message queue name (topic) as the address.

    A note on distributed systems

    Distributed systems have no single point of failure. Z-Mesh is a distributed network; MQTT is an example of a centralized network.

    Comparison table

    Feature                            | Z-Mesh      | Kafka         | MQTT        | LoRa
    Pattern                            | Req/Res     | One-way       |             |
    Architecture                       | Distributed | Decentralized | Centralized | Centralized
    Central Device Management          | X           |               |             | X
    Application layer                  | X           |               |             |
    Direct comms                       | X           |               |             |
    Group comm.                        | X           |               |             |
    Centrally manage secure data-flows | X           |               |             | X
    Offline support                    | X           | X             |             |
    Vendor neutral hardware            | X           | X             | X           |
    Software update possible           | X           |               | X           | (?) uses 20% of battery; no vendors support it
    Royalty free                       | X           | X             | X           |

  • Why Event Streaming for IoT?

    Event streaming is particularly well-suited for the Internet of Things (IoT) for several reasons, primarily due to the nature of IoT data and the requirements for processing that data in real-time.

    Here are some key factors that make event streaming an ideal fit for IoT applications:

    1. High Volume of Data:

    • Continuous Data Generation: IoT devices generate vast amounts of data continuously, often in real-time. Event streaming platforms can handle high-throughput data streams, making them capable of processing the large volumes of data produced by numerous IoT devices.

    2. Real-Time Processing:

    • Immediate Insights: Many IoT applications require real-time data processing to derive insights, trigger actions, or make decisions. Event streaming allows for the immediate processing of events as they occur, enabling timely responses to changing conditions (e.g., alerts for anomalies, automated adjustments in smart systems).

    3. Decoupling of Data Producers and Consumers:

    • Loose Coupling: Event streaming architectures decouple data producers (IoT devices) from data consumers (applications, analytics engines). This allows for greater flexibility in how data is processed and consumed, enabling multiple applications to subscribe to the same data stream without direct dependencies (see the sketch after this list).

    4. Scalability:

    • Dynamic Scaling: Event streaming platforms can scale horizontally to accommodate increasing numbers of IoT devices and data streams. This scalability is crucial as IoT deployments grow and evolve over time.

    5. Data Integration:

    • Unified Data Pipeline: Event streaming platforms can serve as a central data pipeline that integrates data from various IoT devices and sources. This integration allows for a holistic view of the data and facilitates analytics, machine learning, and other processing tasks.

    6. Event-Driven Architecture:

    • Reactive Systems: IoT applications often benefit from an event-driven architecture, where systems react to events as they occur. Event streaming supports this paradigm, allowing for the development of reactive applications that respond dynamically to incoming data.

    7. Support for Complex Event Processing:

    • Advanced Analytics: Event streaming platforms enable the detection of patterns, trends, and anomalies in real-time. This is particularly valuable for applications like predictive maintenance, fraud detection, and smart city management.

    8. Durability and Reliability:

    • Data Persistence: Many event streaming systems provide durability and reliability features, ensuring that data is not lost even in the event of failures. This is important for IoT applications where data integrity is critical.
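
    As an illustration of the loose coupling in point 3, here is a minimal sketch with the paho-mqtt client (1.x-style API; the broker address and topic are assumptions): two independent applications subscribe to the same sensor stream without the device knowing either of them exists:

        import time
        import paho.mqtt.client as mqtt   # paho-mqtt 1.x style API

        def on_message(client, userdata, msg):
            # Each subscriber reacts independently to the same event stream.
            print(userdata, "got", msg.topic, msg.payload)

        # Two unrelated consumers of one motion-sensor stream:
        for app in ("lighting", "cleaning"):
            c = mqtt.Client(userdata=app)
            c.on_message = on_message
            c.connect("localhost", 1883)      # assumed local broker
            c.subscribe("sensors/+/motion")   # '+' matches any room
            c.loop_start()

        # The sensor publishes once; it knows nothing about the consumers.
        pub = mqtt.Client()
        pub.connect("localhost", 1883)
        pub.publish("sensors/room7/motion", b"occupied")
        time.sleep(1)   # give the subscribers a moment to receive it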

    Conclusion:

    In summary, event streaming is well-suited for IoT due to its ability to handle high volumes of real-time data, support for scalable and flexible architectures, and capabilities for immediate processing and integration. These features enable organizations to leverage IoT data effectively, driving insights and actions that enhance operational efficiency and decision-making.

  • EU Landscape report on Interoperability

    Quotes:

    The energy system is therefore progressively evolving from a system centrally operated and optimized into a federated system … As a consequence the associated digital infrastructures need to accelerate their transformation from few central monolithic control room environments … towards new orchestrated platform architectures taking advantage of IOT, edge computing as well as hybrid private and public cloud architectures where real-time data exchanges, interoperability and Open Application Programmable Interfaces have become critical technology building blocks to enable plug and play data interfaces.

    Enterprise service bus (ESB) or other integration solutions were used for the deployment of first Service Oriented architectures to decouple systems. However, the number of connectors, as well as the growing requirements for heterogeneous data consuming interfaces feeding a variety of data interfaces through real-time has led applications, system components and enterprise services to remain closely intertwined, as legacy SCADA platform vendors did not provide sufficient reengineering efforts to establish interoperability across data interfaces despite the emergence of the IEC CIM standards. This situation has limited the possibility to integrate platform and application from different vendors as well as expand control room platforms with modern technology stacks developed through the opensource community.

    Event streaming platforms:

    This recent transformation is progressively heading towards the migration of original Control Room platforms into new event streaming platforms leveraging events as a core integration principle and orchestrating most of energy management and control business processes around real-time data streams. This new architecture is designed in view of the event data flows while data processing is orchestrated on data while they are in motion. It has the following key benefits:

    • Event-based data flows is a foundation for (near) real-time and batch processing as required in most of flexibility management processes. In previous SOA architectures, applications were built on data stores (data at rest), which was making it impossible to build flexible and agile services to act on data very close to real-time.
    • Scalable architectures for all events shared across infinite source and sink processes. As opposed to centralised monolithic applications, the architecture is built on scalable, distributed infrastructures by design for zero downtime, handling the failure of nodes and networks while being able to roll out upgrades online. Different versions of infrastructure (like Kafka) and applications (business services) can be deployed and managed in an agile, dynamic way. This approach minimises dependencies across application which was particularly complex to manage through historical SOA architectures.
    • Openness to any kind of applications, system components and microservices. Technology does not matter. The streaming environment connects anything: programming languages, APIs like REST or MQTT, open standards, proprietary tools, and legacy applications which reduces the need to redesign existing legacy applications while benefitting from highly scalable cloud containerized environments enabling rapid prototyping of new applications. It also allows to minimize processing speed constraints for real-time control applications.
    • Distributed storage for decoupling applications. The platform data streaming environment allows to store the state of a microservice instead of requiring a separate database.
    • Stateless service and stateful business processes. Business processes typically are stateful processes. They often need to be implemented with events and state changes, for which remote procedure calls and request-response as considered through SOA architecture is not optimal.

    The solid Open Source approach deployed around Apache Kafka makes it the preferred choice…

    Link to the report

  • Digital sovereignty for Europe

    Where is the EU heading in the IoT space?

    Quotes from the PDF

    1. Europe’s ability to act independently in the digital world
    2. Reliable digital infrastructure and services are critical in today’s society
    3. There are calls to build a European cloud and data infrastructure to strengthen Europe’s data sovereignty and address the fact that today, the cloud and data storage market is almost exclusively dominated by non-European suppliers.
    4. Furthermore, investment in frontier technologies, including artificial intelligence, IoT,…
    5. In the long run, building a genuinely sovereign EU digital environment will also require addressing the current lack of coordination between regulators in this field. … Such mechanisms would be critical for instance to ensuring a coherent EU sovereign approach in many areas, such as applications management (e.g. apps or IoT devices in the data privacy field), or platform regulation.

    An open IoT platform approach is absolutely critical and will succeed in the long run.

    What the EU is saying

    Rolf Riemenschneider – Head of Sector IoT at European Commission:

    Quotes:

    • 4:38 Over-the-air updates
    • 5:00 Avoid vendor lock-in
    • 6:40 Support IoT – platform approach in the centre – demonstrate orchestration across application sectors (application agnostic)
    • 7:22 So a platform approach to enforce an open ecosystem
    • 8:00 Open Core – 8M EUR to attract Open Core / open source developers
    • 8:36 Open Calls for a vibrant IoT ecosystem
    • 8:58 Interoperability – most important. Open interfaces, open standards
    • 10:10 Lack of cross-sector orchestration
    • 11:20 Data sharing (quite a challenge) – we must go back to the platform strategy/standards

    Max Lemke – Head of Unit for IoT

    Quotes:

    • 1:56 Competitiveness and standardization are mentioned again and again in the Draghi Report
    • 2:40 Geopolitical tensions … we are either smashed in the middle or we find our way … underscoring the need for unified IoT standardization in Europe
    • 3:33 By fostering collaboration and sharing we can better navigate these challenges
    • 6:01 We [must] merge things of different ownership [vendor] into an integrated infrastructure
    • 7:40 We need to look at Open Platforms … not driven by one actor
    • 8:32 It means a multi-sided marketplace with different actors … and standards are crucial for scalability
    • 9:00 They enable devices from different manufacturers … to ensure interoperability …