
IoT, 5G and the need for smart streaming

In today’s digital world, the reach and power of IoT is a given. Not only is IoT increasingly relevant in consumers’ everyday lives, but it also continues to shape almost every industry. With everything from Google Home gadgets to industrial IoT (IIoT), the number of connected devices is exploding. Gartner forecasts 20 billion internet-connected things by 2020, and with this increase in connected devices, we can also expect a massive surge in data generation.

The much-anticipated emergence of 5G is happening now and will play a critical role in accelerating the realization of an increasingly connected world. It will change how data is used: from collection and analysis for predictive and prescriptive purposes toward real-time decision making. This migration of data value to real time affects both the industries 5G enables and traditional communications service providers. Organizations are already beginning to think about their systems in terms of an event-decision-notification-action loop. Real-time data architectures need to scale to manage this unparalleled increase in data traffic and go beyond simply ingesting data. Ultimately, these architectures will drive actions by making intelligent, dynamic decisions across multiple data streams.
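The event-decision-notification-action loop can be sketched minimally. The `Event` fields, the alert threshold and the handler names below are illustrative assumptions, not part of any particular platform:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """A single reading arriving on the stream (hypothetical shape)."""
    device_id: str
    reading: float

def decide(event: Event, threshold: float = 90.0) -> str:
    # Decision step: classify the incoming event against a rule.
    return "alert" if event.reading > threshold else "ok"

def notify(event: Event, decision: str) -> None:
    # Notification step: surface the decision (here, just print it).
    print(f"{event.device_id}: {decision}")

def act(event: Event, decision: str, actions: list) -> None:
    # Action step: trigger or record a follow-up action for alerts.
    if decision == "alert":
        actions.append((event.device_id, event.reading))

def run_loop(events, actions):
    for event in events:                  # event
        decision = decide(event)          # decision
        notify(event, decision)           # notification
        act(event, decision, actions)     # action

actions = []
run_loop([Event("sensor-1", 72.0), Event("sensor-2", 95.5)], actions)
# only sensor-2 exceeds the threshold, so actions holds one entry
```

In a real deployment each step would be a low-latency stage over a live stream rather than a Python loop, but the control flow is the same: every event produces a decision, the decision is surfaced, and qualifying decisions trigger actions.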

Combining state management and stream data processing

With IoT and 5G putting more stress than ever on traditional databases and streaming technologies, organizations will need to adjust their streaming architectures accordingly. When low-latency processing is key to 5G success, cobbling together disparate technologies will not work. A unified architecture that combines the state management of a database with the ability to handle streams of data contextually becomes necessary. As decision-making requirements move closer to the edge, a distributed data processing architecture that can reside in low-footprint edge data centers is key. Disparate technologies operating in unison to ingest, process and store data not only result in higher latency but also lead to complications with scale, complex workloads and hardware sprawl. Streaming data processing has gone beyond processing and reprocessing to active decision making, which requires thinking beyond the Kappa architecture frame of reference.

Next-generation applications can be characterized by three essential elements: increased data volume, short response timeframes, and complex decision making over complex data models, which requires streaming data analysis along with stateful atomicity, consistency, isolation and durability (ACID) transactions. Organizations must move away from traditional architectures to harness the potential of 5G-powered applications.

Keys to a successful stateful streaming implementation

Data is organized in two broad ways: as a state machine or as a ledger. A state machine maintains the last known state of the entities in a system; a ledger records the series of state mutations. Real-time decisions on streaming data require the state machine and a contextual ledger to work together, while the long-lived "forever" data is used to retrain the machine learning layers in the data architecture. A key requirement is that the stateful streaming machine integrate bidirectionally with the learning machine: the learning machine receives new training data from the stateful streaming machine, while the stateful streaming machine receives updated insights, rules and business logic from the learning machine. Eliminating manual intervention in this exchange of raw data and intelligence between the layers is vital.
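The interplay between the two models can be sketched in a few lines. This is a minimal illustration, assuming a dictionary for the state machine and an append-only list for the ledger; the class and entity names are hypothetical:

```python
class StatefulStream:
    """Maintains last-known state per entity plus an append-only ledger."""

    def __init__(self):
        self.state = {}    # state machine: entity -> last known state
        self.ledger = []   # ledger: ordered history of (entity, old, new)

    def apply(self, entity, new_value):
        # Mutate the state machine and record the mutation in the ledger.
        old_value = self.state.get(entity)
        self.state[entity] = new_value
        self.ledger.append((entity, old_value, new_value))

    def replay(self):
        # Rebuild the state machine from the ledger alone, showing that
        # the ledger is a sufficient record of all mutations.
        rebuilt = {}
        for entity, _, new_value in self.ledger:
            rebuilt[entity] = new_value
        return rebuilt

s = StatefulStream()
s.apply("valve-7", "open")
s.apply("valve-7", "closed")
assert s.state == s.replay()  # state is recoverable from the ledger
```

The state machine answers "what is the current state?" in constant time, while the ledger preserves the full mutation history, which is what the learning layer would consume as training data in the bidirectional exchange described above.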

Organizations must go beyond the traditional store-and-query model of data processing, designed for batch and reconciliatory processing, and switch to an intelligent, unified state-and-stream processing architecture that transactionally makes real-time decisions on incoming event data. By decreasing latency and architectural complexity, organizations will be able to deliver value more quickly and intelligently by analyzing and acting on incoming data as it arrives. This will drive better monetization of real-time data, ultimately meeting today's IoT and 5G real-time analysis requirements.

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.