
The reality of IoT edge processing: We're getting warmer

Sometimes we wander in the right direction without quite realizing why. That is likely the case with the market movement embracing edge processing for IoT. To be clear, we partly know why: we know there is an increasing need to do computational work, including significant filtering, at the edge.
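To make the filtering point concrete, here is a minimal sketch (the threshold and readings are invented for illustration) of report-by-exception filtering, one common way an edge node cuts upstream traffic: a reading is only forwarded when it differs enough from the last one sent.

```python
# Hypothetical illustration: report-by-exception filtering at an edge node.
# Only readings that deviate from the last forwarded value by more than
# DEADBAND are sent upstream; everything else is dropped at the edge.

DEADBAND = 0.5  # degrees; the threshold is an assumption for the example

def filter_readings(readings):
    """Yield only readings that differ enough from the last forwarded one."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > DEADBAND:
            last_sent = value
            yield value

raw = [21.0, 21.1, 21.2, 22.0, 22.1, 25.3, 25.4]
print(list(filter_readings(raw)))  # [21.0, 22.0, 25.3] -- far fewer messages upstream
```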

We’re getting warmer, but not warm yet.

We know there is an increasing need to augment security models at the edge, although there is certainly room for debate there. Now we're getting warmer still. We also seem to know that, in many instances, the edge is the place to change the communications protocol.
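The protocol point is easy to picture in code. Here is a minimal sketch, with the device framing and field names invented for illustration: an edge gateway takes a compact binary frame from a sensor and republishes it as JSON for upstream systems.

```python
import json
import struct

# Hypothetical sensor frame: 2-byte device id, 4-byte float temperature (big-endian).
# The framing is an assumption for illustration, not a real device protocol.
FRAME_FORMAT = ">Hf"

def translate(frame: bytes) -> str:
    """Convert a binary sensor frame into a JSON message for upstream consumers."""
    device_id, temperature = struct.unpack(FRAME_FORMAT, frame)
    return json.dumps({"device_id": device_id, "temperature_c": round(temperature, 2)})

frame = struct.pack(FRAME_FORMAT, 42, 163.5)  # what the device might put on the wire
print(translate(frame))  # {"device_id": 42, "temperature_c": 163.5}
```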

We’re getting warmer and warmer.

Some are beginning to contemplate and incorporate persisted stores for contextualizing data from a variety of devices and other sources behind the edge. We’re getting hot now.
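As a rough illustration of what such a persisted store can do (the table and field names are made up for the example), an edge node might keep device metadata locally and use it to contextualize raw readings before they move on:

```python
import sqlite3

# Hypothetical edge-side context store: device metadata persisted locally
# so raw readings can be enriched before leaving the edge.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE devices (device_id INTEGER PRIMARY KEY, site TEXT, asset TEXT)")
db.execute("INSERT INTO devices VALUES (42, 'store-118', 'grill-3')")

def contextualize(reading):
    """Attach locally persisted device context to a raw reading."""
    row = db.execute(
        "SELECT site, asset FROM devices WHERE device_id = ?", (reading["device_id"],)
    ).fetchone()
    site, asset = row if row else (None, None)
    return {**reading, "site": site, "asset": asset}

print(contextualize({"device_id": 42, "temperature_c": 163.5}))
# {'device_id': 42, 'temperature_c': 163.5, 'site': 'store-118', 'asset': 'grill-3'}
```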

Only a few organizations are going beyond that, but this will change. The migration of emphasis to the edge will include all of the above, but the game changer will come in the form of data ownership, privacy, governance and stewardship considerations. This will translate into an architecture that can use the utility value of data. This, in a nutshell, is pay dirt for the edge emphasis. There are plenty of edge offerings out there, and they’re growing by the week, but the ones that make the privacy and governance model easy, or at least easier, will be the ones that make the key benefits of the edge architecture realizable.

What does it mean to use the utility value of the data?

It’s a simple concept that was brought to life by Edgar Codd in 1970 and first moved toward the market by IBM’s System R team in 1974 in the form of a relational database: the creation of the data is separated from the consumption of the data. In 1974, that was likely interpreted as the banking transaction record being consumed by the bank’s demand deposit account (DDA) system, but also by marketing, treasury management, CRM or other systems and constituencies. In fairness, this illustration was much more likely to be seen in 1989 than in 1974, but 1974 is when the concept emerged.
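A toy version of that separation, using SQLite purely for illustration (the table and queries are my own invention, not from System R): the transaction record is written once, and different constituencies consume it through their own queries rather than keeping their own copies of the data.

```python
import sqlite3

# Toy illustration of separating data creation from data consumption:
# one banking transaction record, consumed differently by different systems.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE transactions (account TEXT, amount REAL, merchant TEXT, ts TEXT)")
db.execute("INSERT INTO transactions VALUES ('AC-1001', -42.17, 'Grocer', '1974-06-01')")

# The DDA view cares about balances per account.
print(db.execute("SELECT account, SUM(amount) FROM transactions GROUP BY account").fetchall())

# A marketing view cares about where customers spend.
print(db.execute("SELECT merchant, COUNT(*) FROM transactions GROUP BY merchant").fetchall())
```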

Today, in the cyber-physical world, it would be the temperature reading from the grill in the fast-food restaurant being sent, in some form or fashion, to the fast-food company’s local store, regional headquarters and corporate headquarters, as well as the manufacturer of the grill, supply chain partners and perhaps the FDA or other regulatory bodies. It’s still the same temperature reading.
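To sketch how a first receiver might put that single reading to work (the consumer names and policies here are invented for illustration), think of the edge node applying a per-consumer governance policy and fanning the same reading out in different shapes:

```python
# Hypothetical first-receiver fan-out: one grill temperature reading, delivered
# to several constituencies, each under its own (invented) governance policy.
READING = {"store": "store-118", "asset": "grill-3", "temperature_c": 163.5, "staff_id": "E-204"}

# Which fields each downstream consumer is allowed to see; the policy lives at the edge.
POLICIES = {
    "local_store":  {"store", "asset", "temperature_c", "staff_id"},
    "corporate_hq": {"store", "asset", "temperature_c"},
    "grill_maker":  {"asset", "temperature_c"},
    "regulator":    {"store", "temperature_c"},
}

def fan_out(reading, policies):
    """Return one governed copy of the reading per consumer."""
    return {
        consumer: {k: v for k, v in reading.items() if k in allowed}
        for consumer, allowed in policies.items()
    }

for consumer, payload in fan_out(READING, POLICIES).items():
    print(consumer, payload)
```

It is still the same temperature reading; only the governed view of it changes per consumer.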

IoT is a holistic proposition. In 2017, I had the good fortune of publishing a book about the direction of the market along with my good friends Emil Berthelsen from Gartner and Wael Elrifai from Hitachi. The message in the book is the same message here: The more the market matures, the more it will embrace edge processing. And the more it matures and embraces edge processing, the more it will incorporate the idea of the “first receiver” to use the utility value of the data. The market is steadily moving in the right direction, although in most cases, the realization of value is incomplete.

But the pace of change is increasing. The buzz around AI and machine learning will accelerate this direction for one very important reason: while we hear every day about new AI and machine learning startups, and about how important and necessary this technology has become, most people also understand that the best analytics in the world are only as good as the underlying data set.

Getting the data curated in the best way, which is 100% a function of a thoughtful architecture, means everything.
