IoT Meets ML

What happens when the IoT begins tapping into patterns?

AI and machine learning are the next big things, and they're going to make a huge difference in the adoption and capabilities of the IoT.

Unlike previous technology approaches, AI, machine learning and deep learning are based on patterns. In effect, they raise the level of abstraction for data. A single image of a cat can be megabytes of data, and images of a cat taken from all angles may be gigabytes. But that volume only matters for the training algorithms. For the inferencing portion of the equation, it doesn't. A cat is a pattern, and as AI/ML/DL migrate toward probabilistic approaches, the amount of data necessary to draw that conclusion will decrease significantly.
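To make the training-versus-inference distinction concrete, here is a minimal, purely illustrative sketch in Python/NumPy. The weights are random stand-ins for parameters that would come out of an offline training run; the point is that the "pattern" a device carries is a few kilobytes of parameters, not the gigabytes of images used to learn it.

```python
import numpy as np

# Hypothetical sizes: training may consume gigabytes of labeled images,
# but the learned "cat pattern" the device keeps is just the model parameters.
INPUT_PIXELS = 32 * 32        # heavily downsampled grayscale image
HIDDEN_UNITS = 16

# Stand-ins for weights produced by an offline training run.
rng = np.random.default_rng(0)
w1 = rng.standard_normal((INPUT_PIXELS, HIDDEN_UNITS)).astype(np.float32)
w2 = rng.standard_normal(HIDDEN_UNITS).astype(np.float32)

def looks_like_cat(image: np.ndarray) -> bool:
    """Inference: a couple of small matrix multiplies against the stored pattern."""
    hidden = np.maximum(image.flatten() @ w1, 0.0)        # ReLU layer
    score = 1.0 / (1.0 + np.exp(-(hidden @ w2)))          # sigmoid probability
    return score > 0.5

print(f"Model footprint: {(w1.nbytes + w2.nbytes) / 1024:.1f} KB")  # ~64 KB, not gigabytes
```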

For many applications, these data sets can be slimmed down even further. While it may be important for a car to distinguish between a cat and a toddler, for most IoT applications having 80% or 90% accuracy is good enough. How much data does a smart toaster really need, and perhaps more important, how quickly does it need that data? Time may be even less critical for a smart washing machine or a smart sprinkler system.
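One common way to slim things down is to store and compute with lower-precision numbers. The sketch below, assuming simple symmetric 8-bit quantization and random stand-in weights, shows the flavor of the trade: a 4x smaller model in exchange for a small, bounded rounding error.

```python
import numpy as np

rng = np.random.default_rng(1)
weights = rng.standard_normal(4096).astype(np.float32)    # stand-in model weights

# Symmetric 8-bit quantization: map the fp32 range onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)
restored = q_weights.astype(np.float32) * scale

print(f"fp32: {weights.nbytes} bytes, int8: {q_weights.nbytes} bytes")
print(f"worst-case round-trip error: {np.abs(weights - restored).max():.4f}")
```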

This has big ramifications for designing IoT devices. Because those systems are processing less data less often, designs can be much more energy-efficient. Bit-level accuracy is highly compute-intensive. But as computing evolves increasingly toward processing patterns based on good-enough probabilities, these devices don't have to work as hard. Time between battery charges can be greatly increased. That, in turn, opens the door for energy harvesting, which today doesn't produce enough energy to power most devices. It also means that chips will last longer and perform more reliably, because they will be used less intensively.
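A back-of-the-envelope energy budget shows why processing less, less often, matters so much. All of the numbers below are hypothetical placeholders, but the arithmetic is the point: average power is dominated by how often the device wakes up to run an inference.

```python
# Rough battery-life estimate for a duty-cycled IoT node.
# Every number here is a hypothetical placeholder, not a measured value.
BATTERY_MWH = 1000.0          # small cell capacity, in milliwatt-hours
SLEEP_POWER_MW = 0.01         # deep-sleep draw
ACTIVE_POWER_MW = 50.0        # draw while running an inference
INFERENCE_SECONDS = 0.05      # time spent per inference

def battery_life_days(inferences_per_hour: float) -> float:
    """Average power = sleep draw + duty-cycled active draw."""
    active_fraction = inferences_per_hour * INFERENCE_SECONDS / 3600.0
    avg_power_mw = SLEEP_POWER_MW + active_fraction * ACTIVE_POWER_MW
    return BATTERY_MWH / avg_power_mw / 24.0

for rate in (3600, 60, 4):    # once a second, once a minute, every 15 minutes
    print(f"{rate:>5} inferences/hour -> {battery_life_days(rate):8.1f} days")
```

Under these made-up numbers, dropping from one inference per second to one every 15 minutes stretches the same battery from weeks to years.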

The bigger implications are upstream, though. Machine-to-machine (M2M) communication in the data center already is having a profound effect on networking architectures. With fewer or no users involved, networking architectures can be flattened out, so traffic can be optimized for much more consistent statistical distributions.

And that's just working with what's already in place. Add in probabilistic pattern recognition, rather than trying to pick out keywords or phrases, and suddenly data can be moved and mined orders of magnitude faster. This is particularly important as 5G networking comes into play, because the back-and-forth communication between edge devices and hyperscale data centers will all be moving at speeds that were considered improbable even a year or two ago.
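As a sketch of the difference, pattern-based matching compares compact embeddings rather than scanning raw payloads for keywords. The stored patterns and threshold below are hypothetical; a real system would get its embeddings from a trained encoder upstream.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical stored patterns, standing in for embeddings a trained model produced.
rng = np.random.default_rng(2)
patterns = {name: rng.standard_normal(64) for name in ("leak", "overheat", "normal")}

def route_event(event_embedding: np.ndarray, threshold: float = 0.8) -> str:
    """Match against the nearest stored pattern instead of keyword-scanning raw data."""
    name, score = max(((n, cosine(event_embedding, p)) for n, p in patterns.items()),
                      key=lambda item: item[1])
    return name if score >= threshold else "unknown"
```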

There are still major issues to be resolved around privacy, security, and regulation. And there are technical challenges involving the placement and type of memory, as well as how to identify bugs and improve reliability. As with all technology, not everything will evolve at the same rate, and there inevitably will be frustrating glitches along the way.

But take a step back and look at how these pieces can fit together—particularly with neuromorphic system architectures and possibly the introduction of quantum computing over the next decade—and the overall picture looks radically different than the initial IoT concept. The technological underpinnings are falling into line, and that will open the door for connecting many different market segments, such as real-time and continuous health monitoring, predictive M2M maintenance, and a variety of new capabilities that have yet to be discovered or monetized. So while the IoT as it initially was conceived seems rather simplistic and limited, the possibilities for what the IoT can become as AI, ML and DL are added into the picture look much more interesting.


