Adding AI To The IoT

Putting complex algorithms into billions of things radically changes the landscape.

The Internet of Things is about to undergo a radical change, fueled by a vast number of things coupled with an almost pervasive presence of AI.

The IoT today encompasses a long list of vertical markets, all of them connected to the Internet but not necessarily to each other. The concept of the IoT really began taking off in 2015, when a combination of data analytics; high-speed, affordable, and almost ubiquitous connectivity; and a variety of both valuable and ridiculous applications became available on a mass scale. For the next several years there was a lot of trial and error in the market to figure out what people would buy, what they actually found useful, and what they would re-purchase after the first generation either became obsolete or stopped working.

Still, this was more evolutionary than revolutionary. While the phrase “Internet of Things” was coined back in 1999 by Kevin Ashton, co-founder of the Auto-ID Center at MIT, the idea of connecting things together can be traced back to long before the Internet or even ARPANET was developed. Progress in technology often has been about connecting different developments, and the IoT allows computing to happen everywhere.

In fact, the overriding vision of the IoT is less about individual things than the ability to process data in multiple places. For decades, compute technology has been about shrinking the box, from the mainframe to the smartphone to the smart watch. Computing has now evolved to the point where it is less about a box and more about the ability to cobble together processing elements across a wide variety of devices or things.

The key ingredient to making this really useful, though, is AI (and machine learning and deep learning), and this wasn’t on most people’s radar when the IoT hype cycle began. Unlike early applications of AI, which required enormous computers, sparser machine-generated algorithms now can be embedded into anything with a processor to provide a distribution of acceptable machine behaviors. That could include everything from a robotic vacuum that can determine where the dirtiest spots are on a floor to a medical device that can pick up anomalies through constant monitoring. The vacuum still can’t clean itself, and most medical devices still can’t determine when you really do need to call for an ambulance, but that’s certainly no more far-fetched than a car that can drive itself through traffic at the push of a button.
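
To make that concrete, here is a minimal sketch, in Python, of the kind of lightweight monitoring logic such a device could run locally. Everything in it (the class name, window size, threshold, and heart-rate numbers) is invented for illustration; a real product would use a learned model tuned to its sensor, but the computational footprint can be this small.

```python
from collections import deque
import math

class RollingAnomalyDetector:
    """Flags readings that deviate sharply from a rolling baseline."""

    def __init__(self, window_size=50, threshold=3.0):
        self.window = deque(maxlen=window_size)  # recent sensor readings
        self.threshold = threshold               # allowed std deviations

    def update(self, value):
        """Ingest one reading; return True if it looks anomalous."""
        flagged = False
        if len(self.window) >= 10:  # wait for a minimal baseline first
            mean = sum(self.window) / len(self.window)
            var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
            std = math.sqrt(var)
            flagged = std > 0 and abs(value - mean) / std > self.threshold
        self.window.append(value)
        return flagged

# Hypothetical heart-rate stream (bpm) with one abnormal spike.
detector = RollingAnomalyDetector(window_size=30, threshold=3.0)
for t, bpm in enumerate([72, 71, 73, 74, 72, 70, 73, 75, 72, 71,
                         74, 73, 72, 71, 140, 73]):
    if detector.update(bpm):
        print(f"t={t}: reading of {bpm} bpm flagged for review")
```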

So how will people react to more things, particularly as these devices begin interacting in more nuanced ways with their surroundings? And how will people interface with these connected things? Natural language communication has come a long way, in part because of AI algorithms that allow words to be considered in the context of other words. Voice commands are far more likely to result in an accurate response than even a couple of years ago, when talking to a car’s infotainment system frequently produced unexpected and distracting results.
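
As a toy illustration of why that context matters (and not a claim about how any production assistant works), the sketch below maps the same spoken word to different actions depending on the words around it. The vocabulary, pairings, and actions are all invented for the example; real systems learn such associations from data rather than hard-coding them.

```python
# Toy intent resolver: the keyword alone is ambiguous ("play" what?),
# so each action is keyed on a (keyword, context word) pair.
CONTEXT_ACTIONS = {
    ("play", "music"): "start audio playback",
    ("play", "movie"): "start video playback",
    ("turn", "lights"): "toggle the smart lights",
    ("turn", "heat"): "adjust the thermostat",
}

def interpret(command: str) -> str:
    """Match each (keyword, context) pair against every word in the
    utterance instead of reacting to the keyword alone."""
    words = set(command.lower().split())
    for (keyword, context), action in CONTEXT_ACTIONS.items():
        if keyword in words and context in words:
            return action
    return "sorry, I didn't catch that"

print(interpret("Play some music in the kitchen"))  # start audio playback
print(interpret("Play the movie we started"))       # start video playback
print(interpret("Turn up the heat"))                # adjust the thermostat
```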

Put this all into the context of IoT market growth, overlay it with AI, and a different picture begins to emerge. Cisco estimates that 30 billion devices will be connected to the Internet by 2020. By 2030, that number is projected to increase to 500 billion. As of now, 5 quintillion bytes of data are produced every day. Most of that data is useless. But sifting through more data for patterns is more useful than sifting through less data, and as more things are connected, more data becomes available.

All of this is part of the IoT’s next evolutionary phase, and this is where we are today. The more things are connected, the more data is generated. That creates better training data for all of those things, and on the darker side it also creates more accurate ways to track the users of those things. But in either case, it shifts the focus from the things to the data, and that will have big implications for what technology is developed, by whom, and what ultimately will be successful in the market.



2 comments

Gil Russell says:

Ed, we at WebFeet Research agree.

Cognitive computing systems that mimic the way the human brain works are thought by many to still be in the early stages of development. An interesting scenario has developed that may completely shatter this attitude. Several low-profile research efforts, underway for well over a decade, are now nearing their market-entry milestones. These systems are based on principles of the human neocortex and represent the first semiconductor devices able to perform near-real-time learning and pattern identification on heterogeneous data streams. WebFeet forecasts that these market entries will radically change the complexion of what is now considered the high-performance cognitive computing market within the next several years. These entries will not only spawn the ability to process vast amounts of heterogeneous data but will also be able to attain predictive, time-sequenced data-event extraction (something only dreamed of at present), again all within the next several years. We project this same technology will produce a major revision in how the semiconductor market responds to artificial intelligence, especially where the von Neumann architecture intersects the memory-like architecture of pattern computing.

Machine learning is NOT machine intelligence, a nuance that hopefully will lead to Artificial General Intelligence. In many circles, DL is beginning to ‘go winter,’ as it is now considered passé as an algorithm type worthy of the ‘appliqué’ of general intelligence, though it maintains a place within an overall panoply of useful AI solutions. This can be recognized in subtly shifting terminology, as in “Old Brain” versus “New Brain” and “Brain Inspired,” denoting the segue to new algorithms based on the algebra of Pentti Kanerva and Numenta’s HTM. Again, this change is subtle and not yet well explained. Over the next eight months we expect announcements that will provide far more detail than is available at present.

Harshali Patel says:

Hello Ed,
Thank you for this article. I am a college student and want to learn more about AI and the IoT. The article helped me understand how AI and the IoT interlink.
