Major transitions in technology are driven by simple changes in perspective, and there’s one happening right now.
A very long time ago, I was a student at MIT, programming with card decks in APL on IBM mainframes and studying AI in a class from Patrick Winston (who took over MIT’s AI Lab from the legendary Marvin Minsky). I kept the textbook as a reminder of where the world would go. Over four titanic shifts, mainframes and card decks gave way to the VAX and VT100 terminal, then to IBM PCs and PC clients tied by Ethernet to corporate servers. Then the smartphone revolution hit, and computing—in the palm of your hand, no less—became truly personal.
Each phase of this journey was marked by big, fundamental shifts that were enabled by semiconductor innovation and were a byproduct of the relentless march of miniaturization, process-yield improvement, tools advances and processing efficiencies that started in the ‘70s. I believe they will carry through to the distant future.
But the biggest transitions in technology were also driven by simple changes in perspective. Smartphone vendors didn’t see just a handy telephone you could cart around with you; they envisioned computing wherever you want to go. That’s a different perspective.
So today, we are witnessing another change in perspective. Computing in the devices that surround people, at home, on the road and at work, can do radically more than help us surf the web or text our friends. The new perspective is that technology can learn about us: it can drive our cars, monitor our families, and keep us and our data safer in a threatening world. As its capability increases, edge computing can take on a range of adaptive, intelligent machine learning (ML) and AI workloads that will deliver tremendous value to people.
In IDC’s just-released report, Worldwide Enabling Technologies and Semiconductors 2018 – Top 10 Trends, AI-driven edge computing figures in two of the 10 key predictions: first, that 2018 marks the beginning of artificial intelligence at the edge, and second, that AI-optimized SoC solutions will emerge for mobile and edge applications.
The authors write:
“This year, suppliers are rushing to position their existing products or are adding new elements to handle AI workloads and programming frameworks. Arm has optimized its leading generation of CPU and GPU IP cores to perform AI-inferencing tasks more efficiently. Qualcomm, Rockchip, MediaTek, Google, and other semiconductor suppliers have turned to DSP cores to accelerate computer vision and other AI tasks integrated in either their general application processors or as co-processors.”
AI-related workloads are moving to the edge for two simple reasons: latency and security. In the old model, we captured data at the edge and shunted it back to a data center for processing before the answers were sent back to the device. That round trip took time and, in the aggregate, consumed a lot of power. Moving compute, and the machine learning algorithms it runs, closer to where the data is captured addresses both issues and puts the right amount of compute at the point where it’s most efficiently applied.
For example, voice recognition on a smart speaker would run better if all the processing were done on the device: it would respond faster, keep the data more secure and would not depend on a good WiFi signal. Similarly, running facial recognition to unlock a smartphone is most sensibly done on the phone itself. Even a security camera capturing gigabytes of data would benefit. Rather than streaming everything to the cloud, running ML on the device can filter out irrelevant footage so that only suspicious data is sent to the cloud for further analysis or to trigger an alert.
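The security-camera pattern above can be sketched in a few lines. This is a minimal, hypothetical illustration, not a real camera pipeline: the frames, the threshold value and the `upload_to_cloud` stub are all assumptions made for the example. The device scores each frame against the previous one with a cheap check and sends only high-change frames onward.

```python
# Hypothetical sketch of edge-side filtering: score each frame on the device
# and upload only "suspicious" (high-motion) frames to the cloud.

def motion_score(prev, curr):
    """Mean absolute pixel difference between two frames (lists of ints)."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def upload_to_cloud(frame):
    """Stand-in for a real network call; here we just record the frame."""
    uploaded.append(frame)

uploaded = []
frames = [
    [10, 10, 10, 10],   # static scene
    [10, 11, 10, 10],   # sensor noise, filtered out on the device
    [90, 85, 80, 95],   # large change: worth sending for analysis
    [91, 86, 81, 94],   # scene has settled again
]

THRESHOLD = 20  # assumed value; tuned per deployment in practice
prev = frames[0]
for frame in frames[1:]:
    if motion_score(prev, frame) > THRESHOLD:
        upload_to_cloud(frame)  # only this frame leaves the device
    prev = frame

print(len(uploaded))  # prints 1: three of four frames never left the device
```

The point of the design is that the expensive, power-hungry step (the network round trip and cloud analysis) runs only on the small fraction of data the cheap on-device check flags.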
Distributed computing, then, is taking shape before our eyes. We wrote here last year about one example, Lumo Bodytech, whose wearables (edge devices) run machine learning software to adapt to the wearer’s behavior. It’s only going to get more interesting and challenging as we move from a world of just over 8 billion connected devices to one with 1 trillion.
This is now an unstoppable trend, one that will put ML to work in billions of devices as edge computing capabilities grow.
But AI and ML aren’t the only areas that will be interesting this year, according to the IDC Market Perspective report. Not surprisingly, security remains a priority and there are plenty of other compelling predictions for 2018, ranging from wireless and automotive-electronics trends to China’s expanding role all the way to datacenter dynamics.
It’s funny to see the subjects of my youthful academic life take 40 years to become reality. My classes on natural language processing, machine learning, machine vision, and object detection and interpretation are finally being realized at mass scale.
It’s a matter of constant changing perspective. I love it.
To read more about the report, visit our landing page.