Pace Quickens As Machine Learning Moves To The Edge

More powerful edge devices mean everyday AI applications, like social robots, are becoming feasible.


Artificial intelligence applications are rapidly changing the way society engages with technology. It wasn’t too long ago that your smartphone couldn’t recognize your face or your thumbprint. It also wasn’t too long ago that Alexa wasn’t helping you navigate your day so easily.

And not too long ago, odds are, you weren’t developing an application or device that had AI/ML as its centerpiece.

But today is different, and developments in AI and ML are a precursor to what’s happening in the future. Take ElliQ, for example. ElliQ is an engaging robotic companion that learns its owner’s behavior patterns and then suggests activities, music, videos, and ebooks to engage with, as well as family and friends to connect with through social media. In short order, robots have evolved from bumping around our floors vacuuming up dust and dirt to serving as human companions.

This is all thanks to clever engineering combined with relentless silicon and software improvements, which are enabling more and more AI and machine learning workloads to run on edge devices.

ElliQ, which is designed by Intuition Robotics, uses Arm-based technology in the Qualcomm Snapdragon 820 system on a chip, with machine learning functionality enhanced by an accelerator. It’s all done in service of combating one of the biggest problems of aging populations: loneliness.

Many senior citizens feel intimidated by the very technology that could provide a vital connection to mitigate feelings of loneliness. Intuition Robotics is harnessing the power of technology to overcome the digital divide that’s changing the way new and old generations communicate, and to help the elderly stay active and engaged. Its social robot, ElliQ, is an autonomous active-aging companion that uses machine learning to understand the preferences, behavior and personality of its owner to promote an active, healthy lifestyle. (You can read a case study about ElliQ’s design considerations here.)

This is just one example not only of advancing applications through machine learning, but also of leveraging better compute power within edge devices to improve responsiveness, security and the user experience.

It’s also an example of the kind of innovation in AI/machine learning and edge computing that you’ll find at this year’s Arm TechCon, being held Oct. 16-18, 2018, at the San Jose Convention Center.

Here are some highlights:

  • How would you go about deploying real-time machine vision and deep learning on small devices? Laurent Itti of JeVois Inc. will present on this topic.
  • What tradeoffs do developers need to consider when implementing computer vision at the edge versus in the cloud? Jeff Bier of the Embedded Vision Alliance has some answers.
  • How can you speed up your AI designs with dedicated hardware? Arm’s Ian Forsyth has some suggestions.

These are just three of 18 Arm TechCon sessions devoted to edge computing and machine learning – the most-packed of the eight tracks Arm is highlighting at the event. Other topics include automotive, system design methodology, embedded software development, and trust and security.

As the pace of design quickens in the IoT era, it’s more important than ever to stay current, so come join us this fall. If you need more reasons, check out our justification toolkit, which includes resources such as sample letters to your manager to help make the case for joining us in San Jose.

Register now and get discounted rates. I’ll see you in October.
