The Smartphonification Of Things

How The Internet of Things is changing the game in electronics and EDA.


By Ann Steffora Mutschler

The term ‘Internet of Things’ was coined more than a decade ago by technology visionary Kevin Ashton, but it has only slowly trickled down to the world of chip design, where it is now mentioned constantly in conversation. The reason is simple: System-level design tools are getting sophisticated enough to handle the intricacies required by devices in an Internet of Things.

Herein lies the potential for EDA to play a role in this emerging market.

According to Wally Rhines, chairman and CEO of Mentor Graphics, “A lot of people would look at it and say, ‘The Internet of Things’ means low-cost sensors. The world of complex systems says you have to be able to design and simulate large numbers of things tied together and interacting. The complexity increases at least as the square of the number of components. If you count the interactions, as long as you have a limited number of air-pressure, light and motion sensors, you can have dedicated signal processing for each of those sensors. But as they start interacting with each other, those interactions have to be verified and analyzed. When you think about the Internet of Things, we now have more Internet nodes than people. The amount of verification required will go up as the square of the number of nodes.”
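Rhines’ square-law point can be made concrete with a little arithmetic: with dedicated signal processing per sensor, effort grows linearly with the number of nodes, but once nodes interact, the number of distinct pairs to verify grows as n(n−1)/2. The sketch below is purely illustrative, not any vendor’s verification model.

```python
# Illustrative only: counts the pairwise interactions among n
# interacting nodes, which grows quadratically -- the source of the
# verification blow-up Rhines describes.

def pairwise_interactions(n: int) -> int:
    """Number of distinct node pairs among n interacting nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:>5} nodes -> {pairwise_interactions(n):>7} interactions to verify")
# 10 nodes yield 45 pairs; 1,000 nodes yield 499,500 -- a 100x increase
# in nodes produces roughly a 10,000x increase in interactions.
```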

The Internet of Things is a really interesting and exciting area that will drive some substantial changes in how and which chips get built and maybe more importantly how they get used, according to Drew Wingard, chief technical officer at Sonics. “To think of it as a new market I don’t think is right. There are some places where it will be a new market, but I think it’s really an extension of existing markets. People talk about the Internet of Things, they talk about machines getting connected in for the most part and these machines already exist and they already have chips inside them. The interesting question is how do those machines get connected, or get connected in a more useful way than how they have been before.”

Enabling this all to happen are the communications protocols. The goal is to simplify how devices get connected to the Internet of Things, just as nodes on a network now connect almost seamlessly to the Internet.

“This is heavily focused on wireless standards,” said Kurt Shuler, vice president of marketing at Arteris. “And from a network standpoint, it’s about lots of heterogeneous traffic and the segmentation of that traffic so you can prioritize it.”

The Internet of Things is a big opportunity for makers of on-chip interconnects, such as NoCs and buses like ARM’s AMBA. It’s also a huge opportunity for routing big data, which is why companies such as Cisco have been so actively promoting the Internet of Things.

That leads to the second aspect of the Internet of Things: If devices are connected, what features do they enable, and which ones are most important, and to whom? This will be particularly important in an automobile, where engine and braking functions will need to take precedence over a phone call.
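The precedence idea above can be sketched as a simple priority queue: heterogeneous traffic is tagged with a priority class and drained in urgency order. This is a toy illustration of the concept, not any real automotive or network stack; the packet labels and priority values are invented for the example.

```python
import heapq

# Toy sketch of traffic prioritization: lower priority value = more
# urgent. The sequence number breaks ties in arrival order.

def drain(packets):
    """packets: list of (priority, seq, label) tuples; returns labels in service order."""
    heap = list(packets)
    heapq.heapify(heap)
    order = []
    while heap:
        _, _, label = heapq.heappop(heap)
        order.append(label)
    return order

# Hypothetical mix of in-car traffic: safety-critical messages are
# served before infotainment and background transfers.
mixed = [
    (2, 0, "firmware-update"),
    (0, 1, "brake-sensor"),
    (1, 2, "audio"),
    (0, 3, "engine-ctrl"),
]
print(drain(mixed))
# -> ['brake-sensor', 'engine-ctrl', 'audio', 'firmware-update']
```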

Some rather bizarre ideas have surfaced for the Internet of Things, such as recipes communicated to a refrigerator or a touchscreen with a Twitter feed on a washing machine. At the very least, many of these screens will be color and they will try out new functionality that may or may not catch on.

Sonics’ Wingard called this the ‘smartphonification’ of consumer devices. “We start looking at chips that look like many smartphone chips. When we look at what companies who’ve been exiting the smartphone application processor space say they’re going to do next with these assets, they always talk about the Internet of Things. What’s fundamentally different about this class of design is the unit volumes for an individual targeted device are lower. If you look at the total number of washing machines they sell a year, it’s small compared to the total number of smartphones they sell. It’s not that there are any fewer people interested in owning one, but you may not have four of them in your house.” What this means is that these Internet of Things devices must be able to be designed and built for far less than the $200 million it costs today to design and build an applications processor.

This is where EDA comes in, Wingard asserted. “We have to come up with mechanisms for building these chips, which are in many ways every bit as complex as an application processor from 5 or 10 years ago, but we’ve got to do it for $10 million so we need more automation. This is one of those places where this is an opportunity for this ‘platform’ word that we used to use 5 or 10 years ago to really come into play because there probably will be more similarities than differences between how these different things are going to work.”

Those closest to being able to do this today are the companies building high-end microcontrollers. They have a cost model for their designs that matches this, and they’ve already been migrating from 8- and 16-bit processors to 32-bit processors. “In some applications, like the PIN pads you interact with when you check out at the grocery store, they’re beginning to add graphics into their stuff, so they’re actually in a pretty good place to get there. Still, these are more complex chips and there are a lot more ways they can fail. The mixed-signal content on most microcontrollers today is pretty simple in comparison with what you’d have to do to actually handle the real connectivity requirements of these things. You’re going to find some really interesting work for the EDA companies,” he added.

Frank Schirrmeister, group director for product marketing of the System Development Suite at Cadence, breaks down the requirements for the Internet of Things into three areas.

First is ‘the Thing’ itself, which requires very intricate design pieces. “AMS design is very important. Design for low power is crucial because those things need to be able to operate independently. And in the context of the smart dust concepts, you need to be able to recharge, potentially, because if the thing is somewhere and you don’t exactly know where it is, you can’t just go and change the battery that easily. You want to do things like renewable energy, things based on movement and what have you. Those are all items that influence the design of the Thing.”

This will all require additional simulation, verification and AMS design, along with specific and very targeted low-power software implementations, as well as virtual platforms and the software executing on them.

Second is the question of what to do with the data, which falls under the realm of big data and how to analyze it to make decisions. However, it is unclear whether EDA will play a role here, he pointed out, since it is more of a server, data center, algorithm-crunching activity.

The third area is the networking/connectivity aspect of the Internet of Things. “Interestingly enough,” Schirrmeister pointed out, “that may cause a revival of the type of tools that in the past were used for network analysis/performance analysis.” Back in the 1990s, Cadence had one such tool called BONES (Block-Oriented Network Simulator), which was subsequently sold.

These tools could play a role in the Internet of Things, where they could be used for network planning, which is loosely related to the type of analysis done when looking at chip interconnect, he pointed out.
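The kind of network-planning arithmetic such tools automate, at far higher fidelity than shown here, can be sketched as a back-of-the-envelope utilization check: does the aggregate sensor load fit on a shared link? All the numbers below are illustrative assumptions, not measured figures; the 250 kbit/s figure is the nominal 2.4 GHz data rate of an 802.15.4-class radio.

```python
# Toy network-planning check: aggregate offered load vs. link capacity.
# Node count, message rate, payload, and overhead are all assumed values.

def offered_load_bps(nodes: int, msgs_per_sec: float,
                     payload_bytes: int, overhead_bytes: int = 32) -> float:
    """Aggregate offered load in bits per second across all nodes."""
    return nodes * msgs_per_sec * (payload_bytes + overhead_bytes) * 8

link_capacity = 250_000  # bits/s, an 802.15.4-class radio
load = offered_load_bps(nodes=100, msgs_per_sec=2.0, payload_bytes=64)
print(f"offered load: {load:.0f} bps, utilization: {load / link_capacity:.1%}")
# 100 nodes at 2 msg/s with 64-byte payloads offer 153,600 bps,
# about 61% of the link -- before contention and retransmissions,
# which is what a real simulator like BONES would model.
```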

While none of the EDA vendors has tools today for this type of analysis, because it has traditionally been a pure networking task, it could be an area for the industry to focus on, joining companies like ThingWorx.

Parallels with chip design

Cary Chin, director of marketing for low-power solutions at Synopsys, said the technologies behind the Internet of Things, including the RFID and ZigBee standards, mimic what has been going on in chip design over a number of years.

One of the clear parallels he sees is the move to digital, which occurred with digital audio and more recently with digital video. Problems are transformed into digital problems, which then hit the knee of the curve and take off, because now they’re part of the Moore’s Law equation, he said.

“I see very much this idea of the Internet of things associated with the two big changes that we’ve seen in computer technology in chips in the last 10 years or so: communications and power. With communications the change from analog kind of cellular to digital cellular created an explosion in the technology, and that’s happened over the last 10 years. With power—it’s really within the last five years, and is still going on—it’s looking at the concept of when does power become digital, and digital to the extent that we can manipulate it with our digital tools.”

What he expects is that technology will start to explode with the ability to view communications and power as part of computing. The power piece really hasn’t gotten there yet, he said, and that’s why power is so interesting to many people these days: despite progress in the last few years, it still hasn’t crossed over.

The problem today is that power is much more global, and we don’t have a good way of making it more hierarchical, more self-contained and more automatable, so that tools can be used to manage it rather than having to contain it all within our brains, which are limited in the complexity they can handle, Chin said. “That’s really the trick, and there’s a lot that EDA can learn from these kinds of things. We’re at the point on the chip side where we are starting to develop some very interesting technologies at the system level, which start to encompass layers of software, which start to encompass all of these interesting pieces at a very high level. We’re starting to treat the software as part of what we used to consider part of the chip, too. You view the whole thing and you can create a model of the whole thing and check its behavior. That kind of thinking, and bringing that level of complexity to system-level design for this other area of the Internet of Things, will be very interesting.”

What used to be just standards for bar codes a decade ago is now starting to be shared, as well.

“There are all of the standards in the ZigBee realm that allow us to build the complexity that we are used to in chip design, but really learn something from those guys in looking at a high system level view,” he added. “There’s going to be information to be passed and to be learned back and forth between the ideas of completely self-contained system-level design. That would be a cool problem for EDA because we’re not used to thinking that way. We end at the edge of the chip and we’re just starting to think beyond that.”
