The need for concurrent hardware-software design and verification is increasing, but are engineering teams ready?
The barriers between hardware and software design and verification are breaking down with more intricately integrated systems, bringing together different disciplines and tools. But there are lingering questions about exactly what this shift means for design methodologies, team interactions, and what kind of training will be required in the future.
Playing heavily into this is the fact that today, depending on the vantage point, members of the engineering team look for and interpret data differently. In debug, for example, the engineers doing simulation, emulation, and FPGA-based prototyping each look for different things.
“There is some level of overlap so you will run regressions, and you want to run them as fast as you can so you switch to emulation or FPGA-based prototyping,” said Frank Schirrmeister, senior group director, product management in the system and verification group at Cadence. “But then you want to get coverage data collected from all of these engines, as well. Debug is an animal in which everybody looks at different things. The simulation engineer will look at IP blocks and look at signals deep down. The emulation engineer will look more at the systemic aspect, at what’s going on with this bus, what’s going on in those memories, what’s going on with that bandwidth. Is this buffer over-running? They are looking more at topology system integration aspects. The FPGA-based prototyping engineer may not look at these items at all anymore. You find out in an FPGA-based prototype that you misconfigured a subsystem and it is taking too much bandwidth.”
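The systemic check Schirrmeister describes — "Is this buffer over-running?" — can be sketched in miniature. The toy model below (all names, depths, and traffic rates are invented for illustration, not taken from any vendor's tool) shows how a misconfigured subsystem that produces faster than its consumer drains will over-run a FIFO, which is exactly the class of problem that only surfaces at the system level.

```python
# Toy model of a system-level emulation check: does a FIFO between a
# producer and a consumer ever over-run? Depth and rates are illustrative.

def fifo_overrun(depth, produced_per_cycle, consumed_per_cycle, cycles):
    """Return the first cycle at which the FIFO over-runs, or None."""
    occupancy = 0
    for cycle in range(cycles):
        # Producer pushes its per-cycle traffic (patterns repeat).
        occupancy += produced_per_cycle[cycle % len(produced_per_cycle)]
        # Consumer drains what it can, never more than is present.
        occupancy -= min(occupancy, consumed_per_cycle[cycle % len(consumed_per_cycle)])
        if occupancy > depth:
            return cycle
    return None

# A subsystem misconfigured to burst 4 words/cycle against a consumer
# draining 2 words/cycle over-runs a 16-deep FIFO within a few cycles;
# balanced traffic never does.
print(fifo_overrun(16, [4], [2], 100))  # over-runs early
print(fifo_overrun(16, [2], [2], 100))  # balanced: no over-run
```

A simulation engineer would instead probe the signals inside the producer block; the point of the sketch is that the over-run is only visible once the whole topology is modeled together.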
That can cause real trouble, so the search shifts to software and the hardware-software interaction from the software side, he explained. But organizationally and knowledge-wise, there’s still more to be done to have these different core engines be applicable to all users, leveraging a verification ‘fabric’ for improved efficiencies.
To Cadence, the verification ‘fabric’ comprises a number of components. It includes verification management, where coverage data from the various engines is combined to determine where the design stands relative to the project goals. It also includes verification IP, assertions, and portable stimulus.
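The verification-management piece — combining coverage contributed by different engines and measuring it against project goals — reduces to a set-merge at its core. The sketch below is a minimal illustration of that idea only; the engine names, coverage bins, and goal list are all hypothetical, not any part of Cadence's actual tooling.

```python
# Hypothetical sketch of merging coverage from multiple engines and
# scoring it against a project goal. All bin and engine names invented.

def merged_coverage(goal_bins, per_engine_hits):
    """Combine per-engine coverage; return (fraction covered, missing bins)."""
    covered = set().union(*per_engine_hits.values())
    hit = covered & goal_bins
    return len(hit) / len(goal_bins), sorted(goal_bins - hit)

goal = {"reset", "dma_burst", "irq_nested", "pcie_replay"}
hits = {
    "simulation": {"reset", "irq_nested"},   # deep, block-level bins
    "emulation":  {"reset", "dma_burst"},    # system-level traffic bins
    "fpga_proto": {"dma_burst"},             # software-driven runs
}
fraction, missing = merged_coverage(goal, hits)
print(f"{fraction:.0%} covered, still missing: {missing}")
```

Note that each engine contributes bins the others never touch, which is the argument for collecting coverage from all of them rather than from any single engine.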
“Is the user actually ready to digest it if you have a flow where everything is seamlessly connected? Is the user actually ready to receive it? Or are they organizationally and/or educationally set up in a way that they cannot really get any advantage from it? This is not so easy because you have to moderate between the different participants,” Schirrmeister said.
Some of today’s hardware/software tool “continuums” include Cadence’s System Development Suite, Synopsys’ Verification Continuum, and Mentor Graphics’ Enterprise Verification and Vista platforms.
A tool continuum approach to hardware/software design and verification requires engineers to understand more of what’s happening in general, while still being somewhat of an expert within their domain, said Tom De Schutter, director of product marketing for physical prototyping at Synopsys.
“We started off in a very analog world,” De Schutter said. “Then we went to an analog-digital world, to an all-digital world, and now to digital and software. So things are continuously shifting. Within that now are the tools and methodologies to deal with software, digital content, analog content all coming together. That’s where in the continuum all these pieces need to be connected very tightly. Otherwise the product [being designed] is not cohesive. This is changing what engineering teams need to understand, and sometimes it works well because some people like to be more generalists. But for other people it is a big change because they were the most expert in a certain domain, but that’s not cutting it anymore.”
What engineers need to know today
Particularly in areas like the Internet of Things, it is critical to intricately understand the connection between hardware and software, De Schutter stressed. “This is driving all the pieces in the continuum. Before, you really had two separate domains and the hardware team didn’t really have to talk to the software team and they liked it that way; and the same was true on the other side. Today, that practice will not result in good products. And while this movement has been happening for a while, more and more while developing a product, you cannot only look at the hardware part or the software part — you need to look at both of them together because that has implications throughout the design cycle.”
Next, engineering teams need to see how that combination of things can drive new products and applications.
And then, particularly in areas like IoT, he said, that’s where things like sensors and external factors come into play. “So far, we’ve been used to creating devices that interact with people, and that results in a certain way of doing something. If you design a computer, if you design a mobile phone, you always assume the interaction is with a human. Now the interaction is becoming more general: it’s devices communicating with each other, it’s sensors providing input so the entire environment interacts with the device. This is a new inflection point. It’s not only how I’m going to create a device that works well for a human interface; it’s now designing something where you have to think about all the external factors that can influence this device.”
He expects this will probably drive a different type of continuum, and a different type of verification and validation on top of this hardware/software layer — one that more closely connects to the external factors.
Education is key going forward
It’s one thing to innately understand hardware and software, but it’s another to impart that knowledge to others, or to gain it from a straight EE major in university.
“My biggest complaint on education is that, especially once you get into grad school, people get into these very narrow silos, so you will get somebody who just understands place and route or just understands this particular narrow part of the overall making of a chip or the making of a system,” said Kurt Shuler, vice president of marketing at Arteris. “It even goes to the hardware guys who don’t know how to program, and to the programmers who don’t know what the hardware is. They don’t even teach that anymore. For our business it’s really hard to find people who understand all of the stuff. There are very few people from a technical standpoint who you can hire to architect the systems because you’ve got to know hardware, you’ve got to know software, you’ve got to understand problems that people are going to have, the chip manufacturing process, and things even beyond that. It’s great for those people who have that experience. They can make tons of money. But it slows down innovation.”
To this end, one of the problems is that some EEs have never written C code. As a result, they don’t understand what’s happening to the system. They only look at their narrow part of an entire system. “Sometimes those people never graduate to become the person who helps to define the system because they don’t know how the system is going to be used,” Shuler said. “They never cared. Also, companies will force you into a narrow silo if you let them because they believe you’re going to be more productive if you stay focused in your silo. And that’s bad for the engineer because if there’s a new way that power management is done or the need for it goes away, then they are out of a job. This is important for companies to figure out because it’s difficult to develop leaders and managers if they don’t understand the bigger picture.”
Cadence’s Schirrmeister suggested that larger companies can do a lot of cross-education between domains, such as having the simulation team talk to the hardware and formal teams to understand the interaction.
Synopsys’ De Schutter sees this as something of a moving target. “First you had the swing of everybody wanting to be a hardware person, then there was a swing that everybody wanted to be a software person. With IoT, with external factors coming in, the hardware side will regain some of its importance. Software will not go away, will continue to be important, but right now what we don’t have is a lot of people who have a sense of the applications that we could do with those sensors. The difficulty with education of course is that in most cases, it is very reactive. As a college student, you look at what is hot right now, what should I do, which means you’re basically 4 or 5 years late because that’s the time it takes to actually get into the market. Nowadays, that’s actually too late. 10 or 15 years ago, 5-year lifecycles were not that long, but today that’s an eternity and everything has changed in 5 years.”
Keeping internal teams up to speed is something all engineering teams struggle with, and in some cases it’s more straightforward than in others. If it is specific domain knowledge, such as learning a particular programming language, scripting language, or hardware description language, that is fairly easy to address; companies can offer courses to employees. But when it comes to understanding use cases and getting into the mind of a user persona in order to create devices, that is another challenge altogether, he said.
Interestingly, De Schutter continued, IoT specifically is leading to a different type of hardware design. “Now it is all about miniaturization and communication, so even within the hardware community today there is re-education required for people who know very well how to create complex designs and work on this treadmill of ever more powerful devices to designing smarter devices that use data in a specific way to bring new applications. The main skills are the same, but how to apply them is quite different.”
Much of that re-education will come down to inter- and intra-team communication, because ideally within each team there will be thought leaders who go out and educate themselves by going to events and trying out new things. Eventually that needs to flow back into the team, he said. Within the engineering community he doesn’t see as much of a re-education in the sense of going out and sitting through university courses. The focus is on learning on the spot and from peers. He also sees re-organizations within certain groups or divisions that try to bring new knowledge together to foster education across the team, leveraging domains in a different way so that learning happens in a more natural, organic fashion.
In the end, a more organic education approach within existing teams may be the most successful as design shifts left, pulling together hardware/software design and verification according to natural inclinations of designers to delve into the technologies that interest them.