Experts at the table, part 1: Hardware and software must be developed at the same time these days to shorten the time-to-market for advanced devices and electronics.
Semiconductor Engineering sat down to discuss parallel hardware/software design with Johannes Stahl, director of product marketing, prototyping and FPGA, Synopsys; Bill Neifert, director of models technology, ARM; Hemant Kumar, director of ASIC design, Nvidia; and Scott Constable, senior member of the technical staff, NXP Semiconductors. Part one addresses the overall issue of hardware-software co-design. Parts two and three will address automotive and security impacts.
SE: Hardware-software co-design has been a topic of discussion for many years, trying to get hardware and software together. The whole hardware/software co-design concept goes back to the 1990s. Software and hardware have to be built concurrently these days. What trends and ideas do you think are important now?
Stahl: It’s interesting that you mention that term, hardware/software co-design. It’s really an old term. We can do both. It’s really people with different job functions that do different pieces of any given embedded device. You have to make people work together, but they have distinct job functions – there’s a software developer, there’s a hardware architect, there are hardware designers. They’re using some tools to communicate. What co-design said at the time is, let’s try to design things together so they’ll work well together, but also let’s try to save some time. Today, you find more people saying we want to change the project schedule a little bit by making the software available earlier, or by developing software earlier. The co-design term is really gone.
Neifert: What was interesting is when they first came out with the term, there were really only one or two layers of software that were used on these things. Most of the time you weren’t booting Linux or some other OS. There might be some small RTOS. A lot of times it was just the lowest-level firmware or drivers. Today, any device that comes out has so many layers of software stack built on top of it, and the approaches needed to bring those up aren’t always the same. You don’t want to use the same methods and co-design techniques for bringing up an OS-level application as you would for the lowest-level driver, for example – very different speed and accuracy requirements. The biggest shift that I’ve seen in the time that I’ve been doing this is that typical teams will use multiple techniques to address each one of these various needs and to get the hardware designed to the software throughout the process.
Kumar: One of the things that I’ve seen changing, looking at it from a different industry perspective, is that in a PC you have a clear separation between BIOS and OS. But now, if you look at the Android space, there’s no separate BIOS. Second, if you look at earlier devices, the software layer is pretty much fixed. You’re not upgrading. You’re not adding applications. Software is not changing, while on your smartphone applications are changing. With hardware/software co-design, you’re starting with one level of software, and during the usage of the device the software changes. Hardware/software co-design now is not the same as having fixed-function software, where software is not being upgraded. That’s one of the other sea changes that’s happening nowadays.
Stahl: Much more software is now pre-existing. Most of the design teams are struggling with how much of the stack they can quickly port again—and then how much of the applications they can quickly try out or figure out if it still works, if the performance is right, because there are so many things that exist in the market that have to work perfectly.
Neifert: We look at it in terms of workloads. When we bring up a new device, or a new core, there’s a standard set of workloads that we have to bring across, and we will choose the best technique or best tool given the status of those workloads and the status of the design underneath it, be it emulation, virtual prototypes, simulation, or whatever goes underneath it. It’s representative of the huge amount of software that may be pre-existing or may be being developed, which needs to constantly run against it. That’s not just the lowest-level verification; it’s a software, system-driven verification aspect.
Constable: In the semiconductor companies, we’re getting a lot more software-centric, too. It used to be just hardware, and we had a couple of software guys. Now we have just as many software guys as we do hardware guys. We definitely need to accelerate that and shift that schedule left, as much as possible.
SE: It seems like Intel’s software business has been growing by leaps and bounds. They’re putting a lot more resources into software development, software design.
Constable: This hardware/software co-design has always been a great idea, it’s always been on the table, and it’s been a PowerPoint idea, right? Everyone says, ‘Oh, yes, of course we should do this, and of course we should all work together.’ It’s only recently we’re getting to the point where our prototypes and our emulation can keep up with that, because we’ve done things at the higher level, with functional SystemC, which is good for certain algorithms and implementing algorithms on this top end. But actually breaking out the hardware, figuring out a real system solution, and making sure that your hardware system solution works requires the software guys to be involved early.
Stahl: And they don’t always want this. Because they have no motivation really to say, ‘Hey, I want to get involved early. I want to have more problems.’ Their attitude most of the time is, ‘Oh, I’ll wait for the silicon—if I’m allowed to wait for the silicon.’
Constable: Right, because it gets messy when you’re doing it in parallel, because you’re seeing the hardware bugs, the integration bugs, the builds. They would rather wait for the perfect thing. For the overall good of everyone, the common good of the schedule and profit, you have to do it as fast as you can, and that means everyone working in parallel as much as you can.
Kumar: One other trend is that it used to be that verification was the long pole. Now, software is the long pole. It just shows how much software is there. There is a lot more code in software nowadays, a lot more complexity, a lot more state space that needs to be covered.
Stahl: You mean, in terms of getting out to the tape-out?
Kumar: To the tape-out, to the final product.
Neifert: It’s amazing now how many products will ship with the software not done. I got my last TV, and it basically came with a note in it that said, ‘Don’t try and use the wireless, you have to use the wired.’ They shipped it out like that because they had a schedule they wanted to meet. Obviously, using the wireless is kind of an important aspect of TVs nowadays. They wanted to meet a schedule and they shipped it out without it done. Luckily, you can get the software to catch up with it later. Thankfully, there hasn’t been a big bug that got exposed. If there was, I’d have a TV I couldn’t use on my wireless network, right?
Kumar: Yes, it’s quite common now that the moment a person purchases a product, they have to upgrade it from day one, and then use it.
Constable: With the firmware and the product thing, that’s fine, you can update your firmware, but from the hardware side, if you have a hardware bug that breaks it, then you can’t really update it. So you have to make sure your software works with that hardware. There are a lot more standards now for software testing. When I talk to the guys that I deliver an emulation or a prototyping model to, they’re like, ‘Okay, I need a month for software testing,’ or ‘I need three months for standards testing and certain protocol testing.’ They have a much more stringent list of requirements that they have to go through. That just takes time. Even beyond the software development, they need these testing routines.
Stahl: Organizationally, the software guys are integrated into the flow now. They have to make sure to meet these deadlines, no matter how long it takes them.
Constable: That’s correct. That’s been a relatively recent development in the last five years.
Stahl: That’s a gigantic change for them. In the software world, they’re kind of independent: ‘I’ll fix it when I get to it.’
Constable: Right. The first time we go through that, it’s a pain point, because they have their schedule, that’s the normal schedule. We have to break it, to shift it left, and nobody has the resources to do that. So there’s some pain there to shift it left, but once you do that everyone sees the benefit of it—usually.
Neifert: I remember when I first started selling into this space from EDA, we would always try to sell to the hardware and the software teams. On our initial sales call, we’d go in and you’d see the hardware and software guys shaking hands, because they had never met each other before. Thankfully, you don’t see much of that anymore. It wasn’t all that difficult to find that even five, six years ago. I haven’t seen it in a while now, thankfully. Companies can’t develop like that anymore.
Constable: One of the things when you’re developing the software is that the software guys demand that we have this higher-level co-design/co-ware type of environment, where the hardware and software come together. But then when they get to the real silicon, they have to change their software a lot on the way down. There hasn’t been this easy flow where I’ve done this at a SystemC level and put it into hardware. It’s not that same link. They really need a model where they can put their real software on, and not change it at all, when the silicon actually comes. With the increases in FPGA size and capabilities, that’s really helped us in prototyping on a pure FPGA and prototyping from an emulation space, too, because they’re using those FPGAs in their emulators.