
Will Top-Down Hardware/Software Co-Design Ever Happen?

How much further left can we shift?


Hardware/software co-design has been talked about, and predicted to be a problem, for at least two decades now. So why has the hardware/software development world not come to an end? In 1999, Wilf Corrigan—LSI Logic’s CEO at the time—said that the most pressing need for new EDA tools was a better methodology that would “allow software developers to begin software verification more near the same time that chip developers begin to verify the hardware.” While it would be nice to claim that EDA has simply provided a solution for this issue, it is more likely a combination of three things: improved raw performance of the EDA core engines, smarter combinations of those engines, and new hardware abstraction layers and programming models that may minimize the need for top-down hardware/software co-design.

It’s January. It’s time for my annual trip to the garage to look at predictions from 5, 10 and 20 years back. (Oops. The garage has been remodeled and is a gym now.) The IEEE Spectrum issues that I reviewed 10 and 5 years ago, respectively, in “Back to the Future” and “10 Years Later—Will Project Delays Stop Faster Technology Innovation?”, have been replaced with online archives. Times change.

What strikes me first when reviewing articles from that time is: Really, have we achieved so little over the last 20 years? Wilf Corrigan’s comment from 20 years ago could have been made by somebody last month in a December 2018 preview. The January 1999 EDA forecast article highlights the topics of designs moving to higher levels of abstraction, how design reuse ramps up slowly, and how hardware and software “join hands early on.” Ah, yes, we had the Virtual Socket Interface Alliance in full swing at that time, standardizing formats to enable IP reuse. The January 1999 forecast on software engineering deals with Java and Sun’s efforts to keep control of it. How interesting that, 20 years on, Oracle stops free updates and starts charging for Java this month. Internet web standards were an issue in 1999, and so, of course, was the alleged Y2K problem that would have had airplanes crashing in the desert when the clock turned over to the year 2000. What a consulting feast 1999 was for experts who had specialized in it.

I am starting to feel a bit relieved that the last 20 years didn’t fly by without progress. Concurrent hardware/software design is a topic that EDA actively drives, albeit somewhat differently than we thought it would 20 years ago. At the time, the article claimed that “…a concurrent approach is in growing demand” and that “co-design and co-verification tools bridge the two development environments, so that interface errors between hardware and software are detected and the functionality of both segments is verified earlier in the design cycle.”

While there still is no obvious tool in which all hardware/software co-design aspects are modeled top-down, then analyzed and decided upon, verification has made huge progress. Interface standards like IP-XACT have found adoption, and virtual prototyping is mainstream, albeit not always abstracting everything in a system or system-on-chip to the transaction level, but making use of the raw performance improvements that simulation, emulation and FPGA-based prototyping provide—as I described in last month’s forecast article “Verification Throughput Is Set to Increase by Leaps and Bounds in 2019.” The EDA industry has coined the term “shift left” for this ability to virtualize hardware and allow continuous integration of hardware and software as early as possible during a project. I wrote about this in “The Great Shift To The Left” a while back.

So how much further left can we shift?

There is room to start even earlier. It may not even be EDA that solves that problem. iOS and Android development kits work very well without any connection to EDA, and they form a hardware abstraction layer that has enabled hundreds of thousands of software developers to create apps and develop software independently of the hardware. A layer of hardware-aware software is still necessary, but the number of software developers dealing with it is much smaller. And once that layer is enabled, middleware like OpenGL can be verified early, as NVIDIA has shown in various examples, on top of hybrid combinations of virtual platforms and emulation, for instance.
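To make the layering concrete, here is a minimal, hypothetical sketch in C of such a hardware abstraction layer: the many application developers program against a stable interface, while a much smaller group of hardware-aware developers implements it for each SoC. The names (hal_display_ops, soc_xyz_*) are illustrative only and not taken from any real SDK.

/* Hypothetical sketch of the layering described above: application code is
 * written against a stable hardware abstraction layer (HAL), while a much
 * smaller group of hardware-aware developers implements that HAL per SoC.
 * Names (hal_display_ops, soc_xyz_*) are illustrative, not a real API. */
#include <stdint.h>
#include <stdio.h>

/* Portable interface seen by the many application developers. */
typedef struct {
    int (*init)(void);
    int (*draw_pixel)(uint32_t x, uint32_t y, uint32_t rgba);
} hal_display_ops;

/* Hardware-aware implementation, maintained by the few who know the SoC.
 * Here it only logs; a real one would program registers or call a driver. */
static int soc_xyz_init(void) { puts("display: init"); return 0; }
static int soc_xyz_draw_pixel(uint32_t x, uint32_t y, uint32_t rgba) {
    printf("display: pixel (%u,%u) = 0x%08x\n", x, y, rgba);
    return 0;
}

static const hal_display_ops display = { soc_xyz_init, soc_xyz_draw_pixel };

int main(void) {
    /* The application never touches hardware details directly. */
    display.init();
    display.draw_pixel(10, 20, 0xFF00FFFFu);
    return 0;
}

Swapping the soc_xyz_* functions for another implementation, or for a virtual-platform model of the display, leaves the application code untouched, which is exactly what lets software development start long before silicon is available.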

And, coincidentally, I recently contributed to Brian Bailey’s article “Taming Concurrency,” in which we discussed not only the role of portable stimulus using software for verification, but also OpenCL, CUDA and OpenMP, which are “extensions of existing languages with a purpose,” such as using an FPGA as an accelerator for specific functions. It looks to me like they bring us closer to that next level of design entry that was discussed 20 years ago, at which analysis and optimization between hardware and software can happen.
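As a rough illustration of such a “purposeful extension,” here is a minimal sketch using OpenMP target directives on plain C. It assumes an OpenMP 4.0+ compiler with an accelerator target configured (a GPU, or an FPGA via a vendor flow); if no device is available, the loop simply runs on the host.

/* Standard C, extended "with a purpose": the pragma below asks the compiler
 * to map the arrays to an accelerator and run the loop there if a device is
 * present, falling back to the host otherwise. */
#include <stdio.h>

#define N 1024

int main(void) {
    static float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) {
        a[i] = (float)i;
        b[i] = 2.0f * (float)i;
    }

    /* The purposeful extension: data mapping and offload in one directive. */
    #pragma omp target teams distribute parallel for map(to: a, b) map(from: c)
    for (int i = 0; i < N; i++) {
        c[i] = a[i] + b[i];
    }

    printf("c[10] = %f\n", c[10]);
    return 0;
}

The algorithm stays in the host language, while the directive captures the hardware/software partitioning decision; that combination is what makes these extensions feel like a step toward the higher-level design entry point discussed 20 years ago.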

After being initially skeptical about progress made in the last 20 years, I must conclude that we have come a long way. And there is more to come. Lots of it. I can’t wait to write my January look-back articles 5 and 10 years from now. The path will be exciting.


