How accurate were predictions about interoperable models and designing in the cloud?
This marks the 10th DAC that I have covered as a blogger. At DAC 2008 in Anaheim, the industry had just come together behind the SystemC TLM-2.0 standard to enable virtual platforms, finally achieving model interoperability. System design is the common thread that carries through to this year's DAC 2018 in San Francisco. But a lot has changed. Big data analytics, artificial intelligence, and machine learning for EDA were not even visible on the horizon 10 years ago. Cloud was a topic back then, but the view on its use in EDA was a lot foggier than it is today (pun intended!).
The past is an important data point for the future, so I often go back a decade (or two) to see how accurate our thoughts about the future were, which items came out of left field, and which are new. The introduction of the TLM-2.0 standard was a big deal ten years ago. The new APIs for direct memory access and the quantum-based simulation execution, allowing temporal decoupling of models, moved software development on virtual platforms enabled by the EDA industry to the next level. During the decade prior, the "three V's" (Virtio, VaST, and Virtutech), combined with ARM/AXYS and CoWare, had basically saturated the early adopters of virtual platforms. What had been hindering further proliferation in the market was suddenly no longer an obstacle: the interoperability of models. At the time, I likened the introduction of TLM-2.0 to the standardization of Verilog and VHDL, which eventually led to the demise of proprietary HDLs, with the following picture:
I suggested that, with the introduction of TLM-2.0, the proprietary APIs for fast virtual platform development would eventually be replaced with SystemC TLM-2.0.
Ten years on, this has simply happened. SystemC TLM-2.0 has become the backbone for virtual platforms, and a lot of high-level synthesis is based on SystemC as well. Meanwhile, we have added more capabilities enabling interoperability for "configuration, control and inspection" (CCI) of registers, with a new Language Reference Manual released just this month. This is not to say that the core of the models needs to be SystemC; it is about the interfaces that allow them to interact. We have users today happily integrating Fast Models from Arm, or models developed using the OVP APIs (as Imperas is showing this year for RISC-V), with IP provider-supplied models of interface IP like PCIe and USB and, of course, their own home-grown models. SystemC has enabled that, proprietary modeling APIs are a thing of the past, and this year's DAC confirms it. Looking at the list of presentations my team and I assembled this year (see the end of this blog), Arm and Vayavya Labs are talking about SystemC-interoperable models that integrate with emulation.
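For readers who have not looked under the hood of the standard recently, here is a minimal sketch of what that interoperability layer looks like from an initiator's point of view: a loosely-timed TLM-2.0 master using the generic payload, the direct memory interface (DMI), and the quantum keeper mentioned above. The module name, address range, and quantum value are illustrative assumptions, not taken from any particular vendor model.

```cpp
// Minimal loosely-timed TLM-2.0 initiator sketch: generic payload + DMI + quantum keeper.
#include <cstring>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/tlm_quantumkeeper.h>

struct CpuModel : sc_core::sc_module {            // illustrative name, not a vendor model
  tlm_utils::simple_initiator_socket<CpuModel> socket;  // standard interoperable socket
  tlm_utils::tlm_quantumkeeper qk;                       // local-time bookkeeping

  SC_CTOR(CpuModel) : socket("socket") {
    // Every initiator may run up to one quantum ahead of simulated time before
    // synchronizing; that temporal decoupling is what makes these models fast.
    tlm_utils::tlm_quantumkeeper::set_global_quantum(
        sc_core::sc_time(1, sc_core::SC_US));
    qk.reset();
    SC_THREAD(run);
  }

  void run() {
    tlm::tlm_generic_payload trans;
    tlm::tlm_dmi dmi;
    unsigned char buf[4];

    for (sc_dt::uint64 addr = 0; addr < 0x1000; addr += 4) {
      trans.set_command(tlm::TLM_READ_COMMAND);
      trans.set_address(addr);
      trans.set_data_ptr(buf);
      trans.set_data_length(4);
      trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);

      // Fast path: a DMI pointer lets us read target memory directly, bypassing
      // b_transport. A real model would cache the granted region instead of
      // re-querying on every access.
      if (socket->get_direct_mem_ptr(trans, dmi) &&
          addr >= dmi.get_start_address() && addr <= dmi.get_end_address()) {
        std::memcpy(buf, dmi.get_dmi_ptr() + (addr - dmi.get_start_address()), 4);
        qk.inc(dmi.get_read_latency());
      } else {
        sc_core::sc_time delay = qk.get_local_time();
        socket->b_transport(trans, delay);           // regular blocking transport
        sc_assert(!trans.is_response_error());
        qk.set(delay);
      }
      if (qk.need_sync()) qk.sync();  // yield to the kernel only at quantum boundaries
    }
  }
};
```

Because every model speaks this same socket and generic-payload protocol, an Arm Fast Model, an OVP-based core, an IP vendor's interface model, and a home-grown peripheral can all sit behind sockets of this kind without proprietary adapters.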
Another area has just benefited from standardization: portable stimulus (PSS). John Cooley pointed out in his troublemaker panel that, now that a standard based on Cadence and Mentor contributions has been chosen, Breker has "to re-write a lot of their code" to become standards-compliant. Qualcomm presented on PSS at the Cadence DAC Theatre to give more insight into their usage of portable stimulus. We are now entering the next era here, in which the tools can compete on capabilities like constraint-solving ability.
The panel that Semiconductor Engineering's Ann Steffora Mutschler moderated on DAC Monday (see picture above), called "Smarter and Faster Verification in the Era of Machine Learning, AI, and Big Data Analytics," focused on the two other items I mentioned above: machine learning and the cloud. This topic deserves its own write-up, but the discussion was quite fascinating. The core verification engines (formal, simulation, emulation, and FPGA-based prototyping) produce an enormous amount of data. As Paul Cunningham pointed out, EDA has only just learned how to actually deal with that vastness of data and has hired expertise from outside of EDA to do so. This is a great time to be a data scientist; you will be in demand in EDA and other industries. We are just starting to collect this data, as David Lacey from Hewlett Packard Enterprise described in the panel, and the next step is to make sense of it and use it to improve verification productivity. All this data needs to be stored somewhere, and Shigeo "Jeff" Ohshima from Toshiba Memory Corporation presented on trends in the flash storage market, including its verification aspects.
The cloud announcements at this DAC brought a complete set of new exhibitors, like Google, Amazon, and Microsoft. In a sense, Jim Hogan already set up the topic for next year's panel when he stated that this year is about data collection and early existing techniques, like merging coverage data from different engines; next year will be about AI and machine learning really optimizing verification, and a lot of it will happen in the cloud.
Below is a list of system design and verification activities from this DAC. Check back at our DAC web page, where the presentations will be posted.
Oski Technology, Architectural Formal Verification of a Coherency Manager – an NVIDIA Case Study
Panel: Smarter and Faster Verification in the Era of Machine Learning, AI, and Big Data Analytics
Moderator: Ann Steffora Mutschler, Semiconductor Engineering
Panelists: Jim Hogan, Vista Ventures; David Lacey, Hewlett Packard Enterprise; Shigeo Ohshima, Toshiba Memory Corporation; and Paul Cunningham, GM, Cadence
Arm – Combining Virtual and Physical Worlds – Processor Models and Software Debug with Emulation and Prototyping
Microsemi – Pre-Silicon SW/FW Testing with Protium S1 Platform: A Case Study
Samsung – Modem UVM Acceleration with Palladium Z1 Platform
Qualcomm, PSS – A Disruptive Technology for Stimulus Reusability
NetSpeed, Architecting and Verifying Your Next-Generation Coherent SoC Design with NetSpeed and Cadence
Microsemi, Debug Use Cases with Virtual Debug and Palladium Debug Advantages
Arm, Configuring and Implementing a High-Performance Arm Infrastructure System
Vayavya Labs, Enabling Virtual Platform/Emulation Hybrids using Efficient TLM Modeling
Cadence, Xcelium Parallel Simulation
Tortuga Logic, Enhancing Your Existing Verification Suite to Perform System-Level Security Verification