
Applications, Ecosystems And System Complexity Will Be Key Verification Drivers For 2020

Look for a resurgence of classic system design in the coming year.


In my predictions blog last year, I focused on verification throughput and its expected growth in 2019. The four areas in which I predicted growth were scalable performance, unbound capacity including cloud enablement, smart bug hunting and multi-level abstractions. In 2018, the five key verification drivers I identified were security, safety, application specificity, processor ecosystems and system design enablement. All these areas will still be relevant in the coming year, but the key items on my mind for 2020 are application specificity, the power of ecosystems and a resurgence of classic system design.

Several interesting predictions are already available for specific applications as well as for the semiconductor IP market, and the need for verification runs through them. In “IP Status 2010-2019—What’s Next for 2020-2030?” Eric Esteve from IPnest talks about the roaring 2020s in the semiconductor IP world requiring new IP to enable data center growth, which displaces smartphones as the major driver and points to AI, automotive and industrial IoT as the new key drivers. For CPU, GPU and DSP, addressed under the heading “what’s after the peak?”, the RISC-V architecture is identified as promising, with 102 licenses signed at SiFive in the last 18 months. The “catastrophic” consolidation for display GPUs (Apple, Samsung/AMD) is counterbalanced by the upcoming GPUs for AI.

Application predictions like “Dawn of a new decade: Looking at 2020 and beyond” and “Arm’s 2020 predictions” point to the transformational changes that 5G will enable—smart(er) cities, data-driven healthcare, gamifying technology and ambient computing—as well as unique changes in IoT, AI, security and XR, spanning virtual and augmented reality (VR/AR).

All this makes it a perfect time to focus more on verification! As these challenges continue to grow, here are the top three on my mind for 2020.

First, application specificity will become even more crucial than it already is, and verification will cause teams to take a much broader view than in the past. Consider 5G networks, where data rates are increasing and so is the number of protocols. Teams are adding more and more antennas, and as a result, the need to verify across the RF, analog, mixed-signal and pure digital domains is growing. Pure divide and conquer, trying to separate issues, will become more and more difficult. Integrated application-specific solutions that consider verification challenges more holistically across domains will be required to beat the competition. Beyond 5G, consider automotive applications, where ISO 26262 and safety issues require specific flows. Aero/defense has compliance to take into consideration as well and is also driven by sheer system complexity. Mobile and IoT still have very specific low-power needs and must also consider mixed-signal design. As I wrote a while back in “How The Internet Of Things Drives More Diverse Design Considerations,” different application domains prioritize design constraints differently.

Second, ecosystems will gain even more importance. They can be centered around cornerstones like Arm servers, with the ecosystem working through issues like SBSA compliance, which ideally is verified pre-silicon using compliance suites built from self-checking, portable C-based tests (sketched after the figure below). Ecosystems can also rally around open-source architectures like RISC-V. The main technical issue in the RISC-V domain is not that it is open source, but that RISC-V derivatives and modifications require designs to be re-verified once they have been changed. 2020 may be the year in which users realize how big a verification challenge this is, given the freedom end users have to modify the design: every change an end user makes must be verified. For EDA, the solutions stack that allows integration of processors into the SoC covers software bring-up, use-case testing, profiling, debug, integration, performance analysis and protocol verification, as indicated in the figure below. Most likely, it will be driven by demand and updated for new ecosystems, so watch this space!


Figure: Solutions stack for processor integration into systems-on-chip
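To make the “self-checking, portable C-based test” idea concrete, here is a minimal sketch. It is not taken from the actual Arm SBSA compliance suite; the register address, the expected ID value and the test name are hypothetical placeholders, and a real suite would hide platform access behind a portability layer. Because each test checks its own result, the same binary can in principle run unchanged in RTL simulation, emulation, FPGA prototyping and on first silicon.

```c
/*
 * Minimal sketch of a self-checking, portable, bare-metal C test.
 * NOT the actual SBSA compliance suite; the register address and
 * expected ID value below are hypothetical placeholders.
 */
#include <stdint.h>

#define TEST_PASS 0
#define TEST_FAIL 1

/* Hypothetical memory-mapped ID register of a system component. */
#define SYS_ID_REG   ((volatile uint32_t *)0x40000000u)
#define SYS_ID_MAGIC 0xC0DE0001u

/* The test checks its own result and reports pass/fail, so no
 * simulator-specific checking infrastructure is required. */
static int test_system_id(void)
{
    uint32_t id = *SYS_ID_REG;  /* read the component ID register */
    return (id == SYS_ID_MAGIC) ? TEST_PASS : TEST_FAIL;
}

int main(void)
{
    /* A real suite would iterate over many such checks and log
     * results through a platform-abstracted print layer; this
     * sketch just returns the single test's status. */
    return test_system_id();
}
```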

Finally, due to increasing system complexity, classic electronic system-level (ESL) design technologies will see a resurgence. As a recovering ESL-holic, as I have admitted before, I am fascinated to see the changes to come that allow users to do what we used to call “hardware/software co-design” in the ’90s. Adding instructions to processors and trading off their impact against hardware needs is best done with tool assistance (a toy example follows below). The classic dilemma of “as an architect, I would like the models to be fast, early and accurate” is still as over-constrained as it was in the ’90s. But these days, “brute force” (i.e., faster simulation) and emulation allow faster execution at higher fidelity than 20 years ago. It will be interesting to see how this dilemma, which I wrote about in “Will Top-Down Hardware/Software Co-Design Ever Happen?”, will play out this time around. With systems getting more and more complex, system design needs to become much more intelligent to handle them. At Cadence, “Intelligent System Design” has become an integral part of our strategy.
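As a toy illustration of that hardware/software trade-off, consider something as small as a population count: should it be a dedicated processor instruction or a software loop? The sketch below contrasts the two paths; the cost question it raises is exactly what co-design tooling, profiling both options on representative workloads, is meant to answer. Using a compiler builtin as the stand-in for the “hardware-assisted” path is an assumption for illustration, not a specific processor flow.

```c
/*
 * Toy hardware/software co-design trade-off: population count as a
 * software loop versus a (hypothetical) dedicated instruction,
 * stood in for here by a compiler builtin.
 */
#include <stdint.h>
#include <stdio.h>

/* Software baseline: roughly one loop iteration per set bit. */
static unsigned popcount_sw(uint32_t x)
{
    unsigned n = 0;
    while (x) {
        x &= x - 1;  /* clear the lowest set bit */
        n++;
    }
    return n;
}

int main(void)
{
    uint32_t word = 0xDEADBEEFu;

    /* Architect's question: is a dedicated popcount instruction
     * worth its gate count, or is the software loop fast enough?
     * A co-design flow would profile both on real workloads. */
#if defined(__GNUC__)
    /* Maps to a single instruction where the target ISA has one. */
    printf("hw-assisted: %u\n", (unsigned)__builtin_popcount(word));
#endif
    printf("software:    %u\n", popcount_sw(word));
    return 0;
}
```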

Lots of fun ahead next year! Happy Holidays.


