Applying verification in a smarter fashion in the new year.
In my prediction piece last year, I made seven trend predictions, and looking back, they held up very well against what actually happened. For 2018, I am cutting it down to five trends that will impact EDA, but in my mind many of them will be driven by the ever-evolving ecosystem of design chains from IP through semiconductor to systems and OEMs. While HBO's 'Game of Thrones' comes to a conclusion in 2018, the Game of Ecosystems unfolding in electronics and driving EDA requirements will be interesting to watch (get your popcorn ready) but is unlikely to reach a final conclusion. Timelines in electronics are simply a bit longer.
2017 was definitely the year in which the internet of things carried on beyond the hype; its implementation is now in full swing. We are discussing a future of a trillion devices connected through low-latency, high-bandwidth networks, intended to make our day-to-day lives easier across all areas, from health through transportation to consumer convenience. Some of these aspects will be visible at the February 2018 Winter Olympics in South Korea, and the next milestone will be the 2020 Summer Olympics in Tokyo. In this ever-more-connected world, verification plays a more important role than ever in keeping our lives reliable, safe and secure.
As a result, pretty much all of my predictions from last year came true. Deployment of parallel simulation is in full swing after our announcements in February, which also triggered a re-thinking of differentiation for FPGA-based prototypes and updated the messaging around them. Performance remains key, but time to prototype and ease of use for software developers are on everybody's minds, as predicted. Surprisingly, emulation saw only one new hardware announcement last year and, as predicted, differentiation is rapidly moving into what can be done with the hardware engine. One vendor calls it apps; we call it use models. It's all about the software stack on top of emulation and what it enables. The predicted horizontal integration of verification engines runs through the messaging of all three main vendors, and the vertical integration of flows moved into mainstream adoption for both performance and power, combining detailed implementation data like cycle-accurate processor models with more abstract simulation, as well as combining .lib-based, technology-driven power information with activity information derived from emulation. Application specificity definitely drove EDA flows in domains like automotive, server and aero-defense, and my last trend prediction, accelerated changes in the processor ecosystem, was very visible in 2017 with the public attention given to processor architectures like Tensilica and even open-source approaches like RISC-V.
That last trend becomes the key guiding principle in 2018; all the key trends can be derived from it. I have been writing about this for quite some time, in blogs like "Game of Eco Systems." While the source data I previously used came from IDS, this year Daniel Mandell and Chris Rommel from VDC conducted extensive research in "Ecosystems Built on Processor Architectures." It's a must-read on design chain dynamics driven by processor architectures. Figure 1 shows the main graph, courtesy of VDC Research.
From here, I derived my five key verification trends for EDA in 2018. The common theme is that they are all about applying verification in a smarter fashion.
It’s a great time to be in electronics and its enabling technologies like EDA! The amount of data that development teams are dealing with will push verification to new levels of intelligence including smarter debug, smarter execution and machine learning applied to the verification flow.