Design Chains Will Drive The Top 5 EDA Trends In 2018

Applying verification in a smarter fashion in the new year.


In my prediction piece last year, I made seven trend predictions. Looking back, I fared quite well compared to what actually happened. For 2018, I am cutting it down to five trends that will impact EDA, but in my mind many of these trends will be driven by the ever-evolving ecosystem of design chains from IP through semiconductor to systems and to OEMs. While HBO’s ‘Game of Thrones’ comes to a conclusion in 2018, the Game of Ecosystems that is unfolding in electronics and is driving EDA requirements will be interesting to watch—get your popcorn ready—but is unlikely to come to final conclusions. Timelines in electronics are simply a bit longer.

2017 was definitely the year in which the internet of things carried on beyond the hype, and its implementation is now in full swing. We are discussing a future of a trillion devices connected through low-latency high-bandwidth networks, intended to make our day-to-day lives easier across all areas—from health, through transportation, to consumer convenience. Some of these aspects will be visible at the February 2018 Winter Olympics in South Korea, and the next milestone will be the 2020 Summer Olympics in Tokyo. In this ever-more-connected world, verification more than ever plays a key role to keep our lives reliable, safe and secure.

As a result, pretty much all of my predictions from last year came true. Deployment of parallel simulation is in full swing after our announcements in February. They also triggered a re-thinking of differentiation for FPGA-based prototypes and an update of the associated messaging. Performance remains key, but the time to prototype and ease of use for software developers are on everybody’s mind, as predicted. Surprisingly, emulation saw only one new hardware announcement last year and, as predicted, the differentiation is rapidly moving into what can be done with the hardware engine. One vendor calls it apps, we call it use models. It’s all about the software stack on top of emulation and what it enables. The predicted horizontal integration of verification engines runs through the messaging of all three main vendors, and the vertical integration of flows moved into mainstream adoption both for performance and power—combining detailed implementation data like cycle-accurate processor models with more abstract simulation, as well as combining .lib-based, technology-driven power information with activity information derived from emulation. Application specificity definitely drove EDA flows in domains like automotive, server and aero-defense, and my last trend prediction—accelerated changes in the processor ecosystem—was very visible in 2017 with the public attention to processor architectures like Tensilica and even open-source approaches like RISC-V.

That last one becomes the key guiding principle in 2018; all the key trends can be derived from it. I have been writing about this for quite some time, in blogs like “Game of Eco Systems.” While the source data I used previously came from IDS, this year Daniel Mandell and Chris Rommel from VDC did extensive research in “Ecosystems Built on Processor Architectures.” It’s a must-read on design chain dynamics driven by processor architectures. Figure 1 shows the main graph, courtesy of VDC Research.

From here I derived my five key verification trends in 2018 for EDA. The common theme is that they are all about applying verification in a smarter fashion.

  • Security: As the IoT becomes real, the race to create security in our connected world will require smarter verification across the design chain. All aspects of the network need to be secure, so no portion of the design chain can introduce a weak link. The industry is rallying to combine software-based and hardware-based offerings to allow design teams to build secure systems. You can see this trend already starting with recent collaborations in this domain and Simon Segars’ call to action at Arm TechCon to put hackers out of business. The Security Manifesto is a great step forward.
  • Safety: Safety is rapidly becoming a key aspect not only in automotive but also in other industries, like aerospace and defense applications. Verification, and even certification of compliance to safety standards, will become even more important in 2018. A smart combination of formal, simulation and hardware-assisted techniques is required to sufficiently cover safety. Watch this space in 2018!
  • Application Specificity: Significant knowledge of application-specific requirements will be required to create smart verification flows that span application domains from mobile and server through networks, hubs and edge-node devices that interact with us consumers. We are already seeing a bifurcation between designs that can stay at less aggressive technology nodes for cost reasons and trailblazing designs in domains like machine learning, servers and AI that push the envelope toward the most advanced technology nodes. This trend will accelerate further in 2018 beyond the joint IoT, mobile and server flows we already presented in 2017. Stay tuned for new, refined and updated flows in 2018 that will be directed at specific application domains.
  • Processor Ecosystems: Verification needs to consider the processor architecture ecosystems it serves. All processor architectures, including new open-source variants and configurable processors, play in most application domains and require smart variations of verification to enable processor-specific verification. With a different balance of priorities in requirements, specific, optimized flows will emerge in 2018.
  • System Design Enablement: New levels of system design—way beyond the traditional EDA ESL (electronic system-level design)—will open up new markets for automation, leading to smarter verification at the system level, including software-based verification and even multi-abstraction and multi-domain execution of designs that combine the key core competence of the different participants in the design chains. Hybrid execution of abstract processor models with emulation, co-execution of abstract MATLAB models with verification and software development with simulation of mixed-signal circuitry are just the beginning. We will see new levels of co-execution that smartly combine electromechanical, software and electronic verification.

It’s a great time to be in electronics and its enabling technologies like EDA! The amount of data that development teams are dealing with will push verification to new levels of intelligence including smarter debug, smarter execution and machine learning applied to the verification flow.
