Does Hardware/Software Verification Have To Be Broad And Deep? Check Out DVCon 2017

As verification expands to cover more aspects of system development, application-specific demands drive interesting changes.


DVCon 2017 is upon us next week, and even though it is called the “Design and Verification” conference, it is reaching further and further up to the system level. One aspect of interest is how verification seems to be simultaneously becoming broader, covering more aspects to verify such as software, power, and performance, while also becoming deeper when it comes to application domains and their specific needs. Plus, there is a free lunch! I am looking forward to Tuesday’s lunch-time panel session on this topic.

Cadence has been spearheading the extension of EDA into system development with our direction towards System Design Enablement. As shown in the related picture, it extends well beyond verification and across application domains.

[Figure: System Design Enablement]

With hardware/software verification being a key part of the system integration portion of System Design Enablement, DVCon will be a good example of these trends. For instance, you can hear Samsung talk about “Emulation-Based Full-Chip-Level Low-Power Validation at Pre-Silicon Stage,” outlining how verification extends into low-power validation. You will also see many software-development-related aspects discussed and demonstrated. For example, our booth will show both emulation and FPGA-based prototyping. Yours truly will present in a Thursday-morning tutorial called “Reinventing SoC Verification – It Is About Time,” covering software enablement and verification in both emulation and FPGA-based prototyping. In that same tutorial we will also talk about performance verification, in the context of the Cadence Interconnect Workbench offering.

That’s how “broad” verification is becoming. Very broad.

In terms of depth, application specificity is driving interesting changes. Spanning most application domains, as indicated in the graphic above, the internet of things (IoT) changes requirements and prioritization for hardware/software development well beyond the classic considerations of performance, power, and cost. (I wrote about this topic a while back in my blog post “How The Internet of Things Drives More Diverse Design Considerations.”) While in the past power, performance, and cost were the main drivers helping design teams prioritize their designs, in the age of IoT they are joined by other priorities such as security, connectivity, and in-field upgradeability. As a result, verification is now increasingly organized into application-specific flows. We have flows for functional safety in automotive; at Cadence we already have flow variations for mobile, server, and networking; and even domains like aero/defense are now growing more application-specific verification flows.

Which brings me to the free lunch. We will discuss some of these aspects during a lunch panel that I helped organize and that Ed Sperling will moderate. The panel takes place on Tuesday, February 28, and is called “Application-Specific Verification from Edge Nodes Through Hubs, Networks, and Servers – Are the Requirements All the Same?” With panelists Jim Hogan, Intel’s Chris Lawless, and HPE’s David Lacey, I promise the discussion will be interesting and entertaining. Right after that, we will join you to listen to Tuesday’s keynote, given by our own SVP and GM Anirudh Devgan and called “Tomorrow’s Verification Today.” Anirudh will “review the latest trends that are redefining verification from IP to system level, with an increasingly application-specific set of demands changing the landscape for hardware and software development.”

So with verification growing both in breadth and depth, DVCon will be a great gathering to discuss verification as part of System Design Enablement. I am looking forward to seeing you there!