Top 5 Trends For 2014

Smarter and faster electronics system design and verification lead the list.


The traditional yearly cookie baking party with my daughter last weekend reminded me of two things: there is still no easy recipe for system design and verification and, of course, the year is almost over again. Ouch.

Let’s look back at 2013 first. Earlier this year we held a System to Silicon Verification Summit in San Jose, with an interesting technical keynote by Brian Bailey and customers Zenverge, NVIDIA and Broadcom outlining how they tackle their system design and verification challenges. It was very telling that several of the customer presenters actually represented the software teams of those companies. The summit was also a great showcase of two key improvements we have made to system design and verification: making it faster and making it smarter. This is very much in line with some of the predictions I made last year.

For one, customers showed how Palladium XP II hardware-assisted verification helped them reach their verification targets faster. They also described how they applied the same tools they use for hardware verification to software development and verification as well, enabling them to interact with their customers up to six months earlier.

In addition, customers showed how smarter approaches brought them to their intended verification goals faster. Two examples were embedded testbenches, which execute in hardware-assisted verification and virtualize the chip environment, and hybrid approaches that combine RTL execution with transaction-level execution in smart ways, allowing faster bring-up of operating systems (OSes) and accelerated execution of tests once the OS is booted.
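
To make the hybrid idea a bit more concrete, here is a minimal, self-contained C++ sketch of the transactor concept that sits between the transaction-level and signal-level worlds. Everything in it (MemTransaction, RtlBusPins, Transactor) is invented for illustration and is not an actual tool API; a real transactor would drive an emulator or RTL simulator rather than printing to the console.

```cpp
// Illustrative sketch only: the transactor concept behind hybrid
// TLM/RTL execution. All names and interfaces are invented for this
// example and do not correspond to any real tool API.
#include <cstdint>
#include <iostream>

// Transaction-level view: one call describes a whole bus operation.
struct MemTransaction {
    bool     is_write;
    uint32_t address;
    uint32_t data;
};

// Signal-level view: the cycle-by-cycle pins an RTL model would see.
struct RtlBusPins {
    bool     valid = false;
    bool     write = false;
    uint32_t addr  = 0;
    uint32_t wdata = 0;
};

// The transactor bridges the two worlds: it expands one abstract
// transaction into the pin activity the RTL side consumes.
class Transactor {
public:
    void execute(const MemTransaction& t, RtlBusPins& pins) {
        // Cycle 1: drive the request onto the pins.
        pins.valid = true;
        pins.write = t.is_write;
        pins.addr  = t.address;
        pins.wdata = t.data;
        clock_rtl(pins);   // RTL side samples the request
        // Cycle 2: release the bus.
        pins.valid = false;
        clock_rtl(pins);
    }
private:
    void clock_rtl(const RtlBusPins& pins) {
        // Stand-in for one clock of the RTL model (simulator or emulator).
        if (pins.valid)
            std::cout << (pins.write ? "WR 0x" : "RD 0x")
                      << std::hex << pins.addr << "\n";
    }
};

int main() {
    Transactor xact;
    RtlBusPins pins;
    xact.execute({true, 0x1000, 42}, pins);   // one fast TLM-style call ...
    xact.execute({false, 0x1000, 0}, pins);   // ... becomes pin wiggles
}
```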

The following figure summarizes the trends.

[Figure: Faster and Smarter System Design and Verification]

Traditional RTL execution in simulation is accelerated with hardware (emulation, FPGA), and advanced techniques such as advanced debug, verification IP, metric-driven verification and testbench automation augment both types of execution. Qualifying as both smarter and faster are hybrid combinations of RTL execution with more abstract representations using transaction-level models (TLM), as well as the use of software as an actual instrument for hardware and hardware/software verification. Earlier this year we coined the term “Software Driven EDA,” and that is definitely a key trend.
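
To illustrate what software as a verification instrument looks like at its lowest level, here is a hedged C++ sketch of “test software”: a bare-metal-style smoke test that programs a hypothetical memory-mapped DMA block. The register layout, addresses and expected values are all invented for this example; on real silicon or an emulator they would come from the chip’s register map, and the fake device array stands in for actual hardware so the sketch runs on a host.

```cpp
// Illustrative bare-metal-style test: "test software" that exercises
// a hypothetical memory-mapped DMA block. Register layout and
// addresses are invented for this sketch.
#include <cstdint>
#include <cstdio>

// In this hosted demo a plain array fakes the device; on target,
// REGS would point at the block's real base address.
static uint32_t fake_device[4];
static volatile uint32_t* const REGS = fake_device;

enum { REG_SRC = 0, REG_DST = 1, REG_LEN = 2, REG_CTRL = 3 };

bool dma_smoke_test() {
    REGS[REG_SRC]  = 0x80000000u;   // program a transfer
    REGS[REG_DST]  = 0x80010000u;
    REGS[REG_LEN]  = 256;
    REGS[REG_CTRL] = 1;             // kick it off
    // A real test would poll a status bit; reading back what was
    // written captures the essence of a register smoke test.
    return REGS[REG_SRC] == 0x80000000u && REGS[REG_LEN] == 256;
}

int main() {
    std::printf("dma_smoke_test: %s\n", dma_smoke_test() ? "PASS" : "FAIL");
}
```

The point of the example is the workflow, not the code: the same binary can run against an emulated chip months before silicon and against the bring-up board afterwards, which is exactly what makes software such a useful verification instrument.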

So, for 2014, what will be the key trends to look out for?

  • Software Driving Functionality and Verification: At this point nobody will disagree that software has become crucial for system and chip design. The interesting trend that will get significantly stronger in 2014 is the use of software not only to define the actual product functionality, but also to verify it. Increasingly, there will be two types of software: production software, which actually executes on the product in the hands of the end user, and test software, which verifies the chip hardware itself but never leaves the developer’s desk.
  • Use Model Versatility to Increase ROI: All the engines, from virtual platforms at the TLM level to the three core engines executing RTL (simulation, emulation and FPGA-based prototyping), cannot be “one-trick ponies” anymore. Besides the actual execution of the design, they will have to enable a larger number of use models. For instance, Palladium XP II emulation does not just execute the design itself to verify its functionality. It also enables many other use models, such as regression runs, dynamic low-power analysis, software development and validation for drivers, OS bring-up and middleware, performance validation, post-silicon debug, test pattern preparation, failure reproduction and analysis, and virtual silicon support, which makes the “chip to be” available to customers for early access, among several more. Furthermore, the engines themselves need to be able to interoperate, for example, enabling hybrid use models of RTL simulation and RTL hardware acceleration (called simulation acceleration) and hybrids of RTL hardware acceleration and TLM simulation, as described by NVIDIA and Broadcom at our System to Silicon Verification Summit.
  • System-Level Design and Verification Blurring: Having been in the traditional electronic system-level (ESL) space myself for quite some time, I find it fascinating that abstraction works in some cases and not in others. For example, 2013 has clearly shown that decisions about complex on-chip interconnect are likely best made using RTL, not TLMs. However, there are simply not enough verification cycles out there to verify everything using RTL, forcing a shift. Whenever abstraction is combined with automated synthesis, with the result checked later by equivalence checking, even verification can move from RTL to TLM. We have several customers actively driving toward flows in which they verify much more at higher levels of abstraction and then automate the implementation from there.
  • Proper Representation of the System Environment: At whatever scope design and verification happens (blocks, subsystems, systems on chip or full systems), the representation of the environment in which the design under verification resides requires very careful consideration. Users demand a spectrum of solutions, from actual hardware test equipment connected to FPGA-based prototyping and emulation, to virtualization of the system environment that allows efficient test generation and analysis for protocols such as Ethernet and USB; a small sketch of the virtualization idea follows this list. That includes mixed scenarios in which emulators like Palladium XP II are connected to the system environment through rate adapters for some peripherals, while others are represented as synthesizable testbenches.
  • Automation of Testbench Creation: There are simply not enough verification cycles around to verify all aspects of designs. Verification was, is, and always will remain an unbounded problem. On the flip side, a major bottleneck is creating all the verification content, the actual testbenches themselves. 2014 has the potential to become the beginning of a new era in which more mainstream users start automating testbenches at the chip and system level based on constraints; a minimal sketch of that idea closes out this list. Automating the creation of scenarios that verify the content of chips and systems on chips will likely be a new growth area for system verification, and it has the potential to impact the system design aspects as well.
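
Picking up the system-environment bullet above, the following minimal C++ sketch shows the virtualized-traffic idea: a software generator standing in for physical Ethernet test equipment. The frame layout is simplified and every name is invented for illustration; in a real hybrid setup the frames would be pushed through a transactor into the emulated design rather than just counted.

```cpp
// Illustrative sketch of a virtualized system environment: a software
// traffic generator standing in for physical Ethernet test equipment.
// Frame layout is simplified; all names are invented for this example.
#include <array>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct EthFrame {
    std::array<uint8_t, 6> dst;
    std::array<uint8_t, 6> src;
    uint16_t ethertype;
    std::vector<uint8_t> payload;
};

// Generate a burst of minimal frames aimed at the design under test.
std::vector<EthFrame> generate_burst(std::size_t count) {
    std::vector<EthFrame> burst;
    for (std::size_t i = 0; i < count; ++i) {
        EthFrame f;
        f.dst = {0x02, 0, 0, 0, 0, 0x01};   // locally administered MACs
        f.src = {0x02, 0, 0, 0, 0, 0x02};
        f.ethertype = 0x0800;               // IPv4
        f.payload.assign(46, static_cast<uint8_t>(i)); // minimum payload
        burst.push_back(f);
    }
    return burst;
}

int main() {
    // In a hybrid setup these frames would feed the emulated MAC
    // through a transactor; here we only count them.
    auto burst = generate_burst(8);
    std::printf("generated %zu frames\n", burst.size());
}
```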

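Picking up the testbench-automation bullet, here is a minimal C++ sketch of constraint-based scenario generation, using simple rejection sampling so it stays self-contained. The scenario fields and the constraints are invented for illustration; production testbench-automation tools use real constraint solvers rather than retry loops.

```cpp
// Illustrative sketch of constraint-based scenario generation: random
// scenarios are drawn and rejected until they satisfy the constraints.
// Fields and constraints are invented; real tools solve constraints
// far more cleverly than this retry loop.
#include <cstdint>
#include <cstdio>
#include <random>

struct Scenario {
    uint32_t burst_len;   // number of bus transfers
    uint32_t address;     // start address, word aligned
    bool     cacheable;
};

Scenario random_scenario(std::mt19937& rng) {
    std::uniform_int_distribution<uint32_t> len(1, 64);
    std::uniform_int_distribution<uint32_t> addr(0, 0xFFFF);
    std::bernoulli_distribution flag(0.5);
    for (;;) {
        Scenario s{len(rng), addr(rng) * 4, flag(rng)};
        // Constraints: cacheable bursts must be power-of-two length,
        // and no burst may cross a (hypothetical) 1 KB boundary.
        bool pow2    = (s.burst_len & (s.burst_len - 1)) == 0;
        bool same_1k = (s.address / 1024) ==
                       ((s.address + s.burst_len * 4 - 1) / 1024);
        if ((!s.cacheable || pow2) && same_1k)
            return s;   // constraints met: a legal scenario
    }
}

int main() {
    std::mt19937 rng(1);   // fixed seed for reproducible regressions
    for (int i = 0; i < 4; ++i) {
        Scenario s = random_scenario(rng);
        std::printf("len=%u addr=0x%05x cacheable=%d\n",
                    s.burst_len, s.address, s.cacheable);
    }
}
```
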
That’s it, folks. Happy Holidays and have a great start to 2014!


