Systems & Design
SPONSOR BLOG

Electronic System-Level Design: Are We There Yet?

After 20 years, ESL concepts have been robustly adopted.


I am writing this while attending NI Week in Austin and am admittedly wowed by National Instruments’ open test platform. NI Week’s theme is “Developing the Future Faster.” The Tuesday keynote included guest speakers from Mazda, Honeywell, and NXP, and these were great examples of system-level design of different scopes, from cars to distributed systems to chips enabling them. Personally, I am here to present at the Test Leadership Forum about some of the activities that I wrote about recently in my “It Takes a Village” blog, and thoughts of 20 years of electronic system-level (ESL) design in EDA run through my mind. Are we there yet? Let’s take a look.

Electronic system-level design in EDA was introduced by the late Gary Smith in 1997. We are now two decades in. Gary structured ESL into four main areas: architecture, hardware design, verification, and software design. I think they still roughly apply.

The architecture portion of ESL is intentionally independent of hardware and software and reigns above those implementation details. I often joked with Gary that between me and my R&D lead, we personally know all 117 architects worldwide. Overstated, yes, but true at its core. The architecture space was, is, and always will be a relatively small one, with users who use every tool in their arsenal, Excel probably being the most used. Some valuable sub-areas certainly have emerged. Gary counted The Mathworks and CoFluent in this space; the latter was acquired by Intel. National Instruments, The Mathworks, and ANSYS are the main players in the classic system modeling space. A bit further down in abstraction, cycle-accurate SystemC- or RTL-based analysis of hardware architectures is well adopted; Cadence Interconnect Workbench, Mentor with the results of their Summit acquisition, and Synopsys Platform Architect play here.

The classic architecture analysis dilemma remains an issue: decisions must be made as early as possible to be effective, yet to make effective architecture decisions, architects would like the accuracy of models that are only available once the implementation is decided. As a result, there is a clear bifurcation in the architecture space, with very abstract models used pre-implementation on the one hand (using languages like The Mathworks M or graphical definitions as in National Instruments LabVIEW) and cycle-accurate representations in RTL or SystemC on the other. In fact, a lot of architecture decisions for interconnect are made at the RT level—either with models automatically derived from RTL (Arm Cycle Models) or using plain RTL simulation. Luckily, the creation of the RTL and SystemC models is also automated by Arm, Sonics, and Arteris for their respective interconnects; otherwise, the creation of models would take prohibitively long.

In the ESL area of hardware design, the main areas are pre-RTL simulation and “classic” high-level synthesis. Pre-RTL simulation has pretty much standardized on SystemC now. One could argue that this has simply become an outgrowth of RTL simulation—all major vendors combine SystemC with Verilog, SystemVerilog, and VHDL in natively integrated simulators. That blurs the boundaries somewhat, as RTL developers are simply integrating SystemC or C with their RTL models from the bottom up. Classic high-level synthesis is widely adopted at the block level. Xilinx has integrated it into its Vivado tool suite as a result of its acquisition of AutoESL and is promoting OpenCL, a programming model independent of hardware and software, as design entry. Catapult left Mentor for a time and is back in, now part of Siemens. Cadence acquired Get2Chip in 2002 and nurtured it until Forte was added in 2014; the combination is now Cadence Stratus. Synopsys was a pioneer with Behavioral Compiler in the ’90s and then acquired Synfora in 2010. Both Synopsys and Cadence have moved high-level synthesis organizationally into the same teams that hold logic synthesis, making high-level synthesis a natural add-on to logic synthesis that raises the level of abstraction to SystemC. Cadence Stratus even allows managing ECOs from the implementation level all the way back up into Stratus.

The verification space of ESL—in Gary Smith’s definition at the time—included transaction-based acceleration and emulation. This is a well-known space today, with Cadence, Synopsys, and Siemens/Mentor being the main players. The products in this space simply bridge the classic verification space into the system level—one could argue by “brute force.” You want to boot an operating system like Android or Linux and need the hardware implementation detail, so you cannot use abstraction? Here, use my emulator or FPGA-based prototype, which runs in the MHz or tens-of-MHz range, respectively, compared to the Hz or low-kHz range of host-based RTL simulation, which would take weeks to boot the OS. This area is somewhat of a gray space because the primary use of emulation is hardware verification and the primary use of prototyping is software development—but the lines are blurry, as emulation extends into software, especially with virtual-platform/emulation hybrids, and prototyping extends into hardware verification for regressions once the RTL gets more stable.

Finally, the software design area of ESL included algorithm design and optimization, multi-core and model development tools, and is dominated by software virtual prototypes. The three virtual platform V’s from the ’90s—Virtio, VirtuTech, and VaST—eventually found homes in Synopsys (2006), Intel (2010), and Synopsys (2010), respectively. The other two players, AXYS Design Automation and CoWare, were acquired by Arm (in 2004; the technology was later spun out to Carbon in 2008 and has been back in Arm since 2015) and Synopsys (2010), respectively. VaST, Virtio, and CoWare together led to what is Virtualizer today. Cadence won a product of the year award with its in-house developed VSP in 2012 and uses SystemC-based virtual platforms as an extension of its verification tools—emulation and simulation—for hybrids. Imperas and ASTC/VLAB Works are the independent players left today. The important common thread here is similar to the ESL hardware design space: SystemC, here with its TLM-2.0 extensions (released as a standard in 2008), has become the backbone for the assembly and execution of virtual platforms.

The rest is mostly about models of the IP blocks (processors, peripherals, and interconnect) and productivity tools for fast assembly and debug of the models and of the software running on the virtual prototype, for which standard debuggers like GDB, Arm DS-5, and Lauterbach TRACE32 are used. Lots of users forgo dedicated platform tools and simply use SystemC as part of their simulation environments to create virtual platforms. The added benefit of SystemC as a backbone is the natural, native connection into the implementation world of Verilog, SystemVerilog, and VHDL.

So where does all this leave us?

It’s time to close the ESL chapter in EDA and declare victory. Most of ESL in Gary Smith’s definition has simply happened while we were not looking. Hardware design’s pre-RTL simulation and software design’s virtual platforms use SystemC and are firmly grounded in the familiar verification tools for Verilog and VHDL, linking the hardware and software “species” of developers. High-level synthesis is firmly connected to logic synthesis flows. A certain class of architecture decisions is made at the RT level of abstraction these days. And for pre-RTL architecture analysis, Cadence is partnering with vendors like National Instruments as part of our System Design Enablement initiative. The next frontier from here is system integration and connecting system-level flows vertically to EDA’s classic implementation flows.


