Reflection On 2017: Design And EDA

Second of two parts: Progress on 2017 predictions was slow and steady, apart from one big miss in the standards area.


People love to make predictions, and most of the time they have it easy, but at Semiconductor Engineering, we ask them to look back on the predictions they make each year and to assess how close to the mark they were. We see what they missed and what surprised them. Not everyone accepts our offer to grade themselves, but most have this year. (Part one looked at the predictions associated with semiconductor manufacturing and end markets.)

If there was one key aspect to the predictions for 2017, it was that ‘systems’ would become a greater focus, both for design and verification. “Systems and IP will be the watchwords in 2017 as the semiconductor design ecosystem advances from chip-centric (integration of transistors) design to system-centric (integration of functional blocks) design,” predicted Bob Smith, executive director of the Electronic System Design Alliance (ESDA). He notes that “the move from chip-centric design to system-centric design is happening, and commercial products in the system design automation segment are starting to emerge. Another example of system-centric design is IP morphing into chiplets or hardened IP that’s characterized and tested before SoC design begins.”

One such tool was announced by Bob Ledzius, president and CEO of Concertal Systems, whose company had been prototyping such a system. “At the beginning of the year we already had prototyped the difficult aspects of the technology, so the team was confident in our eventual success to productize it. The biggest issue we faced was deciding what features are critical for a first release, and deferring the rest of the ideas for a later time.”

New technologies always face hurdles when it comes to acceptance. “It reminds me of the time when synthesis appeared and designers just couldn’t believe that a tool could do the task better than they could do manually,” adds Ledzius. “Fortunately for us, there is a whole new audience of system developers, who aren’t chip designers, and have a real need for simply moving from concept to full chip simulation. Then designers can concentrate more on building the best IP possible, and better streamlining the back-end flow without worrying if they’ve really captured what their future customers want and need.”

Mike Wishart, CEO for efabless, expected to see “the emergence of an online marketplace that connects demand with a global community of IC and IP designers.” He says that “an online marketplace now connects 1,450 IC and IP designers from more than 50 countries. The marketplace offers 60 distinct designs, including 11 created by the community. Among the designs are a ‘soft’ PicoRV32 RISC-V processor, created by Clifford Wolf, the noted developer of RISC-V microcontrollers and author of YOSYS.”

In fact, RISC-V appears to be creating a lot of excitement. Rick O’Connor, executive director of the RISC-V Foundation, notes that “in 2017, the semiconductor industry witnessed a consolidation slowdown with new startups offering free, open solutions for today’s design challenges – not to mention established companies moving away from closed architectures. There is a growing interest in open-source instruction set architectures (ISAs), such as RISC-V. The portability and flexibility of the RISC-V architecture have driven innovation in a number of applications, addressing the increasing demands of our connected world from big data to the IoT. This newfound freedom in silicon design also has encouraged collaboration across the ecosystem by fostering a system-level approach to SoC design.”

2017 was a big year for embedded FPGAs. “Embedded FPGA IP will continue to emerge as an exciting new technology for chip designers from MCU to networking,” said Flex Logix’s CEO. He says that many organizations have publicly declared plans for eFPGA, including DARPA, Sandia, SiFive and Harvard, which integrated an eFPGA into a TSMC16FFC edge AI chip to enable real-time iteration of algorithms for faster learning. “More companies moved forward with eFPGA without making their actions public. There are now even more suppliers of eFPGA in the market with Efinix’s announcement in 2017, and eFPGA is now available on even more process nodes and from more foundries based on announcements through 2017 by multiple competitors.”

And systems extend beyond the chip into the board and system. “PCB design is increasingly becoming an art that is foisted onto engineers, as opposed to pure PCB design people,” says Lawrence Romine, global head of field marketing for Altium. “A common pain point in the industry is that the lines of communication have become unclear between each step of product development, which is critical to the success and realization of a product. As designs continue to become more complicated and communication becomes even more important, the industry is working to address this issue with software solutions that accelerate and verify electronic design.”

Romine notes that 2017 has seen the industry introduce more advanced software solutions that bridge the collaboration gaps between teams. “In the next year, we’ll continue to see a focus on software that enables a clear visualization of design workflows, and tools that make it easier for engineers to make component selection and placement decisions as form factors become smaller.”

All eyes have been on the Portable Stimulus Standard (PSS) from Accellera, which was expected to be released in 2017. Steve Brown, product marketing director at Cadence, notes that “the PSS v1.0 draft specification was released in 2017, but an approved standard will not be completed before the end of the year. Despite this delay, customers are pragmatic about the availability of the standard and are not limiting adoption. We are working with customers on methodology surrounding portable stimulus, such as system level coverage and verification planning to fully utilize the various verification engines.”

Coverage was one of the areas of PSS that Breker’s CEO had identified as being important. “This still remains an area of the standard that requires more work. The draft specification enabled users to get a better feel of what the standard would offer them, and this has resulted in considerable input being fed back into the committee. It is great to see users and vendors working together on a standard that will be a vital part of future verification flows.”

As a gauge of the potential adoption rate of Portable Stimulus, Michiel Ligthart, president and COO for Verific Design Automation, a company that makes language parsers for many of the industry’s design and verification tools, notes that “few industry watchers had anticipated the enthusiastic welcome of the Portable Stimulus Standard. We should all keep an eye out for that one going forward.”

In fact, Frank Schirrmeister, senior product management group director in the System & Verification Group of Cadence, says that even without the standard being released, “the space of portable stimulus and software and scenario-driven verification has become one of the hottest areas in the verification space in 2017. Two of the big vendors, Cadence and Mentor, a Siemens Business, in combination with startup Breker, are the main drivers here.”

Coverage is a connection point between several aspects of a verification flow. “The emerging Portable Stimulus standard has a notion of scenario coverage that applies to all the tools to which it ports,” points out David Kelf, vice president of marketing for OneSpin Solutions. “This helps provide a common coverage model that provides the right level of feedback on the verification of design requirements, not just code testing.”

At the beginning of the year Schirrmeister had predicted that “we will see further market specialization in 2017 – mobile, server, automotive (especially ADAS) and aero/defense markets will further create specific requirements for tools and flows.” He also had singled out security as an area that needed more attention. Today he says that “the trend of further market specialization, from a verification perspective, has certainly come true and progressed. Security is a key discussion theme at industry conferences, and we are even seeing security manifestos proposed. The industry is coming together to further address security issues. This will be a year-long battle, but the industry is willing to see it through—otherwise we would be in serious danger in a fully networked world.”

Mentor’s CEO said that emulation has been the fastest growing sector of EDA for the past five years and that “EDA companies have accelerated their spending on emulation hardware and, more recently, on the software tools that use emulation data to analyze problems.” Schirrmeister added to this, saying “the various use models for core emulation like verification acceleration, low-power verification, dynamic power analysis, post-silicon validation—often driven by the ever-growing software content—will extend further.” Looking back, Schirrmeister says: “All three big vendors have announced their variations of ‘suites,’ combining formal, simulation, emulation and FPGA-based prototyping. The race is on to see who integrates engines the fastest. Also, use model versatility is indeed a key differentiator for the hardware engines. Expect much more to come in 2018 and beyond.”

Another person who foresaw this investment was Lauro Rizzatti, a verification consultant. “In hindsight, I should have anticipated that not only new use modes but also a new generation of emulators might be launched in the course of the year. Indeed, Mentor, a Siemens Business, introduced its third-generation emulator, built on a new emulator-on-chip device encompassing increased emulation resources.”

Another theme within the predictions for 2017 involved integration between verification engine vendors. “Single company flows will be met with resistance as the mix and match approach takes hold and end users won’t be restricted to using the formal tool that works with their simulator,” predicted Kelf. “I believe that we have seen a number of examples where this mix-and-match approach has become important to end-users. Companies have insisted on joint flows. This is partly driven by the fact that they do not want to give up their preferred simulator, but wish to use a formal solution that has specific capabilities and performance for their design scenarios that they cannot find elsewhere. There is still a need to bring these flows to a more complete level of integration, but the willingness from some vendors is there.”

Power has been a driver for many integrated verification flows. “Emulation integrated with implementation-based power estimates is entering the mainstream with several vendors promoting flows in this domain,” says Schirrmeister. “Given the complexity of design and the variability in implementation choices, vertical integration will only become more important going forward.”

Accellera and the IEEE have had a busy year. “As predicted, the new UVM standard was released in 2017,” says Verific’s Ligthart. “In addition, the release of SystemVerilog-2017 is trying to make it out before the end of the year.”
