Sometimes predictions are interesting for what is not said. That is certainly true this year, and it may indicate a period of change ahead.
This year more than 26 people provided predictions for 2015. Most of them came from the EDA industry, so the results may be somewhat biased. However, ecosystems are drawing closer together across many parts of the semiconductor food chain, meaning that EDA companies often can see what is happening in dependent industries and in the system design houses. Their predictions, then, may already be reflected in their investments in tool development.
The predictions are segregated into four areas: Markets, Design, Semiconductors, and Tools and Flows. In this segment, predictions related to tools and flows are explored.
It is interesting that almost every one of the predictions this year has to do with verification. While this may be expected, given that verification now reportedly consumes more than 50% of design time and resources, it is surprising that few people thought new design tools would be necessary for any of the emerging markets, including the Internet of Things (IoT), and almost nobody talked about power-aware design this year, even though articles throughout the year suggest it is far from a solved problem. One possible explanation is that a new generation of verification and analysis tools is needed first, and that automation cannot exist until those analysis tools are in place.
This view is at least partially supported by Brian Derrick, vice president of corporate marketing for Mentor Graphics: “EDA grows by solving new problems as discontinuities occur and design cannot proceed as usual. In semiconductors and electronics, discontinuities happen at such a fast rate that it seems almost continuous.”
The industry may be facing a larger discontinuity than most of those seen in the past.
“EDA is all about solving new problems, both in established markets, and for new customers and industries,” Derrick notes. “Developing new EDA solutions for markets in transition, like automotive, aerospace, the broader transportation industry, and the IoT, will fuel the growth of the design automation industry long into the foreseeable future.”
The Big Shift Left
It appears as if everyone has grown tired of the term electronic system level (ESL), and the industry has created a new name for it: The Big Shift Left. Chi-Ping Hsu, senior vice president, chief strategy officer for EDA and chief of staff to the CEO at Cadence, defines it this way: “Shift Left means that steps once done later in the design flow must now start earlier. Software development needs to begin early enough to contemplate hardware changes.”
The term is also used by Marc Serughetti, director of product development for the solutions group within Synopsys. “The tasks of testing, validating and verifying systems for safety and security are critical,” he says. “They can no longer be addressed by adding more development resources, and as a result there is increasing demand to deploy EDA automation tools and prototyping technologies that will help shift development and testing left.”
All of the big three EDA companies see the benefits of virtual prototyping and of bringing together the hardware and software aspects of the system. Virtual prototyping provides the ability to shift left in several ways, such as enabling early interaction between groups and companies. Both Mentor and Synopsys have made significant investments in software tools, which are often required for development in automotive and other safety-critical industries. “Software testing automation is required to empower organizations to build quality and security testing into the development process at the earliest stage for fast, resilient and predictable software delivery,” Serughetti says.
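To make the idea concrete, the following is a minimal, hypothetical sketch of what shifting left with a virtual prototype can look like: firmware-style driver code exercising a SystemC model of a peripheral long before RTL or silicon exists. It assumes the open-source Accellera SystemC library; the names uart_model and fw_init are invented for illustration, and a production virtual platform would typically use TLM-2.0 sockets and an instruction-set simulator rather than direct function calls.

// Illustrative sketch only: a behavioral SystemC model of a peripheral's
// registers, plus firmware-style code developed against it ("shifted left").
#include <systemc>
#include <iostream>
using namespace sc_core;

SC_MODULE(uart_model) {                 // hardware side: register-level model
    unsigned ctrl_reg, status_reg;
    SC_CTOR(uart_model) : ctrl_reg(0), status_reg(0x1) {}   // bit 0: TX ready
    void     reg_write(unsigned addr, unsigned val) { if (addr == 0x0) ctrl_reg = val; }
    unsigned reg_read(unsigned addr) const { return (addr == 0x4) ? status_reg : ctrl_reg; }
};

// Software side: a driver routine that can be written and debugged today,
// against the model, instead of waiting for hardware.
void fw_init(uart_model& uart) {
    while ((uart.reg_read(0x4) & 0x1) == 0) { /* poll TX-ready */ }
    uart.reg_write(0x0, 0x3);           // enable TX and RX
    std::cout << "ctrl = 0x" << std::hex << uart.reg_read(0x0) << std::endl;
}

int sc_main(int, char*[]) {
    uart_model uart("uart");
    fw_init(uart);                      // the "early interaction" in miniature
    return 0;
}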
“There will be more tools for hardware/software co-design to help manage the tradeoffs for system-on-chip design,” predicts Ken Karnofsky, senior strategist for signal processing applications at The MathWorks. “Consequently, there will be an increased focus on simulation to explore tradeoffs and on verification of both hardware and software implementations. The challenge will be developing a methodology that provides the required level of verification without causing a performance bottleneck.”
Some see SystemC as the language of choice for higher levels of abstraction, but not everyone agrees. “The huge number of gates in a chip design dictates that we must move to higher levels of design,” said Randy Smith, vice president of marketing at Sonics. “For the industry to be successful, the design flows above the old ASIC netlist handoff level must be standardized. To move to higher levels in 2015, I predict we may need new high-level design languages to describe design requirements at the architectural level.”
Others see the need for languages that span multiple levels of abstraction. “Today, SystemC is heavily used for the development of blocks that process specific algorithms,” says David Kelf, vice president of marketing for OneSpin Solutions. “Transaction-level models of the algorithmic processing components in a chip are modeled using C, with a SystemC wrapper. These are verified and then synthesized into Verilog.” Kelf feels that this is too many languages.
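As a hedged illustration of the flow Kelf describes (not drawn from any particular tool), the sketch below wraps a plain C-style algorithm in a SystemC module so it can be verified against a transaction-level testbench and, in principle, handed to high-level synthesis. It assumes the Accellera SystemC library; fir_step, fir_wrapper and the tiny testbench are invented names.

#include <systemc>
#include <iostream>
using namespace sc_core;

// The algorithmic part: ordinary C-style code, independent of any HDL.
static int fir_step(const int coeff[4], const int taps[4]) {
    int acc = 0;
    for (int i = 0; i < 4; ++i) acc += coeff[i] * taps[i];
    return acc;
}

// The SystemC wrapper: gives the algorithm streaming ports and a thread so it
// can be simulated at transaction level and later synthesized.
SC_MODULE(fir_wrapper) {
    sc_fifo_in<int>  in;
    sc_fifo_out<int> out;
    SC_CTOR(fir_wrapper) { SC_THREAD(run); }
    void run() {
        const int coeff[4] = {1, 2, 2, 1};
        int taps[4] = {0, 0, 0, 0};
        while (true) {
            for (int i = 3; i > 0; --i) taps[i] = taps[i - 1];
            taps[0] = in.read();                  // blocking, untimed read
            out.write(fir_step(coeff, taps));     // apply the C algorithm
        }
    }
};

// A minimal transaction-level testbench driving the wrapper.
SC_MODULE(tb) {
    sc_fifo<int> to_dut, from_dut;
    fir_wrapper dut;
    SC_CTOR(tb) : dut("dut") {
        dut.in(to_dut);
        dut.out(from_dut);
        SC_THREAD(stim);
    }
    void stim() {
        to_dut.write(10);                         // one input sample
        std::cout << "out = " << from_dut.read() << std::endl;   // prints 10 (1*10)
        sc_stop();
    }
};

int sc_main(int, char*[]) {
    tb top("top");
    sc_start();
    return 0;
}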
“For years we’ve been hearing about the benefits of higher levels of abstraction,” says Brett Cline, who is now in the system-level design group at Cadence. Benefits he lists include better productivity, faster verification, and architectural exploration. “High-level synthesis (HLS) has been a key contributor to this move to a higher level of abstraction and has delivered on these values and more.”
Abstraction, by definition, removes details to focus on general concepts. However, at smaller technology nodes, the details are what actually matter to make smart decisions. “Dealing with every little detail in a billion gate chip is intractable,” points out Cline. “In 2015 we will see the expansion of higher abstraction design and verification in mainstream flows by linking in lower-level tools. These lower-level tools provide the detailed information needed to make better decisions at the high level. Intelligently using greater detail when needed without losing the benefits of the higher-level of abstraction is the trick.”
We have seen this trend before, when logic synthesis started to include physical details in order to make more accurate and informed optimizations. “Vertical co-optimization, an element of the overall Shift Left trend, has been an important innovation in advanced-node enablement,” says Hsu. “Essentially, it turns a process that was a series of steps into a parallel process. While the serial process was easy to manage because each step was self-contained, it yielded sub-optimal results. Vertical co-optimization offers opportunities to see how decisions in one realm affect the other elements of the enablement ecosystem. This change enables mutual trade-offs for optimization, versus the old model of optimizing one step, holding it fixed, and passing the outcome on to the next step.”
Kelf also sees change happening in a bottom-up manner. “Increasingly, more of the control components that go with these algorithmic parts are being coded in SystemC rather than Verilog. Because of the more intense event modeling required by these control sections, we will see increased use of SystemC at RTL. Indeed, a handful of companies are executing projects with RTL SystemC and attempting to bypass Verilog altogether, except for netlist descriptions. As such, in 2015 we will see more SystemC RTL usage, and this will, in turn, drive SystemC tooling and the emergence of many of the verification capabilities we see in Verilog turning up as SystemC variants.”
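For readers unfamiliar with what “SystemC at RTL” looks like, here is a minimal, invented example in the cycle-accurate, signal-level style Kelf is referring to: a clocked counter written much as it would be in Verilog, but in SystemC. It assumes the Accellera SystemC library and is a sketch, not production code.

#include <systemc>
#include <iostream>
using namespace sc_core;
using namespace sc_dt;

// RTL-style SystemC: signals, a clock and a clocked process, comparable to a
// Verilog always @(posedge clk) block.
SC_MODULE(counter_rtl) {
    sc_in<bool>          clk, rst;
    sc_out<sc_uint<8> >  count;

    SC_CTOR(counter_rtl) {
        SC_METHOD(seq);
        sensitive << clk.pos();       // triggered on the rising clock edge
        dont_initialize();
    }

    void seq() {
        if (rst.read())
            count.write(0);           // synchronous reset
        else
            count.write(count.read() + 1);
    }
};

int sc_main(int, char*[]) {
    sc_clock                clk("clk", 10, SC_NS);
    sc_signal<bool>         rst;
    sc_signal<sc_uint<8> >  cnt;

    counter_rtl dut("dut");
    dut.clk(clk); dut.rst(rst); dut.count(cnt);

    rst = true;  sc_start(25, SC_NS);   // hold reset across a couple of edges
    rst = false; sc_start(50, SC_NS);   // let the counter run
    std::cout << "count = " << cnt.read() << std::endl;
    return 0;
}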
Cline says what is new “is the drive for real-time feedback across much wider gaps of abstraction. An intelligent mix between high and low, fast and slow, and abstract and detailed information will help drive the next five years of hardware design.”
Verification
Bernard Murphy, chief technology officer for Atrenta, provides a summary of the changes he expects in verification: “I’ll continue to predict more scalable approaches to verification and no doubt I’ll continue to be wrong, but at some point I have to be right. So I’ll stick with the long bet: IPs have to be more completely verified, and system-integration verification has to become more reliant on architectures designed for verifiability using static and constrained formal methods. System verification and validation will continue to move toward software-driven verification techniques. Also, given the limitations of validating software on proxies for silicon, expect more two-pass validation: the first silicon will be used to fully debug hardware and software, and the second will be the final silicon.”
Another proponent of scalable and software-driven techniques is Adnan Hamid, chief executive officer of Breker. He says, “the ongoing move toward portable tests, driven in part by the Accellera Portable Stimulus Working Group, is an example of users demanding that verification solutions be portable ‘vertically’ from block to SoC and ‘horizontally’ from simulation through emulation and FPGA prototyping to actual silicon in the lab.”
Many of Murphy’s predictions have been happening, but new markets are adding further or modified requirements. Serughetti provides some examples from automotive, including traceable, documented verification from component to system; verification of all potential faults and the system’s response to them; and automation of FMEA (failure mode and effects analysis) and FTA (fault tree analysis) through the complete component-to-system verification cycle.
Emulation
One segment of verification that has been doing very well recently is emulation. Lauro Rizzatti, a verification consultant, provides his view of the market: “Hardware emulation may be well over 20 years old, but the prediction that one day a software emulator would sound its death knell was never realized. It will be at the foundation of every verification strategy because no other verification tool is able to bridge the hardware and software disciplines like hardware emulation can.”
With growing adoption, engineering teams are finding more uses for hardware emulation. “It’s not hard to imagine that 2015 could be the breakout year for the tool,” says Rizzatti.
Derrick also sees an organizational change helping emulation. “Emulation is moving out of the lab and into the data center, making it accessible to everyone. It is not just for the hardware team anymore; software teams are discovering they can move their embedded software development and debug onto emulators.” He believes this helps explain why the emulation market has doubled in the past five years, with a three-year compound annual growth rate of 23%.
But there are things that the emulator is not good at. “Simulation remains the main workhorse for IP and subsystem verification while emulation has taken center stage for SoC hardware verification, hardware/software integration, and OS bring-up,” said Hsu.
Murphy sees that adoption of emulation as a simulation replacement still faces some hurdles. “We already see a push to make the emulation methodology more debuggable. Customers spend millions on emulators, and they provide a great service. However, they map your design into a mass of FPGAs (or specialized code), which changes the internals of your design, and you have limited insight into those changes. If something goes wrong (a design bug, a testbench bug or an emulator setup bug), you spend a lot of time figuring out why. No one can afford to have $5 million emulators sitting idle while engineers debug a problem.”
Formal Verification
Another area that has seen growing acceptance in the past few years is formal verification. “Formal continues to be deployed in situations where preset applications work well,” says Kelf. “More advanced applications, such as system security analysis and system-level verification testing, will emerge.”
Kelf also predicts that the most important development for formal in 2015 will be “a proliferation of the technology to designers where it may be used to speed up the early checking of design code before it is submitted into the verification regression environment.”
“The adoption of formal technology will happen at an even greater speed in 2015,” says Jin Zhang, senior director of marketing for Oski Technology. “Formal is becoming a key methodology to ensure the quality of IC chips used in all devices. Right now, there are fewer than 200 formal experts in the industry. We project that number will grow exponentially in the next few years.”
Design and IP
As systems get bigger, the amount of design and verification data is growing. “In 2015, design flows and tools will embrace the 3Vs of Big Data,” says Harnhua Ng, chief executive officer of Plunify. “These are Volume, Velocity and Variety. This will require tools to be reworked for large server farms, where hundreds of processors can work on a problem in parallel.”
Paul Pickle, president and chief operating officer for Microsemi, sees areas in which new tools are needed. He cites the areas of timing/synchronization and FPGA-based SoC design and development tools. “Managing the entire design flow from design entry, synthesis and simulation, through place-and-route, timing and power analysis, with enhanced integration of the embedded design flow, is becoming increasingly important.” Pickle also predicts IP will become increasingly important.
IP is clearly becoming a vital aspect of the design flow. “2015 will see an increasing focus on verification effectiveness with regard to IP reuse,” believes Shiv Sikand, vice president of engineering for IC Manage. He asks: Who owns verification of the IP? Who will guarantee that it works? “Improving the verification efficiency associated with IP reuse will involve both best practices and design management technologies to help manage the interdependencies.”
New markets are also placing tighter demands on IP. “The use of proven hardware IP that meets automotive safety requirements will accelerate time to market by reducing development costs,” says Serughetti. Additional demands he cites include circuit isolation and redundancy testing, inclusion of built-in self-test (BIST), and high-tolerance testing for EMI (electromagnetic interference), noise, and ground-level radiation.
The only mention of more traditional design predictions for 2015 comes from Zhihong Liu, executive chairman of ProPlus Design Solutions. “With the increasing needs of low-power applications, EDA tool accuracy will become critical due to smaller supply voltages and the impact of process variations,” said Liu. “Embedded memory, which may occupy more than 50% of the chip die area, will have a significant impact on chip performance and power. Designers will require high accuracy through the whole design flow for memory designs, covering both small- and large-block design, characterization and full-chip verification. Designers will struggle with traditional EDA tools for tasks such as large memory characterization and verification because those tools don’t give them enough reliability and accuracy. EDA vendors need to respond.”
Business
What can we expect to happen at the corporate level in 2015? Kelf sees Cadence being busy pulling together its verification strategy, particularly its formal tools combined with simulation, and Synopsys releasing a long-awaited new formal technology.
Graham Bell, vice president of marketing at Real Intent, expects that a retooling trend may be underway. He points to Calypto’s recent announcement of its Catapult 8 product, which features a new hierarchical approach that works both top-down and bottom-up, and a new database with a 10x capacity improvement. He also points to Real Intent’s third-generation offering of its clock domain crossing (CDC) verification software, which he says shows a very similar profile. “Its new hierarchical sign-off flow does not compromise accuracy at the various levels of the design hierarchy, and it provides the greatest flat capacity through its new database approach.” Because of this, he says, “I expect to see broader adoption of third-generation high-level synthesis and RTL verification tools.”
[Last year, Semiconductor Engineering reviewed the 2014 predictions to see how close to the mark they came. You can see those in part one and part two of the retrospective. We will do the same with their predictions this year.]