Predictions For 2016: Tools and Flows

Why EDA will be the cornerstone of the IoT, and where the biggest changes will be.


Seventeen companies sent in their predictions for this year, with some of them sending predictions from several people. This is in addition to the CEO predictions that were recently published. That is a fine crop of views for the coming year, especially since the contributors know they will be held accountable and, just like last year, will have to answer for their views. We believe that this makes them think a little harder before making bold statements that they may feel foolish about later. If you want to check out how they did last year, you can find the retrospectives here and here. This year, the predictions are divided into the following segments: Markets; Semiconductors, Manufacturing and Design; and Tools and Flows, which is covered in this segment.

EDA has often been seen as the poor cousin of semiconductors, but Lucio Lanza, managing director for Lanza techVentures, says that "the EDA tagline, 'Where Electronics Begins,' is no exaggeration and will be the first stop in the development of (IoT) devices. EDA has been a community of the smart, technical, innovative and creative and that's the kind of thinking that's moving the IoT revolution forward. It's up to us to do things in a different way, to commercialize and monetize this technology in a creative, positive way that will have long-lasting social and economic success. This is especially important for areas of the world that are not as successful or thriving as Silicon Valley."

While we may not see many startups in the field anymore, it does not mean that there isn’t plenty of innovation happening. “Identifying ways to differentiate for a competitive advantage has been a driving factor in the semiconductor industry since its inception and that will continue into 2016,” says Michiel Ligthart, president and chief operating officer for Verific Design Automation. “No company wants to use the same solution as the next. As a result, the differentiation has moved down over the past few years to the design flow where many companies are extending their design and verification flows with homegrown improvements.”

The big EDA companies have to innovate and transform as well. "Upside in the systems market comes with lots of new challenges for traditional EDA companies," says Brian Derrick, vice president of marketing for Mentor Graphics. "They must learn what drives purchasing decisions in new markets and execute well across a shifting range of sales channels and geographies. The near-term opportunity for the broader systems markets is to bring design automation tools and flows to specific applications where electronic content is rapidly increasing, such as automotive, mil/aero and IoT. Products and flows are half the equation. To scale, companies will need expertise in systems engineering and global reach to support customers in geographies not currently serviced by the largest technical software companies."

As a counterpoint, Graham Bell, vice president of marketing for Real Intent, says that "verification companies like OneSpin Solutions and Real Intent will become more visible in 2016 as alternatives to the Big Three. Their turn-on-a-dime support will distinguish them from the 'take a ticket and wait' model of the larger companies. In addition, verification companies will expand the kinds of failures they can quickly identify."

Over the past few years, we have seen once-separated parts of the food chain coming together, and it appears this trend will continue and potentially accelerate. "In 2016, expect the worlds of EDA, IP and embedded to become even more closely linked than they are," says Bob Smith, executive director for EDA Consortium. "Verification tools, such as hardware emulators, bridge embedded software and hardware, and we should see more collaborative relationships with vendors in each segment as we move through the year. Innovative packaging such as 3D IC design and system on wafer will begin to accelerate as a different approach to preserving the pace of Moore's Law."

Verification
As has been the case for a few years, most of the visible innovation and advancements have been happening in the verification space. Michael Sanie, senior director of verification marketing at Synopsys, provides the reasons why this is the case. "With the rise of the IoT, Web 2.0 applications and social media comes the demand for devices that are smaller, faster and consume less power, despite being equipped with increasing amounts of software content. As a result, SoC designs have grown tremendously in complexity. Advanced verification teams are now faced with the challenge of not only reducing functional bugs, but also accelerating both software bring-up and time to market. The process of finding and fixing functional bugs and performing software bring-up involves intricate verification flows including virtual platforms, static and formal verification, simulation, emulation and finally, FPGA-based prototyping."

Up until very recently, each step in the verification flow has been isolated and many discontinuities existed between them. Many attempts are now being made to integrate and unify them. “In 2016, the industry will continue to strive towards greater levels of verification productivity and early software bring-up,” Sanie says. “This will be achieved through the introduction of larger, more unified platform solutions that feature a continuum of technologies enabling faster engines, native integrations and unified compile, debug, coverage and verification IP. With this continuum of technologies being integrated into a unified platform solution, each step in the verification flow is further streamlined, and less time is spent in transitioning between steps. The rise of such platforms will continue to enable further dramatic increases in SoC verification productivity and earlier software bring-up.”

2016 also will be an exciting year as we welcome the first standard for a new verification methodology that could bring efficiency back into the verification process. "Accellera's Portable Stimulus Working Group (PSWG) will release a standard defining a single abstract, graph-based specification that can be used to automatically generate test cases (stimulus, results, and coverage) for multiple verification environments and platforms," says Adnan Hamid, chief executive officer of Breker. "These test cases will be tuned for efficient execution in each target, ensuring 'vertical' portability from IP block to system-on-chip (SoC) and 'horizontal' portability from simulation through emulation and FPGA prototyping to actual silicon in the lab."
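To make the graph-based idea concrete, the sketch below encodes test intent as a small action graph, picks one legal walk through it, and retargets the same walk either to a UVM-style sequence or to bare-metal C calls. This is an illustrative Python sketch of the concept only, not the PSWG's actual syntax or semantics, and all of the action names are invented.

```python
# Illustrative sketch only -- not the Accellera PSWG notation. It shows the core
# idea: one abstract, graph-based scenario model, many generated test cases.
import random

# Each node is an abstract action; edges define the legal orderings.
SCENARIO = {
    "reset":         ["config_dma", "config_uart"],
    "config_dma":    ["dma_transfer"],
    "config_uart":   ["uart_send"],
    "dma_transfer":  ["check_results"],
    "uart_send":     ["check_results"],
    "check_results": [],
}

def walk(graph, node="reset", path=None):
    """Randomly walk the scenario graph to pick one legal action sequence."""
    path = (path or []) + [node]
    nxt = graph[node]
    return path if not nxt else walk(graph, random.choice(nxt), path)

def emit(path, target):
    """Retarget the same abstract sequence to different execution platforms."""
    if target == "uvm":           # simulation/emulation testbench
        return [f"seq.do_action(\"{a}\");" for a in path]
    if target == "bare_metal_c":  # software-driven test on a prototype or silicon
        return [f"{a}();" for a in path]
    raise ValueError(target)

path = walk(SCENARIO)
print("\n".join(emit(path, "uvm")))
print("\n".join(emit(path, "bare_metal_c")))
```

The same abstract walk yields a test for each platform, which is the "vertical" and "horizontal" portability the standard is aiming for.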

Steve Carlson, low power solutions architect at Cadence, agrees with the potential for this emerging standard. “There is a lot of work to be done, but vendors and customers like the idea of this.” This could be one of the biggest opportunities for verification innovation and lead to a significant retooling within the industry.

As RTL simulation struggles, emulation has been racing to fill the void and become better integrated into the flow. "According to a recent survey, the average design size is now well over 100 million gates, with processor, graphics and networking designs rapidly moving toward the 1-billion-gate threshold," says Lauro Rizzatti, a verification expert. "Recently, two large customers, one in the processor business and the other at a major networking company, stated that HDL simulators were performing at single-digit cycles per second or less. They both use emulation and could not verify their designs without it."
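To put single-digit cycles per second in perspective, here is a back-of-the-envelope comparison. The OS-boot cycle count and the emulator speed below are assumed round numbers for illustration, not figures from the survey Rizzatti cites.

```python
# Back-of-the-envelope comparison; the cycle counts and speeds are illustrative
# assumptions, not figures from the article.
os_boot_cycles = 2_000_000_000   # assume ~2 billion clock cycles to boot an OS

sim_speed_hz = 5                 # "single-digit cycles per second" in RTL simulation
emu_speed_hz = 1_000_000         # ~1 MHz, a typical emulator ballpark

sim_days = os_boot_cycles / sim_speed_hz / 86_400
emu_minutes = os_boot_cycles / emu_speed_hz / 60

print(f"RTL simulation: {sim_days:,.0f} days")      # roughly 12.7 years
print(f"Emulation:      {emu_minutes:,.1f} minutes")
```

At those rates an OS boot that takes years in simulation fits into a coffee break on an emulator, which is why the workloads described above simply cannot be verified any other way.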

Lots of investment pours in when markets are expanding. “Emulation has gone mainstream and will continue to deliver growth rates that exceed those of the overall EDA industry,” points out Brian Derrick, vice president of marketing for Mentor Graphics. “Traditional emulation criteria, such as capacity, speed, compile time, and easy hardware and software debugging, remain important to customers, but so do an expanding list of new criteria, including transaction-based verification, multi-user/multi-project access, live and off-line embedded software development and validation, and the ability to access the emulator as a centrally managed resource in the data center rather than a standalone box in the testing lab.”

Part of the reason for emulation's rise in importance is the software integration challenge. "Emulation is at the forefront of a methodology shift in the way power is analyzed and measured," says Derrick. "Complex SoC designs are now verified using live applications that require booting the OS and running real software applications. Real-time switching activity information generated during emulation runs is passed to power analysis tools, where power issues can be evaluated. In 2016 there will be a steady stream of new emulator applications aligning with customers' verification needs, such as design for test, coverage closure and visualization."
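A simplified sketch of the hand-off Derrick describes: toggle counts captured during an emulation run are converted into an average activity factor and plugged into the standard dynamic-power relation P = α·C·V²·f. The block names, capacitances and toggle counts below are invented for illustration, and real power analysis tools work at far finer granularity.

```python
# Minimal sketch of turning emulator toggle counts into a dynamic-power estimate,
# using P_dyn = alpha * C * V^2 * f. All numbers below are made up.
VDD = 0.8        # supply voltage in volts
F_CLK = 1.0e9    # clock frequency in Hz

# (switched capacitance in farads, toggles observed, cycles emulated) per block
activity = {
    "cpu_cluster": (2.0e-9, 45_000_000, 100_000_000),
    "gpu":         (3.5e-9, 12_000_000, 100_000_000),
    "ddr_ctrl":    (0.8e-9, 70_000_000, 100_000_000),
}

total_w = 0.0
for block, (cap, toggles, cycles) in activity.items():
    alpha = toggles / cycles                 # average switching-activity factor
    p_dyn = alpha * cap * VDD ** 2 * F_CLK   # dynamic power for this block
    total_w += p_dyn
    print(f"{block:12s} alpha={alpha:.2f} P={p_dyn * 1e3:.1f} mW")

print(f"total dynamic power ~ {total_w * 1e3:.1f} mW")
```

The value of doing this with emulation-generated activity is that the toggle counts come from a real OS boot or application run, not from a synthetic vector set.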

Software is a key part of the system development process. “Electronic companies differentiate through software, but it requires powerful hardware platforms to enable such differentiation,” explains Marc Serughetti, director of business development for system level solutions at Synopsys. “This interdependency needs to be considered from the start of the system design throughout the development process. This means that developers of such systems must move from a traditional and somewhat serial development process toward a more integrated solution that enables design and development with full knowledge of the interdependencies between hardware and software.”

Emulators are just one solution in the continuum of hardware execution engines. "Prototyping has provided a solution to address such requirements, although traditionally the solutions have been disconnected, with each solution addressing a specific problem," Serughetti notes. "Companies need to evolve from point prototyping solutions to an end-to-end prototyping solution to enable the right SoC architecture, the shortest time to quality software, reduced risk with pre-silicon software bring-up and early validation inside real-world conditions."

In fact, multiple solutions need to be brought together that span engines and abstractions. "Connecting a virtual prototype to FPGA-based prototypes provides an end-to-end prototyping solution, while delivering the best solution for each design task and also building on each other throughout the design and development process. Hybrid prototypes need to support the integration of multiple prototyping technologies," concludes Serughetti.
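As a rough illustration of the hybrid idea, the sketch below stands in a fast host-side "virtual CPU" for the proven part of the SoC and a proxy object for new RTL mapped onto an FPGA board. The classes, register map and values are hypothetical; a real flow would use the prototyping vendor's transactors and transport, not plain Python objects.

```python
# Conceptual sketch of a hybrid prototype: a fast virtual model of the mature part
# of the SoC (the "CPU" below) talks through a transaction bridge to new RTL on an
# FPGA board. Everything here is an invented stand-in for illustration.
class FpgaAcceleratorProxy:
    """Stands in for new RTL running on the FPGA; accessed only via transactions."""
    def write_reg(self, addr, value):
        print(f"[fpga] reg[{addr:#x}] <= {value:#x}")

    def read_reg(self, addr):
        print(f"[fpga] reg[{addr:#x}] read")
        return 0x1                                 # pretend the block reports "done"

class VirtualCpu:
    """Fast host-side model that runs the real driver code against the proxy."""
    def __init__(self, device):
        self.device = device

    def run_driver(self):
        self.device.write_reg(0x00, 0xDEAD_BEEF)   # program the job
        self.device.write_reg(0x04, 0x1)           # kick it off
        while not (self.device.read_reg(0x08) & 0x1):
            pass                                   # poll for completion

VirtualCpu(FpgaAcceleratorProxy()).run_driver()
```

The point of the split is that the software team gets the speed of the virtual model for everything that is already proven, while the new block is exercised as real RTL at FPGA speed.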

Many are looking at these integration challenges. “The issue is how to connect them so that you bring the benefits of each into a single environment while providing ease of use—and without big changes in existing methodologies,” states Zibi Zalewski, hardware division general manager for Aldec. “On one end you have slow bit-level simulation with the increased complexity of design and testbenches, while on the other end we have much faster emulation with the ability to run more and longer tests. In addition, working at the transaction-level requires learning new methodologies and migrating tests from the bit level to the transaction level.”

The testbench is one area where unification is required. "The adoption of UVM is growing and helps to integrate the simulator and emulator," explains Zalewski. "Having the design tested by a UVM testbench, which is transaction-based, enables both the simulator and emulator to re-use the same testbench, keep the same testing methodology, and benefit from much faster emulation. That results in wider test coverage and shorter test execution times." One technology that enables this is the Accellera Unified Coverage Interoperability Standard (UCIS), along with a significant effort to define a joint formal and simulation methodology.
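The portability Zalewski describes comes from keeping the test itself at the transaction level and swapping only the layer that turns transactions into pin activity. The sketch below illustrates that separation in Python rather than SystemVerilog/UVM; the class names and transaction fields are invented.

```python
# Conceptual sketch (in Python, not UVM) of why a transaction-level testbench can
# drive either a simulator or an emulator: only the back end that converts
# transactions into pin wiggles changes.
from dataclasses import dataclass

@dataclass
class BusWrite:              # one abstract transaction
    addr: int
    data: int

class SimulatorBackend:
    """Signal-level BFM driven cycle by cycle inside an HDL simulator."""
    def send(self, tr: BusWrite):
        print(f"[sim] drive pins for write {tr.data:#x} -> {tr.addr:#x}")

class EmulatorBackend:
    """Transactor inside the emulator; the host only exchanges transactions."""
    def send(self, tr: BusWrite):
        print(f"[emu] ship transaction write {tr.data:#x} -> {tr.addr:#x}")

def run_test(backend):
    # The test never mentions pins or clocks, so it is reusable as-is.
    for i in range(3):
        backend.send(BusWrite(addr=0x1000 + 4 * i, data=i))

run_test(SimulatorBackend())   # same test, simulation
run_test(EmulatorBackend())    # same test, emulation
```

Because the test body is identical in both runs, coverage collected from each engine can be merged, which is where a common coverage database format such as UCIS comes in.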

Formal may be branching out into other fields as well. Zalewski says that "for 2016 I see more opportunities for formal verification in other areas. Qualification of output from tools could be one such activity, such as checking the design as processed for emulation. In the case of faulty emulation results there is always the question, 'Is that a tool or design issue?' Tool qualification gaps could be filled by formal verification in many different industry areas."
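Zalewski's qualification question is, at heart, an equivalence check: prove that the design the tool produced still behaves like the original. The toy sketch below makes the point by exhaustive enumeration over a three-input function; real flows rely on formal engines rather than brute force, and both functions here are invented examples.

```python
# Toy illustration of qualifying a tool transformation by equivalence checking:
# show that the design the tool produced still matches the original, or return a
# counterexample that separates a tool issue from a design issue.
from itertools import product

def original(a, b, c):        # reference behavior
    return (a and b) or (not a and c)

def after_tool(a, b, c):      # e.g. the netlist an emulation compiler produced
    return (a and b) or ((not a) and c) or (a and b and c)   # redundant but equivalent

def equivalent(f, g, n_inputs):
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return False, bits   # counterexample: a genuine tool or design problem
    return True, None

ok, cex = equivalent(original, after_tool, 3)
print("equivalent" if ok else f"mismatch at input {cex}")
```

If the check passes, a failing emulation run points back at the design; if it fails, the counterexample pins the blame on the tool flow.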

Accuracy
EDA tools and flows rely on models, and the demands placed on those models have been changing. "SoC optimization has multiple objectives," says Chi-Ping Hsu, senior vice president and chief strategy officer for EDA products and technologies at Cadence. "There is a power concern, a thermal concern and performance concerns. The integration, the form factors and the number of software applications required for real-time complex systems can pose challenges. Design teams need to weigh these important considerations, and a lot of it has to be done up front, involving architectural tradeoffs. With all of the technology choices such as 2.5D and flex PCB, designers have to be mindful of the impact of their choices. The early analysis has to be accurate enough; otherwise you may over-design. The target applications drive optimization across multiple domains and with multiple fabrics. At the same time, noise issues and power integrity issues come up and they are all inter-related. So, the level of mixed technology, mixed fabrics and mixed objectives is increasingly becoming a mainstream concern. There is a lot of integration between different industries and different tools that will gradually force the suppliers to work more closely with the users."

Tool speed is always being pushed to the limits, but this is made more difficult when additional accuracy is also required. "The need for EDA tool accuracy is acute," says Zhihong Liu, executive chairman for ProPlus Design Solutions. "We predict that this trend will continue and become more and more apparent as more project teams move to advanced nodes with higher accuracy requirements and higher cost and risk to design and fabricate a chip. Additionally, we can predict the need for better interaction or a common tool platform between process development and circuit design. This will help designers evaluate, select and adopt new process platforms, and make more competitive designs at advanced technologies like finFET at 16/14nm, 10nm, or even 7nm."

Miscellaneous
EDA has always generated big data, but the industry is now seriously looking at some of the new developments in this area, as well as data mining, analytics and more. While few people in the industry are ready to tip their hand in this area yet, there is a lot being invested and we may see early results in 2016. "Big data analytics is driven by the large amount of data from timing, physical verification, electrical integrity checking, etc., and a strong need for ensuring efficient use of on-chip power/ground networks prior to tape-out through early correlation of the data," says Norman Chang, vice president and senior product strategist for ANSYS. "There is an increased need for the analysis to go beyond the chip to the package and system, the whole design ecosystem, which further strengthens the case for big data analytics."
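The kind of cross-domain correlation Chang points to can be pictured very simply: pull per-region numbers out of two different analyses and check whether they move together. The IR-drop and timing-slack values below are invented samples used only to show the mechanics.

```python
# Minimal sketch of correlating results from two analysis domains, e.g. whether
# regions with higher IR drop also show degraded timing slack. Data is invented.
def pearson(xs, ys):
    """Pearson correlation coefficient, computed directly for self-containment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ir_drop_mv = [12.0, 18.5, 25.1, 30.2, 41.7, 47.3]   # per-region IR drop (mV)
slack_ps   = [85.0, 70.0, 62.0, 40.0, 22.0, 10.0]   # per-region worst setup slack (ps)

r = pearson(ir_drop_mv, slack_ps)
print(f"IR drop vs. timing slack: r = {r:.2f}")      # strongly negative for this data
```

The production problem is, of course, doing this across billions of instances and many analysis domains at once, which is exactly why it becomes a big-data exercise.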

Security is a growing concern and will be an area of growing investment. "One of the most complex challenges for designers will be to understand security options and implementations for connected devices and the market requirements for individual segments," says Ron Lowman, strategic marketing manager for IoT within Synopsys. "There are many different options to implement some sort of security; however, implementing a design that is truly secure, inexpensive and flexible enough to meet market and regulatory requirements will be a challenge."

And what about consolidation within the industry? As EDA, IP and foundries continue to get closer together, it would seem logical to assume that consolidation between them will continue. "The Big Three will continue to expand their portfolios with companies that are part of the electronic system design ecosystem, rather than focusing on traditional EDA companies," says Bell. "Automotive, embedded software, software design, and also design IP will be target areas that receive attention in 2016, so we should see acquisitions there."

(And don’t forget to check back in December when we take a look at how well they did in their predictions.)


