2017: Tool And Methodology Shifts

Second of two parts: System definition to drive tool development, with big changes expected in functional verification.


As the markets for semiconductor products evolve, so do the tools that enable automation, optimization and verification. While tools rarely go away, they do bend like plants toward light. Today, it is no longer the mobile phone industry that is defining the direction, but automotive and the Internet of Things (IoT). Both of these markets have very different requirements, and each creates its own pressures.

Part one of this two-part series about what’s expected to change focused on manufacturing and end markets. This part focuses on tools and methodologies. (As in previous years, Semiconductor Engineering will look back on these predictions at the end of the year to see who was right and who got it wrong. You can find the results of last year’s predictions here and here.)

Redefining systems
The definition of a system is always the thing above what you are working on. “Systems and IP will be the watchwords in 2017 as the semiconductor design ecosystem advances from chip-centric (integration of transistors) design to system-centric (integration of functional blocks) design,” says Bob Smith, executive director of the Electronic System Design Alliance (ESDA). He sees the alliance having to work across the entire design ecosystem as it makes that transition.

An important part of that vision is making IP more portable. “The semiconductor ecosystem and its customers will have a new resource in 2017 with the emergence of an online marketplace that connects demand with a global community of IC and IP designers,” says Mike Wishart, CEO of efabless. “The acute need for customized silicon for smart hardware products will be met by a community of unaffiliated designers on affordable, re-purposed 180nm nodes with libraries of proprietary and open-source processors and on-demand analog and mixed signal IP.”

Wishart is not the only person to share this view. According to Bob Ledzius, president and CEO of Concertal Systems, a startup that hopes to make a splash in the area of system design automation this year, “the ability to simply and quickly automate IP integration and verification with system use optimization capabilities will become a reality in 2017.” He also sees systems that allow for an open market between IP developers and integrators.

What goes into a chip constantly evolves. “The trend to integrate RF has been proven with multiple new IC releases but many have been a bit delayed due to the upcoming introduction of Bluetooth 5,” says Ron Lowman, strategic marketing manager for IoT at Synopsys. “Designs have been reluctant to take risks of integration when new interoperable standards are being introduced. The released Bluetooth 5 specification will expedite technology adoption of monolithic solutions and RF integrated into MCUs because it has broader appeal to satisfy additional applications including location services, smart home and beyond.”

Another new structure being integrated into SoCs is the embedded FPGA. “Embedded FPGA IP will continue to emerge as an exciting new technology for chip designers, from MCUs to networking,” says Geoff Tate, CEO of Flex Logix. “Embedded FPGA is emerging because of the increasing design time and cost of chip design, meaning that chips need to be more flexible. Several companies are now offering highly differentiated embedded FPGA technologies combining soft IP, hard IP and programming software.”

Chips are parts of boards, which are parts of something bigger. “How do you connect and verify multi-board systems?” asks Wally Rhines, CEO of Mentor Graphics. “Big system companies that design aircraft, cars and trains have adopted enterprise design automation software. Large, complex organizations can now monitor the changes occurring in a design in real time, connecting the divisions of a company that once operated on an autonomous basis. When someone makes a change to a design in one part of the organization, how does it impact other aspects of the design? This is now monitored. Advanced enterprises know exactly what aspects of a complex system design are affected by changes in other parts of the design.”

Functional verification has become the biggest driver for the EDA industry. First it was emulation; now, in 2017, it will be the technologies that surround emerging standards. “From a verification perspective, we will see further market specialization in 2017 – mobile, server, automotive (especially ADAS) and aero/defense markets will further create specific requirements for tools and flows,” says Frank Schirrmeister, senior product management group director in the System & Verification Group of Cadence. “The IoT, with its specific security and low power requirements, really runs across application domains. Verification will become a whole lot smarter.”

Pressed further on the issue of specialization, Schirrmeister adds that, “verification flows will become more application-specific in 2017, often centered on specific processor architectures. For instance, verification solutions optimized for mobile applications have different requirements than for servers and automotive applications or even aerospace and defense designs. As application-specific requirements grow stronger and stronger, this trend is likely to continue going forward, but cross-impact will also happen.”

For more mature technology areas, such as emulation and formal, integration is becoming the buzzword. “The fastest growing sector of EDA during the last five years has been emulation,” says Rhines. “It has grown more than 100%. EDA companies have accelerated their spending on emulation hardware and, more recently, on the software tools that use emulation data to analyze problems. The addition of software applications to existing emulation platforms was a major change in 2016. One of the most significant examples was the incorporation of power analysis in the emulator. Power analysis has become so critical in the design of large integrated circuits that UPF-based simulation is no longer adequate. So the EDA companies that provide emulators have incorporated power analysis using the leading power analysis tools.”

Software is becoming an important aspect of verification. “The various use models for core emulation like verification acceleration, low power verification, dynamic power analysis, post-silicon validation—often driven by the ever-growing software content—will extend further, with more virtualization joining real world connections,” says Schirrmeister. “FPGA-based prototypes address the designer’s performance needs for software development, using the same core FPGA fabrics. Therefore, differentiation moves into the software stacks on top, and the congruency between emulation and FPGA-based prototyping using multi-fabric compilation allows mapping both into emulation and FPGA-based prototyping.”

So what can we expect for 2017? “More use models will emerge in 2017 as hardware emulation moves to support the emerging automotive and IoT and safety and security SoC markets,” says Lauro Rizzatti, a verification consultant. “Tracking the interaction of the embedded software with underlying hardware is essential to perform accurate power estimation. Also, the insertion of logic in modern SoC designs calls for gate-level verification, adding another dimension to the complexity of the verification task.”
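
The idea of tracking software/hardware interaction for power estimation can be illustrated with a toy calculation. Real emulators stream switching activity into dedicated power-analysis tools; the block names, per-phase toggle counts and energy weights below are invented purely for the example.

```python
# Toy activity-driven power estimation: switching activity captured per
# hardware block during each software phase, weighted by a per-block
# energy-per-toggle figure. All numbers are illustrative, not real data.

# toggles observed per block, per software phase
ACTIVITY = {
    "boot":  {"cpu": 9_000, "dma": 100,   "radio": 0},
    "xfer":  {"cpu": 2_000, "dma": 8_000, "radio": 500},
    "sleep": {"cpu": 50,    "dma": 0,     "radio": 20},
}

# energy per toggle for each block (arbitrary units)
WEIGHT = {"cpu": 1.0, "dma": 0.5, "radio": 2.5}

def phase_power(phase):
    """Weighted toggle sum: relative power for one software phase."""
    return sum(WEIGHT[block] * toggles
               for block, toggles in ACTIVITY[phase].items())

for phase in ACTIVITY:
    print(phase, phase_power(phase))  # xfer dominates: DMA-heavy phase
```

Tying each phase back to the software that caused it is what emulation-based power analysis adds over a purely structural estimate.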

Another maturing area is formal. “Formal is already playing a central role in the RTL verification plans of most leading semiconductor design companies,” says Roger Sabbagh, vice president of applications engineering for Oski Technology. “What’s next in the evolution of formal is a move upward from verifying RTL implementations to verifying system-level properties of architectural designs. Using simulation to verify a full system at the RTL level is no longer practical. On the other hand, when operating on the architectural design, formal is a good fit for verifying system-level design requirements such as coherency, absence of deadlock, safety and security. This will be the next growth area in the adoption of formal verification.”
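
A system-level property such as absence of deadlock can be checked by exhaustively exploring an architectural model's state space, which is the kind of analysis formal tools perform. The sketch below is a deliberately tiny hand-rolled explorer over the classic two-agents/two-locks scenario, not a real formal engine; all names and state encodings are invented for illustration.

```python
from collections import deque

# Architectural toy model: agent 1 acquires lock A then B; agent 2
# acquires B then A. pc stages: 0 = needs first lock, 1 = needs second,
# 2 = holds both (will release), 3 = done. a/b record the lock holder.

def successors(state):
    pc1, pc2, a, b = state
    out = []
    if pc1 == 0 and a is None: out.append((1, pc2, 1, b))    # 1 takes A
    if pc1 == 1 and b is None: out.append((2, pc2, a, 1))    # 1 takes B
    if pc1 == 2:               out.append((3, pc2, None, None))  # 1 releases
    if pc2 == 0 and b is None: out.append((pc1, 1, a, 2))    # 2 takes B
    if pc2 == 1 and a is None: out.append((pc1, 2, 2, b))    # 2 takes A
    if pc2 == 2:               out.append((pc1, 3, None, None))  # 2 releases
    return out

def find_deadlock(init):
    """Breadth-first exploration of the full state space; return a state
    with no successors in which the agents are not both finished."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        nxt = successors(s)
        if not nxt and (s[0], s[1]) != (3, 3):
            return s  # no transition can fire: deadlock
        for n in nxt:
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return None

# Each agent holds one lock and waits for the other's:
print(find_deadlock((0, 0, None, None)))  # -> (1, 1, 1, 2)
```

Checking this at the architectural level means the bug is found before any RTL exists, which is precisely the move "upward" that Sabbagh describes.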

Formal is also being integrated into the flow in a tighter manner. “Tighter integrations between simulation and formal, including models, joint debug and functionality where the formal method can pick up from simulation and emulation scenarios for bug hunting, will drive adoption among simulation engineers,” says David Kelf, vice president of marketing for OneSpin Solutions. “Single company flows will be met with resistance as the mix and match approach takes hold and end users won’t be restricted to using the formal tool that works with their simulator.”

The big change for 2017
Formal and emulation expectations for 2017 represent evolutionary change. But there is a big change that will happen in 2017. “Standards groups, such as Accellera and the IEEE, have been busy on the design automation front and we’ll see those results in 2017,” says Michiel Ligthart, president and CEO for Verific Design Automation. “These include a new UVM standard, IEEE 1800.2, that could be ratified in early 2017. The next SystemVerilog release will be voted on in early 2017 with an IEEE release later in the year, but the big one is the Accellera Portable Stimulus Working Group’s portable test and stimulus specification language.”

“2017 will be the year of deployment for portable stimulus (PS),” says Tom Anderson, product management director for Cadence’s System & Verification Group. “Many major semiconductor and systems vendors are already using this technology and several commercial EDA tools are available. Usage will grow dramatically over the next year, driven in part by the initial release of the standard, which will position this approach for mainstream adoption in 2018 and beyond.”

Portable Stimulus addresses the verification reuse issue. “Verification reuse between engines and cross-engine optimization will gain further importance,” says Schirrmeister. “Besides horizontal integration between engines—virtual prototyping, simulation, formal, emulation and FPGA-based prototyping—the vertical integration between abstraction levels will become more critical in 2017 as well.”

So what is Portable Stimulus? Adnan Hamid, chief executive officer of Breker, explains: “The objective of PS is to be able to write your verification intent once, and be able to use it at all stages of silicon realization. From a PS model, it is possible to generate constrained random test-cases—just as in UVM. The generated tests can be self-checking, instead of requiring a separate scoreboard implementation. It is also possible to get a metric of design intent coverage directly from the model. Finally, a single PS model can be used as input to synthesize tests for a variety of target execution platforms.”
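
A rough sense of that workflow can be given with a toy model. The actual Accellera language is declarative and far richer; here, verification intent is just a hypothetical action graph with scheduling dependencies, from which the same model yields randomized legal tests, an intent-coverage metric, and retargeted output for different platforms. All action and function names are invented for the example.

```python
import random

# Verification intent as a dependency graph: action -> prerequisites.
INTENT = {
    "init_dma":  [],
    "load_buf":  ["init_dma"],
    "dma_xfer":  ["load_buf"],
    "check_crc": ["dma_xfer"],
    "irq_ack":   ["dma_xfer"],
}

def generate_test(seed):
    """Pick a random legal ordering of the intent graph: a constrained-
    random test-case, analogous to what a UVM sequence would produce."""
    rng = random.Random(seed)
    done, test = set(), []
    while len(done) < len(INTENT):
        ready = [a for a, deps in INTENT.items()
                 if a not in done and all(d in done for d in deps)]
        action = rng.choice(ready)
        done.add(action)
        test.append(action)
    return test

def intent_coverage(tests):
    """Fraction of modeled actions exercised by a set of tests."""
    hit = {a for t in tests for a in t}
    return len(hit) / len(INTENT)

def emit(test, platform):
    """Retarget one abstract test to a platform-specific form, e.g.
    sequence calls for simulation or bare-metal calls for silicon."""
    if platform == "sim":
        return [f"seq.do_{a}();" for a in test]
    return [f"{a}();" for a in test]  # bare-metal style

tests = [generate_test(s) for s in range(3)]
print(intent_coverage(tests))     # every modeled action is reached
print(emit(tests[0], "sim")[0])   # init_dma is always first: it alone has no deps
```

The single `INTENT` model drives generation, checking coverage, and retargeting, which is the "write once, use everywhere" claim in miniature.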

Portable Stimulus will thus act as the connection point for many aspects of the verification flow. “Verification planning continues to be important,” says OneSpin’s Kelf. “With the advent of better coverage techniques, an abstract testbench, in the form of Portable Stimulus, and more rigorous verification in general, verification closure will reach a new plateau. Coverage metrics that relate directly to the verification plan, and are tracked throughout the process across multiple platforms, will become more meaningful and effective.”

If the user excitement that surrounds Portable Stimulus is a reliable gauge, this is a technology that will see quick adoption rather than the 10-year adoption cycles that are typical of EDA technologies. It may, however, take 10 years to fully appreciate the things that could be driven from a model of verification intent.

Related Stories
2017: Manufacturing And Markets (Part 1)
The future of Moore’s Law, new architectures and packaging, and a spike in automotive, artificial intelligence and virtual/augmented reality.
CEO Outlook: Chip Design 2017
Political uncertainty, tempered optimism, continued consolidation, and concerns about capacity.
Formal’s Roadmap (Part 3)
The breadth of adoption of formal, technology breakthroughs and the challenges created by machine learning.
