Verification and validation of IP has moved well beyond simple simulation, leaving the industry scrambling for new solutions amid growing problems.
At the Design Automation Conference this year, the Designer and IP tracks were the stars of the show in many ways. These sessions catered to industry rather than academia and provided engineers with information they could directly use in their jobs. Many of the sessions were filled to capacity and Anne Cirkel, general chair for the 52nd DAC, was enthusiastic about the growing success of these tracks.
Paper sessions were short and to the point, and many finished with a brief panel discussion.
One of those panels was titled Key Challenges of Verification and Validation of Modern Semiconductor IP. The panelists were Tom Anderson, vice president of marketing for Breker; Toshio Nakama, CEO of S2C; Bernie Delay, group director for verification IP at Synopsys; Raik Brinkmann, president and CEO of OneSpin Solutions; and Frank Schirrmeister, group director of product marketing for the System Development Suite at Cadence. What follows are excerpts from that panel.
Fifteen years ago, the industry recognized the importance of design IP and the tremendous productivity gains that came along with it. But design reuse without verification reuse has been called use-less. Where is the industry today with verification reuse, and does it provide the same levels of productivity gain?
Anderson admitted: “No, and we are not even close. We have partially solved one of three problems. The Universal Verification Methodology (UVM) does address testbench reuse across projects at the same level of the hierarchy. We have not addressed vertical reuse from IP to sub-system and full chip, nor have we addressed horizontal reuse.”
Horizontal reuse is of increasing importance, given the inability of simulation to keep up with the necessary performance increases, and UVM cannot take the same testbench from simulation to emulation or real silicon because of different execution requirements, mostly centered on the embedded processors.
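As a loose illustration of the horizontal-reuse idea, the Python sketch below separates the test intent from the engine that executes it, so the same scenario description can be retargeted from simulation to a faster platform. The engine classes and the scenario are invented for this example; they are not any vendor's API.

```python
# Illustrative sketch only: test intent captured once, executed on different engines.
# Engine classes and the scenario are invented for this example, not a real tool API.
from abc import ABC, abstractmethod


class Engine(ABC):
    """An execution platform: simulator, emulator, prototype or silicon."""

    @abstractmethod
    def run(self, stimulus):
        ...


class SimulationEngine(Engine):
    def run(self, stimulus):
        print(f"[sim] driving {len(stimulus)} steps through the UVM testbench")


class EmulationEngine(Engine):
    def run(self, stimulus):
        print(f"[emu] compiling {len(stimulus)} steps into an embedded test program")


def dma_copy_scenario():
    """Engine-neutral test intent: what to exercise, not how to drive pins."""
    return ["configure_dma", "start_transfer", "wait_for_interrupt", "check_memory"]


if __name__ == "__main__":
    scenario = dma_copy_scenario()
    for engine in (SimulationEngine(), EmulationEngine()):
        engine.run(scenario)  # same intent, different execution platform
```

The point of the separation is the one the panel raises: the description of what to test survives the move between engines, while the way it is driven does not.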
Nakama talked about the increasing number of use models that verification must support. “We need system test reuse, and not just for hardware testing, but for software testing as well. We also have to worry about things such as power.”
Delay talked about how the challenges in verification extend beyond the creation of tests into areas such as debug. It is common to see figures in the industry showing that half of a design or verification engineer’s time is spent in debug.
Brinkmann talked about the value that verification IP can provide but admitted that “it doesn’t solve the IP integration problem.”
Schirrmeister was not in complete agreement with the other panelists, saying that “the verification reuse problem, as defined at the time, is solved. That was the pure IP level. But the challenge and the complexity have grown and new items of reuse have been added.” Schirrmeister also pointed to an additional type of reuse that is needed, between people. “No one tends to understand everything about a design, so you need reuse between specialists, such as someone specializing in the cache coherency problem, the power specialist, someone dealing with SoC integration, and so on.”
While the quality of design IP always has been an issue, and companies will thrive or fail based on bugs found in their designs, an audience member wanted to know what high-quality verification IP means. This could well have been prompted by some of the earlier papers, which discussed ways in which a testbench can be assessed for quality and the tools available to help with that problem.
Delay commented that you really have to look at this in the same way as you do design. “For a design you start with a test plan, functional coverage and tools that validate how well your testbench covered it. You do it exactly the same for verification IP. The advantage for verification IP is that it is used across multiple designs.”
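To make the test-plan-to-coverage loop Delay describes slightly more concrete, here is a rough Python sketch of a functional-coverage model tied to a test plan. The plan items, the sampling, and the scoring are all invented for illustration; they do not represent any vendor's coverage API.

```python
# Illustrative sketch: a tiny functional-coverage model tied to a test plan.
# The plan items and scoring are invented for this example.
class CoverageModel:
    def __init__(self, plan_items):
        # Each test-plan item becomes a coverage bin that starts unhit.
        self.hits = {item: 0 for item in plan_items}

    def sample(self, item):
        # Called by monitors as tests run.
        if item in self.hits:
            self.hits[item] += 1

    def report(self):
        covered = sum(1 for n in self.hits.values() if n > 0)
        print(f"plan coverage: {covered}/{len(self.hits)} items exercised")
        for item, n in self.hits.items():
            print(f"  {item:<20} hits={n}")


if __name__ == "__main__":
    plan = ["reset_sequence", "burst_write", "burst_read", "error_injection"]
    cov = CoverageModel(plan)
    for observed in ["reset_sequence", "burst_write", "burst_write"]:
        cov.sample(observed)
    cov.report()  # burst_read and error_injection remain uncovered
```

The same model, written once against the test plan for a protocol, is what makes verification IP reusable across the multiple designs Delay mentions.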
Schirrmeister added that the coverage aspect is very important. “Time is also an important factor. Often with IP it is a matter of how fast you can get to a standard.” Panelists also noted that verification IP for a new standard is often available before the design and acts as the reference model.
An increasing number of people in the industry want to know about verification in the context of the system. An audience member said, “It is not enough to verify that my block, such as PCI, works. I need to know that it meets the intent and that I get a certain quality of service.”
Schirrmeister was concerned that “if you find issues at the system level that are in the IP, then something went wrong before that.”
Brinkmann added, “You need IP at different abstraction levels and to verify things at different levels of abstraction. You need to be able to bring things together at a more abstract level and analyze them there.”
Delay again related the problem back to the design flow. “When we do design, we start thinking about where we want to perform various types of verification. Where are we going to do what, and how? Where do we want to use virtual prototyping, and where do we want to use emulation?”
This progressed into a discussion about the new Accellera Portable Stimulus Working Group (PSWG). Anderson, the secretary for the committee, said that “the goal of the portable stimulus group is to define an abstract model that captures the high-level design intent and the verification space for it. It is a hard problem.”
Schirrmeister added another dimension to the problem: “A new set of metrics is required. What does coverage at the SoC level mean? We have a pretty good understanding of what functional coverage and code coverage mean, but what is the equivalent at the SoC level? Scenario coverage?”
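One way to picture what “scenario coverage” could mean is the Python sketch below, which enumerates the legal end-to-end paths through a small scenario graph and reports how many of them tests have actually exercised. The graph, the action names, and the counting scheme are invented for illustration; this is not the Accellera portable stimulus language.

```python
# Illustrative sketch: "scenario coverage" as the fraction of legal paths through
# a small scenario graph that tests have actually exercised. Everything here is
# invented for illustration; it is not the Accellera portable stimulus language.
from itertools import product

# Each stage offers a choice of actions; one scenario picks one action per stage.
SCENARIO_GRAPH = [
    ("load_from_flash", "fetch_via_dma"),    # how the data arrives
    ("decode_jpeg", "decode_png"),           # which decoder runs
    ("send_to_display", "store_to_memory"),  # what happens to the result
]


def all_scenarios():
    """Every legal end-to-end path through the scenario graph."""
    return set(product(*SCENARIO_GRAPH))


def scenario_coverage(executed):
    legal = all_scenarios()
    return len(legal & executed), len(legal)


if __name__ == "__main__":
    ran = {
        ("load_from_flash", "decode_jpeg", "send_to_display"),
        ("fetch_via_dma", "decode_png", "store_to_memory"),
    }
    hit, total = scenario_coverage(ran)
    print(f"scenario coverage: {hit}/{total} legal paths exercised")
```

Even in this toy form, the metric behaves differently from code or functional coverage: hitting every individual action still leaves most of the legal combinations unexercised.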
Recently, the industry has started to look at some radical departures from the design and verification flows that have been in use for a long time. In the software world there is test-driven development, behavior-driven development and other methodologies. In the hardware world the tradition has been to develop the IP first and then the testbench. An audience member questioned whether this should be reversed. While not mentioned by name, at least one such approach is being seriously considered by some in the industry.
Delay responded that “because we are all hardware guys, it is natural that this is the way things evolved. Now we are beginning to look back and say maybe there should be other views of this from the top: from the SoC level down and, more importantly, from the software/hardware perspective and how these are tied together. If we look at the continuity of how we do software development and bring that down and start to merge the tools and methodologies, then that is the place to start.”
Schirrmeister tied the discussion back to the work of the PSWG. “When we start to bring together the software and hardware aspects we see exactly what we have today. Software is the only commonality across all engines from the virtual platform through to silicon. What is more interesting is when we start to think about the hardware domain and pre-silicon verification tied to the software testing that happens afterwards. There is a lot of commonality between what you run pre-silicon and the software tests, and with the whole shift-left concept. If this is something you need to do anyway, and can now do it earlier, then it is better, but it is a different mindset.”
Brinkmann does not see any immediate changes for the design community. “I don’t think we will change the way we are designing hardware. We will continue to do this bottom-up. I think the IPs will become more flexible, but I don’t see everything becoming a top-down flow.”
A question from the audience asked how these concepts apply to FPGAs.
Nakama was quick to point out that “design in an FPGA is slightly different because you can use trial and error.”
Schirrmeister put it into a career-limitation perspective. “The impact on your career is very different. In software, there is always service pack number 2. The latest update on my iPhone helps to reduce power. For FPGAs there is a mentality that is a little closer to the hardware world, and verification is done in a more structured manner. With an ASIC, if it comes back and it is broken… When we look at the capacity of today’s FPGA devices, we are looking at 5 million-gate designs, and this is not easily verified without using some of the structured approaches of ASIC design, but the pressure is not quite the same.”
“The time to market pressures are different with FPGAs but with a device that size, debugging is a nightmare,” added Brinkmann. “So you need a structured approach to ensure effective usage of your time.”
A final question was related to a prediction made by the late Gary Smith, who said that the size of the IP market would remain constant over the next five years. The panel was asked if the same would hold true for the verification IP market.
Delay believes that design IP and verification IP are at different points in their lifecycle. “All markets level off at some point, but the size and types of design IP keeps changing so that will keep things growing. While verification IP has been around for a while, its level of adoption is much further down the scale so the growth is still there.”
What is clear is that the entire verification world is about to be turned on its head with new tools, methodologies, metrics and languages required to solve the increasingly diverse verification challenges. It remains to be seen if the industry will collectively attempt to solve the issues, or if short-term political maneuverings will be used to capture short-term gains.