An action-packed day at DAC, where all extremes of the EDA problem space can be covered under one roof.
The industry and users have a love/hate relationship with UVM. It has quickly risen to become the most widely used verification methodology, and yet at the same time it is seen as overly complex, unwieldy and difficult to learn. The third day of DAC got started with a breakfast with Accellera to discuss UVM and what we can expect to see in the next five years. The discussion was led by Tom Alsop, principal engineer at Intel. Alsop's first question to the panelists was: where do you see UVM in the next five years?
Warren Stapleton, senior fellow at AMD, said that we are here because of UVM's success. Now we have to make it more universal, and we need to make it interact with collateral from other groups. The multi-language working group is starting to address this. UVM is good for IP-level verification but starts to lose steam when you look at system-level efforts.
Faris Khundakjie, senior technical lead at Intel and chair of the portable stimulus effort, said that there has been a unification of methodologies, and today we have just one. It has increased the skill level of verification engineers and brought in good practices such as layering and packaging. We see great value in just-in-time randomization, and it has started to influence higher levels of verification. There are also limitations, such as cross-platform use, and that is what portable stimulus is attempting to address. UVM was not created to address these issues and was centered on simulation. UVM has to decide whether it should continue to work in the same ways or concentrate on shift left.
Jonathan Bromley, verification consultant at Verilab, quipped that monks used to "sell" indulgences, and UVM has done that for us. It has been a catalyst for raising the level of discussion and has created good tools. UVM has failed in some small ways: it is not prescriptive enough, and there is no well-established set of recommended practices. It is unfair to criticize the efforts, but some parts came too little and too late. Do we want to extend into mixed signal? The level of discussion there needs to be raised.
Mark Glasser, principal verification engineer at Nvidia, completed the lineup, saying there are a lot of tools in various domains, and portable stimulus is now gaining momentum. They are all focused on building the same piece of silicon. They need to work together and possibly become the same thing eventually. This includes SystemC models as well as portable stimulus.
The discussion itself was very much about the issues and problems surrounding UVM today and the need for recommended practices.
The day continued with both a visionary talk and a keynote. The visionary talk was given by Lou Scheffer, titled Learning from Life: Biologically Inspired Electronic Design. Want to know about worms and machine learning? Scheffer talked about how biology does it, and argued that we need to study the brain more to find out how it does things such as pattern recognition.
Sameer Halepete, Nvidia
The keynote followed. Sameer Halepete, VP for ASIC design at Nvidia, talked about driving the next decade of innovation in visual and accelerated computing. "Things are getting to be more fun now that the rapid pace of process node changes has slowed and the gains from going there are smaller." He talked about the demand for compute performance and what might be able to use that kind of power. He focused on virtual reality and the demands it will place on compute power. Finally, he talked about the advances necessary in EDA tools and the impact of partnering with EDA tool providers.
The middle of the day was filled with vendor discussions, roundtables and other data-gathering operations to assemble the raw material for upcoming articles. In the afternoon, there were several interesting panels. The first panel set itself up to be "The Great Simulation/Emulation Faceoff." The title is obviously clickbait, because it is evident that both are needed, in addition to formal, prototyping and real silicon, as part of the toolset necessary to conduct effective verification. The devil is in the details of such panels.
Running in parallel, Ed Sperling moderated a panel that looked into TSVs and the state of 3D-IC integration. One area of discussion was the need to design the package and the silicon together; otherwise teams will run into a number of problems.
Immediately following that, a third panel attempted to define "What's The Future of DFT?" Panelists ranged from looking forward toward big data and multi-die integration to looking back at modifying the design to do test better. "This is not just about adding scan chains," said Jeff Rearick, senior fellow at AMD. "Can we make a tool that adds self-test to a design?"
Joe Sawicki, VP and General Manager for the Design-to-Silicon Division of Mentor Graphics, contends that designers do not like test because it is seen as a product engineering issue. “But this is changing. It is becoming a part of the specification and thus the designer’s responsibility.” Sawicki is looking into areas such as automotive where this kind of function is mandated during operation.
An interesting discussion looked at the intersection of test and security, two system attributes that would appear to be at odds with each other.
The day was completed with cocktails with ClioSoft, followed by an evening on the bank of Lake Austin, courtesy of Breker, where no boats were allowed onto the water because of the extremely high water levels.