Tuesday At DAC 2018

Where AI can go and the problems in getting there, plus how finance sees EDA.


The morning starts with the Accellera Breakfast. Accellera has made significant progress this year, and we can expect to hear later in the conference about the approval of the Portable Stimulus 1.0 specification, the initial release of SystemC CCI, and a proposal for the creation of an IP Security Assurance Working Group, which would discuss standards development to address the security of IP. “There are security implications when users do not have enough information to fully understand the security risks that may be introduced by COTS IP,” said Mike Borza of Synopsys. The goal of the group is to create a standard method or mechanism by which to communicate that information. To date, 42 members from 19 companies, spanning IP vendors, silicon providers and EDA companies, have signed up to work on this.

Today’s keynote, “Pushing the Limits of Physics, Architectures & Systems for AI,” was given by Dario Gil, VP of AI and IBM Q at IBM Research. He started his presentation by saying: “AI is the new IT. This creates a lot of hype, but it is also the most important trend of today. Enrollment in machine learning courses has gone from 40 a few years ago to several thousand today.” Gil pointed out that the inflection point came in 2012, when enough computational power became available to change the problem from feature discovery defined by a human to learning.

He believes that AI is poised to be the next frontier for design automation and talked about some internal tool developments within IBM and the improvements they are delivering beyond what is possible for expert users. “We are in the infancy of where AI can go. We are still a long way from true learning systems. We must create AI that is less of a black box, something we can explain, debug and that can deal with errors. A black box is unacceptable to many professions and some product areas. We have to consider security and ethics. We have to ensure that we do not continue to propagate bias with our results.” The talk then progressed to the architectures and physics that will be needed to keep providing the necessary increases in computation, including quantum computers. Expect to see a full report on this keynote in the near future.

Jay Vleeschhouwer of Griffin Securities provided his view on the state of EDA. “There are two arms races going on: software development and silicon development. Microsoft and other companies have become devoted to silicon development. This is positive for the EDA industry and it has been reflected in how well the EDA companies have done over the past few years. The profitability has also been improving and this is essential to their ability to fund R&D. The tools available today are a result of prior investments and prior profits.”

He provided plenty of charts and breakdowns of revenue across companies, geographies, product categories and more. “Surprisingly, there has been an uptick in synthesis recently,” said Vleeschhouwer. “The systems industry is also becoming a bigger provider of revenue for EDA.” He also confirmed that the wave of consolidation in the semiconductor industry had resulted in only a small change for EDA after a short slowdown. A few interesting snippets: EDA spending accounts for 5% of Intel’s R&D budget; Cadence has a larger profit margin than Synopsys; EDA is an R&D-intensive industry and continues to invest at increasing levels; ANSYS now has a 3.5% to 4% share of the EDA market; Cadence’s backlog is growing faster than Synopsys’; Intel accounted for almost 16% of Synopsys’ revenue, and that share is growing; open job positions are a leading indicator for a company; and China has become the second-largest market for IP, surpassing Europe and Japan.

Lunch was again provided by Cadence, with a session titled “Monster Chips: Scaling Digital Design Into the Next Decade.” “Coming up with a flow that is predictable is a challenge, no matter what size the chip is,” said Anand Sethuraman, senior director of ASIC products for Broadcom. “When we look at machine learning chips, one of the biggest challenges is power.”

“Reliability has not gotten a lot of attention, but for some industries you have to think about it day in and day out,” added Anthony Hill, fellow and director of backplane technology at Texas Instruments.

“Our designs are 1-3 million instances, but we would like to move up to 10, 20 or even 30 million instances,” said Patrick Sproule, director of hardware at Nvidia. “We lose efficiencies with small blocks. We know there are optimizations out there that we could get if we went with larger block sizes.”

Sethuraman added that as the number of blocks goes up, the QoR degrades. “There are times when we have to split a block in order to get it to work and then we see better results from the tools.”

Unfortunately, I had to leave this event early to prepare for a panel I moderated that asked, “Have Verification and Validation Engines Become a Commodity?” Panelists were Dave Oesterreich, fellow at Cypress; Mark Glasser, principal verification architect for Nvidia; Ram Narayan, director of verification for Arm; and Faris Khundakjie, senior verification technical lead at Intel. A full writeup of the panel will appear, and I can assure you it contains plenty of controversial comments. What report card would you write up for the EDA industry?

Another panel followed: “The Automotive Digital Twin – Virtual or Virtually Impossible.” Vishal Kapoor opened by talking about the challenges associated with autonomous driving. Digital twins must exist throughout the development and production lifecycle, and they must be 100% digital before any physical hardware exists. Bill Taylor of kVA posited that “designers typically think in the positive space and then think about how to achieve that function. But the caveat is that this is when it is operating correctly. What does it do when it is not operating correctly? The negative space is larger.”

Kurt Shuler of Arteris IP said that “there is good news and bad news. The good news is that we tend to do things at multiple levels of abstraction, we have good verification/validation flows in place, and hardware functional mechanisms are already being implemented at very low levels and don’t require system-level knowledge. The bad news is that verification and validation of non-deterministic systems is still emerging, hardware functions and functional safety mechanisms are not tightly tied to software, and it is too difficult for integrators to understand and assess suppliers because the technology is so specific and deep.”

Dwight Howard of Aptiv noted, “In my universe I see both the IC and system level and simulation is a standard part of the flow. But the accuracy of the models is a challenge. We also need to model software features. Tools are not fully there yet.”

Finally, Sanjay Pillay of Austemper Design opened with “the need to shift the development cycle left. There are two reasons for it. The first is the increased complexity and the acceleration of features coming into the system. The second is the safety concerns and the risk to life and property during physical testing of autonomous driving systems, which are pushing the industry toward more virtual testing.”

A third panel asked, “Will the Era of AI Drive Emerging Technologies to Overtake CMOS?” An Chen of IBM Research pointed out that few novel technologies have been shown to surpass CMOS for Boolean logic, but asked whether AI creates a different set of requirements that could make other technologies more competitive. Geoffrey Burr of IBM believes the answer lies with new non-volatile memories, which are essentially analog memories. Meng-Fan Chang of National Tsing Hua University thinks we have to move beyond von Neumann architectures, most likely to in-memory computing. Hsien-Hsin Sean Lee of TSMC said the answer may differ from when the same question was asked 10 years ago only because of a new application, and that the driver will be power. Finally, Kaushik Roy of Purdue University also looked at the power and efficiency gap that exists today and asked where that power is being unnecessarily consumed. More details about this panel will follow.

Have you ever wanted to be a fly on the wall during a discussion with high-level people from multiple semiconductor companies? In the past, Semiconductor Engineering has been invited to be that fly during discussions with formal advocates, arranged by Oski Technology. The third of those will be published on Thursday. Tonight, I got to listen to VPs and other execs talk about design super bugs.

Check out DAC highlights from Monday and Wednesday.


