Monday At DAC 2018

What is drawing the industry’s interest? Major themes begin to emerge.


DAC #55 started with rumors flying. Will this be the last DAC as we know it? Is there a huge chasm forming between academia and the industry? Will DAC be able to make it in Las Vegas, where there is no local interest? Of course, those who have been in the industry know that this kind of speculation happens every few years, and in the '80s, Las Vegas was a very popular location for DAC. There was even a year when Cadence pulled out. The industry changes and morphs as the forces that push it fluctuate.

But DAC this year started differently. Sunday is usually fairly low key, with a presentation about the industry and a reception. All of that changed this year. For the longest time, Gary Smith provided his analysis, first when he was with Dataquest, then Gartner, then as an independent with Gary Smith EDA. After his passing, the torch was held by Laurie Balch, chief analyst for Gary Smith EDA, which has recently morphed into Pedestal Research. This year, the mantle was passed to Needham and Company.

Instead of an analysis from the inside looking out, it was an analysis from the outside looking in: how the “market” sees EDA. The presentation demonstrated that while the financial community may understand the numbers, it does not know the industry and the forces that move it. It covered the rate at which companies went public, the rate at which they were acquired, and the way the market views the industry. It did not see the changes that are beginning to happen within the industry, nor did it discuss how those changes may affect things in the future. It did recognize that the FAANG stocks are a new driver for the industry, but failed to recognize the way in which verticals are dominating its focus.

Following the presentation was the Heart of Technology (HOT) party – a fundraiser for the Gary Smith Memorial Scholarship Endowment. This is also where you get to hear some of the musical talent within the industry.

As is often the case, DAC revolves around eating and drinking, and today struck the perfect balance: breakfast with Synopsys, lunch with Cadence, dinner with Mentor, and evening drinks and socializing with a number of the independent verification companies.

The Synopsys breakfast showcased the advantages that come from a three-way partnership – an IP vendor, ARM, together with a foundry, TSMC. The three of them shared knowledge, cooperatively worked on optimizations, and made it easier for customers to get a quick start on their implementations. Synopsys highlighted some of its Fusion technology, which shares data between tools. Some examples showed considerable improvements in area or leakage, and demonstrated how machine learning is helping to find the most promising areas in which to tackle complex multi-parameter optimization problems.

The keynote was given by Sarah Cooper, GM of IoT Solutions at Amazon Web Services. Cloud is one of the main focus areas of this DAC. The title of the keynote was “Building Connected Devices that Learn and Evolve,” and it will be fully covered in a later article. She said that her job is to ask the questions that need to be solved. The fundamental requirement of the IoT is to get information about the real world and to use that data to help people. “So why has it taken so long to reach this proliferation of devices?” she asked. “The rate of adoption of technology has been increasing rapidly.”

She talked about the rate of improvement in existing products such as cars. “You buy technical debt – everything you buy will be replaced by better products almost immediately. We have to understand how buying habits are changing. Relationships are changing. We have to make products adapt and change after they have been installed. I also want products that learn about me and adapt to me.” Interestingly, she had some advice for product developers that may raise some eyebrows. “We are so used to concentrating on power and cost, but putting in a few Easter eggs is good for business.”

The focus of the talk turned to AI and handling data, with a focus on the home. She admitted that pulling everything into the cloud is just not feasible, meaning that more processing has to be done at the edge, including inferencing and, in the future, possibly training.

Monday is tutorial day, as well as the opening of the exhibition. Five morning tutorials were available and I sat in on “Harnessing Data Science for the Hardware Verification Process.” The speakers were Avi Ziv from IBM Research in Haifa, Israel, and Eman El Mandouh from Mentor, A Siemens Business, in Cairo, Egypt. The verification of designs uses many tools and produces “tons” of data. “While more data is being captured, less of the data is being used effectively, managed, analyzed and made available to the tools and people who need it,” says Ziv. Visualization, machine learning and data mining can provide a better understanding of that data. “The objective within the verification process is to get to closure faster,” says El Mandouh. After talking about the methods and tools that are available, she showed how they could be applied to the verification process, and in particular to formal verification engine selection. The second example was the identification of anomalies in execution traces, and the third was related to finding correlations between cover crosses so that coverage holes can be filled faster.
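To make the anomaly-detection example concrete, here is a minimal sketch of the kind of analysis involved – not the presenters' actual tooling, just a generic illustration. It flags entries in a simulation execution trace whose values deviate sharply from the run's overall behavior, using a simple z-score test; the metric name and sample data are hypothetical.

```python
# Illustrative sketch: flag anomalous entries in an execution trace by
# z-score. Real verification flows use far richer features and models;
# this only shows the basic idea of statistical outlier detection.
from statistics import mean, stdev

def find_anomalies(trace, threshold=3.0):
    """Return indices of trace values whose z-score exceeds the threshold."""
    mu = mean(trace)
    sigma = stdev(trace)
    if sigma == 0:
        return []  # perfectly flat trace: nothing stands out
    return [i for i, v in enumerate(trace)
            if abs(v - mu) / sigma > threshold]

# Hypothetical per-transaction latency samples from a simulation run;
# the spike at index 5 is the kind of event worth a closer look.
latencies = [10, 11, 10, 12, 11, 95, 10, 11, 12, 10, 11, 10]
print(find_anomalies(latencies))  # → [5]
```

In practice the interesting part is choosing what to measure per trace entry (latencies, event counts, state-transition frequencies); once a numeric feature exists, even simple statistics can surface the handful of simulation runs that deserve human attention.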

Lunch was supplied by Cadence and they put together a panel that was moderated by Semiconductor Engineering’s Ann Steffora Mutschler. I will leave it to her to provide a full report on this event. The title was “Smarter and Faster Verification in the Era of Machine Learning.” You may be starting to see a theme about DAC this year.

While there were several tutorials about machine learning in the afternoon, I decided to sit through 3 ½ hours of talks about design tools for verifying hardware security. Most of the speakers were from academia, implying that this is still an emerging area. It has certainly risen in importance in the past few years and is unlikely to become a solved problem any time soon. But given the smaller audience compared to some of the other tutorials, it’s not yet universally seen as a major problem.

Tim Sherwood, of UC Santa Barbara and founder of Tortuga Logic, provided the introduction. The problem is that you can never do enough verification to prove that the design does everything intended, let alone find every vulnerability that would make the design do something that it was not meant to do. Sherwood talked about one example: “There can be many levels of abstraction that exist between the vulnerability and the access mechanism for it. How long did it take a team to build a deterministic, remotely exploitable attack? Two weeks. The low-hanging fruit is often the underlying hardware and architecture. You have to respect your adversary because they have a big financial incentive to do this. Protecting against this requires a sea change, and we have to be pervasive and precise in our thinking. There is no one answer – it requires best practices from design through deployment.”

After some meetings with vendors, it became time to start winding down for the evening. First up came a reception in the Moscone lobby, followed by dinner with Mentor. This is always a fairly relaxing event after a busy day. Wally always gets up and talks, but not about the kind of things that you would expect him to say. He tends to make fun of himself, of Mentor, of the industry, and looks for patterns where you shouldn’t be looking. This evening was even more muted than normal. With dinner over, it was time to hit the dance floor with the new kids on the verification block. Thanks to OneSpin, Breker, Avery and something like 14 other sponsors that got my feet moving, my heart pumping, and made me ready for sleep and the start of a new DAC day. Check in tomorrow for the highlights.

Check out DAC highlights from Tuesday and Wednesday.
