DAC 2020 Day One

Reporter’s Notebook: Tutorial tip; observations on a DAC like none before; selected highlights from keynotes and talks.


DAC 2020 is like no other Design Automation Conference. It is virtual for this year — and hopefully only this year.

The COVID pandemic has proven that face-to-face meetings and conferences are invaluable for many reasons. But with none of the distractions of a traditional conference, focusing on the content was easy. And because the sessions were pre-recorded, the speakers for each session were available in the chat feature, and attendees began asking questions and making comments right away. That made for a very lively discussion that all attendees could see and participate in.

First off, a tip: Participate in any tutorial that interests you at its scheduled time, as these sessions are not recorded for viewing later. All panels and other sessions are, however, available for viewing until Aug. 1. (According to DAC’s help desk, the tutorials were only offered on Monday.)

Kicking off Day One was Philip Wong, chief scientist at TSMC and a professor at Stanford University. He pointed out that future electronic systems will continue to rely on, and increasingly benefit from, advances in semiconductor technology, as they have for more than five decades. “Semiconductor technology plays a truly vital role for humanity, not only in terms of economic development, but also in the way we live, the way we work, and the way we enjoy life. This vital role has been exemplified today by the deployment of high-performance computing to study coronavirus protein docking in order to find a cure for the infection.”

With 21st century applications becoming data-centric, Wong stressed that data analytics and machine learning applications will dominate from the data center to mobile and IoT, spanning everything from connecting and processing data to curating it to derive information, and many of these systems will need to learn and adapt on the fly.

Wong’s keynote focused on technology trends with data analytics, machine learning, and AI in mind, with attention to systems such as GPUs and accelerators rather than the CPU, the mainstay of the last century, for these applications.

He said a key part of the bottleneck is data movement. Energy consumption for data movement is the problem to solve today because data movement is expensive, both in terms of energy and latency. “A vast majority of the energy consumed is in the memory access. This includes energy consumed by the memory, as well as energy consumed by the compute while waiting for data from the memory. Computing speed and energy efficiency for abundant-data applications are limited by connectivity to memory. Many recent application workloads, including deep learning, are dominated by access to memories. Caching becomes less and less effective for these computing workloads. As a result, the majority of the energy consumption is wasted in data access, and a small portion is consumed by the compute circuitry. While this may be a challenge, it is also an opportunity for massive gains if we can focus on developing technologies with system performance in mind.”

In the past few decades, advances in semiconductor technology have been driven by two-dimensional scaling. “If you look at the density of transistors, logic gates, and on-chip memory, no matter whether it is the number of transistors in a microprocessor per unit area, the logic gate density, SRAM density, or simply the density of the inverter in a standard cell layout — whatever your definition for density is, you see that the density has been increasing exponentially even today, and 2D scaling is the driver,” said Wong. “However, if you look back in history, advances in semiconductor technology have never been sustained using just one method. The success of 2D scaling, or Moore’s Law, has been sustained by a variety of innovations. In the beginning, there was Dennard scaling. That simple theory holds that as transistors are scaled down in size, they become faster and more energy efficient, but the industry departed from Dennard scaling a long time ago. Yet semiconductor technology has continued to progress, although not in exactly the same way Dennard scaling prescribed. After Dennard scaling came equivalence scaling, brought on by strained silicon and high-k/metal gate. Then, the concept of using channel geometry to control short-channel effects brought us the finFET. Now, much of the density gain comes from design technology co-optimization, known as DTCO.

“When one knob has reached a diminishing return, there are many other knobs available,” he continued. “There are many roads that lead to Rome. Semiconductor technology is a history of innovations that have provided continuous benefits in generation after generation of technology products. We have achieved these goals via different means. The semiconductor industry has already made several important shifts in the past. When Gordon Moore wrote the famous Moore’s Law paper, Bob Dennard’s scaling rule was not even invented. Then, Dennard scaling slowed down and we switched to equivalence scaling through strained silicon and high-k/metal gate. When the planar transistor reached its limitations, we went 3D in the transistor structure and brought in the finFET. And today there is DTCO, which continues the density scaling path. When one method begins to saturate, there are multiple other knobs waiting in the wings.”

In addition to running through some examples of the most important attributes of 2D scaling, which allows more transistors per chip, resulting in cost reduction and other system benefits, Wong stressed the importance of taking a top-down approach. “Application drives design choices, and design choices are informed by system technology options, and there need to be innovations up and down the system stack to arrive at the best solution. It is like a Jenga game. If you pull out some pieces in the middle or the bottom, no matter how good the top layer looks, the stack is going to fall.”

He stressed that new design tools that optimally perform system partitioning will become indispensable. “System partitioning needs to be performed across dies, not just within die.”

Wong concluded his talk by addressing the importance of democratizing innovation. “In an application- and system-driven world, it is extremely important to have an ecosystem that fosters innovation. Today, chip design is a very expensive activity, affordable by only a small number of companies when it comes to the most advanced technologies. As a result, innovations in hardware systems are limited to a small group of engineers. This is very different from innovations in software applications, which oftentimes come from a much broader cross section of society. If design tools and the ecosystem can lower the barrier to entry for chip design and system implementation, an enormous amount of innovation in hardware will be unleashed. We will see a renaissance of application and system design. Ideally, it should be as easy to innovate in hardware as it is to write a piece of software code.”

Cloud-based chip design
Next, there were some interesting talks within the Design-on-Cloud Pavilion from AWS, Cadence, Mentor, NetApp, and Pure Storage.

David Pellerin and Mark Duffield from Amazon Web Services discussed how an entire chip design flow, through to GDS, can now be done in the cloud. Cadence’s Craig Johnson highlighted some interesting use cases with TSMC and Microsoft Azure, while Mentor’s Wei-Lii Tan explained how library characterization is being done in the cloud.

Electronics ecosystem still evolving
A highlight of Day One was Mentor CEO Emeritus Wally Rhines’ talk on the continuing evolution of the electronics ecosystem, full of interesting facts, statistics, insights and observations on future directions for EDA.

Rhines began by acknowledging this has been a pretty unique period for the EDA and semiconductor industries. “COVID-19 has caused worldwide economic impacts, as well as changing the way we work. But despite all that, employment has remained very strong for electronic designers and for software developers. By the Department of Commerce definition, we’re at full employment. In fact, it’s better than full employment, and it’s so difficult to hire good people in our field.”

Concurrently, companies within the industry are experiencing one of the modest busts in the boom-bust cycles that have occurred for many years. “Fortunately, there are more booms than busts,” he said, “but 2019 was a down year in the semiconductor industry. If you look at the revenue trend, the 11% decline was caused mostly by memory pricing, rather than logic. Logic actually grew as a piece of the semiconductor industry, and that’s where the EDA industry gets the majority of its revenue. What about 2020? It doesn’t look very good after negative 11% last year. In October, the analysts had a consensus of around 7% growth for this year, but if you look at the current forecasts, it’s down to zero growth, with some as low as -12%, according to Handel Jones at IBS, and others coming down that probably still haven’t hit bottom.”

Rhines reminded attendees that the EDA industry enjoys the benefit that semiconductor research and development tends to be pretty stable over time. “There are periods, like 2008, when people didn’t know if the recession was ever going to end, and so we took a hit in semiconductor R&D. But in general, semiconductor companies know that recessions will be over. COVID-19 will be over, we’ll get a vaccine or cure, and they’ll need new products, so they continue to maintain their staffs and develop products during a recession. And when there’s a downturn in R&D spending, it’s usually very mild. After all, the industry has spent 14% of revenue on R&D for the last 35 years, with only modest deviations, reinforcing their confidence that things will get better and improve. Otherwise, they wouldn’t spend 14% of revenue on R&D.”

Other positives include semiconductor capacity increasing and new wafer fabs coming online in 2020. “That’s a lot,” he said. “And as semiconductor wafer capacity comes online, it sometimes causes pricing pressure, which is actually good for the fabless companies who are buying wafers. But over the long term it’s simply a signal that revenue will grow. After all, throughout the history of our industry over 60 years, the semiconductor revenue per unit area has been relatively constant, so when you see the semiconductor area increasing through capacity, it’s indicative that semiconductor revenue will increase as well. There’s another ratio that’s interesting: EDA revenue for our industry has been 2% of semiconductor revenue for the last 25 years, and that’s remained fairly constant as well. So when the semiconductor industry grows, it’s good for EDA.”

Trouble for automotive ahead?
There are also some markets that aren’t so good, Rhines said. “One of them is the automotive market, and the associated manufacturing industries, robotics, and other fields like that. They’re under pressure due to the COVID recession. The automotive market looks like a bubble waiting to burst. Why do I say that? Well, after all, the automotive percentage of semiconductor revenue has been the fastest growing of any category, and now is 9% of semiconductor revenue, and the forecasts are for it to go closer to 10% in the future. How is that going to happen? One reason is there are over 500 companies that have announced that they’ll introduce electric cars or light trucks. There are also 277 companies that have announced autonomous drive programs. Does the world need over 500 companies developing these products? And what about the current decrease in automotive unit sales? It’s down 22% versus last year at this time. There are probably going to be some cuts in R&D. There are going to be some companies that don’t make it. In terms of autonomous drive, there will probably be some decrease in development of multi-passenger autonomous vehicles, although the investment in autonomous delivery vehicles could well continue. Overall, we’ll see some of the Chinese companies drop out of the electric car race. But we’re still seeing an enormous number of companies that are performing electronic design for this evolution of autonomous drive and electric vehicles, and that’s caused a big boom for EDA, one that’s probably going to take a few hits.”

Back to the positive markets, home compute, computer servers, PCs, and video games all remain pretty strong. “Worldwide revenue for servers took a step up in 2017, and then a big step up in 2018 to $80 billion. That held in 2019, and will likely hold in 2020 based upon first-quarter results. Even areas like wireless handsets, which have been declining in unit volume, are holding their semiconductor revenue relatively constant because of the growth of semiconductor content in 5G phones. There may be a modest decrease, but not a big one for the semiconductor industry, even as unit volumes slow or decline while 5G becomes popular,” he continued.

Trade restrictions with China
China remains a big issue for semiconductors, since over half of all semiconductors in the world are purchased by China, Rhines noted. “You may think that’s in order to insert them into end products that are shipped back to the United States, but that’s becoming less true all the time. Almost half of China’s consumption of semiconductors is going into products manufactured in China for sale domestically in China, so the China market is very important for the U.S. semiconductor industry.”

What about Chinese production of its own semiconductors, and the equipment businesses being built around semiconductor technology? “China took a big hit in areas like PC units and other electronics in December when COVID hit, but bounced back very strongly, even above the previous level, in all but the cellular phone category. As for the semiconductors that go into those products, only 19% of China’s semiconductor needs are produced in China. That’s changing, though. Over the last 10 years, it has almost doubled from what it was.”

As for how U.S. semiconductor companies can remain leaders without selling to the leading companies in certain markets, Rhines said it will be very difficult. “They probably have to wait for the next discontinuity and hope that they will be able to work with the new leaders.”

Emerging challenges for EDA include 2.5D/3D/chiplet packaging, as well as data security and the value of data, among other things.

Applying AI
For examples of implementing AI within organizations today, Toby Cappello of IBM Data & AI gave an interesting Skytalk.

Open-source EDA
And in the afternoon, DARPA’s Serge Leef, who previously spent quite a long time in the EDA industry, discussed open-source EDA and how the U.S. government can best support innovation. In a nutshell, he suggested a path forward could include productized, cloud-based EDA and IP that originates from open source and follows industry standards.

Again, there were quite a few interesting talks, and Day Two of DAC 2020 promises to continue in that direction.


