Growth Spurred By Negatives

Unexpected upside driving the semiconductor and EDA industries into 2022.

The success and health of the semiconductor industry is driven by the insatiable appetite for increasingly complex devices that impact every aspect of our lives. The number of design starts for the chips used in those devices drives the EDA industry.

But at no point in history have there been as many market segments driving innovation as there are today. Moreover, there is no indication this is about to change.

“As global demand for chips continues, the semiconductor industry can expect another growth year, though ongoing business challenges and disruption caused by the pandemic could mean more consolidation,” said Amin Shokrollahi, founder and CEO of Kandou. “New chip designs in AI and ML and other emerging market segments will be tempered by enduring capacity constraints across the entire supply chain. However, our industry is full of bright innovators who look at these challenges as opportunities.”

Two of the key drivers for this expansion are COVID and the end of Moore’s Law.

“In 2022, the realization will sink in that the free lunch is over again,” says Frank Schirrmeister, senior group director for solutions and ecosystem at Cadence. “In 2005, Microsoft’s Herb Sutter published his article, ‘The Free Lunch is Over,’ pointing out that the seemingly free advances software developers had enjoyed from processor architecture improvements would come to an end and require fundamentally new approaches to software development. Three years later, we were amid what we now look back at as the multi-core revolution, triggering advances in software development concurrency. The industry has now clearly realized that the free lunch we enjoyed by simply waiting for the next semiconductor technology node to build the next, more complex system on chip (SoC) is over. High-end designs reach or exceed reticle limits, requiring heterogeneous integration and effectively leading to disaggregation of SoCs.”

The other driver is COVID. “The pandemic put an emphasis on digital transformation and the importance of cloud-based services,” says Timothy Vang, vice president of marketing and applications for Semtech’s Signal Integrity Products Group. “As we look to the year ahead, massive intra-data center traffic is multiplying the need for additional bandwidth and faster networking interconnection speeds. Current data consumption trends suggest an increasing demand for data and compute, and we are seeing a convergence of infrastructure for data centers and wireless as data centers move toward edge compute models that are tied directly into 5G networks.”

Several developments over the past few years are making structural changes within the industry possible. “This year will be a demarcation point from the siloed proprietary past to a collaborative future that will massively disrupt the industry,” says Rob Mains, general manager at CHIPS Alliance. “This will increase innovation and help meet the growing demand for differentiated silicon. For far too long, silicon design was only open to the privileged few with the money or skills to enter the Wizard of Oz’s cabal. The need for differentiated silicon for artificial intelligence (AI), machine learning (ML) and other compute-intensive applications has changed the very nature of collaboration in the silicon industry, and is doing away with the notion of a one-stop shop that manages all aspects of hardware development.”

As suggested by Professors John Hennessy and David Patterson in their 2018 Turing lecture, the industry is facing a perfect storm of requirements driving the transition toward 2.5D/3D-IC integration.

More than Moore
Moore’s Law has governed the performance and miniaturization of semiconductor technology since the 1960s, but following it is no longer economically feasible for many companies. At the same time, lithography’s reticle limit means chips cannot get physically bigger, which forces different strategies for growing design sizes.
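To put a rough number on that physical ceiling: the field dimensions below are the commonly cited scanner limit for current lithography tools, not figures from the article, so treat this as an illustrative sketch.

```python
# A standard DUV/EUV scanner exposes a field of about 26 mm x 33 mm,
# so a single monolithic die cannot exceed roughly 858 mm^2 regardless
# of how advanced the process node is. Growing a design beyond that
# requires multiple dies (2.5D/3D integration), not a bigger chip.
reticle_w_mm = 26
reticle_h_mm = 33
max_die_area_mm2 = reticle_w_mm * reticle_h_mm
print(max_die_area_mm2)  # 858
```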

This means different things to different people. “Moore’s Law will come to its last breath in this decade,” says Vinay Ravuri, CEO for EdgeQ. “For the first time in the semiconductor industry, the intersection of Moore’s Law, hardware acceleration, and ‘softwarization’ will be principal to the future of performance scaling and chip design. The industry will shift toward software-defined hardware with an emphasis on highly custom, programmable chip designs rather than monolithic one-size-fits-all chips. Specific intensive functions will be compartmentalized to hardware accelerators, but the design trend will be toward smaller processors to tackle more specified tasks using software definition.”

Along with the breakdown of Moore’s Law, chip development also has been impacted by the end of Dennard scaling around 2005. “Before that, increasing processor performance relied on single cores being implemented in smaller geometries and running at ever-higher clock frequencies,” says Roddy Urquhart, senior marketing director at Codasip. “Clock frequencies have since reached a ceiling to avoid thermal runaway, and increased performance has been delivered by multi-core systems with some specialization, such as DSPs, GPUs, NPUs, etc. The More-than-Moore transition will drive further specialization, including programmable accelerators designed to match their computational workload. This will require varied, specialized designs — and more of them. Processor design automation will be essential to implement special architectures.”

Getting to bigger chips will require multiple dies. “As demands accelerate for increasing density, higher bandwidths, and lower power, 3D-IC technology is taking off,” says Melika Roshandell, product marketing director for multi-physics system analysis at Cadence. “3D-ICs promise many advantages, such as reduced footprints and increased functionality, as compared with traditional single-die planar designs. But with new technology comes new challenges. Multi-physics analysis will be critical as engineers face the thermal challenges associated with 3D-ICs. In addition, engineering teams must consider the entire system, including packaging, PCB, and chiplets together, as opposed to the behavior of individual chips operating in isolation.”

It may be easy to get caught up in the hype. “We will definitely see more 2.5D and 3D-IC design starts, but it’s quite a ways off from becoming, as you say, ‘the norm,’” says Joe Sawicki, executive vice president for Siemens EDA. “We also have to take into account the current chip shortage and backlog in capacity. It actually takes more chips to make a 3D-IC, and then capacity is needed to put those ICs in an advanced package. That said, for applications where companies don’t want to jump to the leading-edge node or require heterogeneous integration, the infrastructure is now in place to make it a more viable option.”

That doesn’t mean there won’t be hiccups along the way. “Current supply chain disruptions (shortages of semiconductor chips and raw materials, coupled with logistics constraints like crowded ports and a shortage of truck drivers) have created bottlenecks that will continue to constrain output in 2022,” says one executive from Keysight. “Supply chain resilience is now key to an organization’s ability to navigate the ongoing volatility. Organizations will increasingly divert efforts to future-proof supply chains to gain a competitive advantage. In addition, sustainable supply chains will be prioritized to mitigate the environmental, social and corporate governance risk.”

But supply chain issues are spurring innovation. “During the past 18 months, global supply chains have been largely disrupted due to the COVID-19 pandemic,” says Sree Durbha, director of product management for Semtech’s Wireless and Sensing Products Group. “To address these challenges, businesses should look to deploy long-range, ultra-low-power IoT-connected solutions that incorporate geolocation capabilities through the cloud. Leveraging cloud-enabled geolocation enables ultra-low-power asset management platforms to automatically locate, track, and monitor physical assets such as equipment, product, vehicles, and people. In 2022, smarter, more actionable asset tracking will be critical to smoothing supply chain operations.”

The More than Moore trend is driving a lot of EDA development. “Multidisciplinary approaches, which heterogeneous integration requires, will lead to further innovation for co-design of chiplets, hardware/software co-development, electromagnetic, and thermal aspects,” says Cadence’s Schirrmeister. “In a sense, we are reaching a new level of system-level design that extends way beyond what the SoC-focused term, electronic system level (coined by Gary Smith back in 1997) implied.”

Machine learning
It is impossible to ignore the advances being made in machine learning algorithms and the hardware on which they are running. “The technologies people are building are fantastic,” says Simon Davidmann, founder and CEO for Imperas Software. “Some of them will be wildly successful, and some of them will end up on the graveyard slide. Is the world becoming a better place because of AI and machine learning? There are some challenges with how data mining and data scientists can find out weird things about people. But in the same way that electricity transformed things, AI is transforming things.”

Hardware is beginning to play an increasingly significant role in deployment. “In 2022 the industry will see a big uptick in processing AI workloads at the edge, moving away from cloud processing,” says Mike Henry, co-founder and CEO at Mythic. “New advancements in AI computing technologies, such as analog compute-in-memory, are driving this shift by enabling extremely powerful processing at the edge while maintaining low power consumption – all in a compact processor.”

The industry has to explore alternatives. “What’s interesting to me is the architectures for these chips, and their accompanying external DRAM memory choices, are becoming more diverse,” says Marc Greenberg, group director of product marketing at Cadence. “Among commercially available GPU-type accelerator cards we are seeing the highest end occupied by HBM2-based designs with street prices over $10,000, but the GDDR6-based cards offer up to about 75% of the performance of HBM2 at less than 50% of the cost. When HBM3 arrives it might shake that up a bit, but we are seeing design starts that might have used HBM memory before are now looking at GDDR6, and design starts that might have been looking at GDDR6 are shifting into LPDDR5, LPDDR5X and DDR5. To me that signals a spread of AI/ML server-type applications into a broader market adoption at lower cost.”
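The cost/performance arithmetic behind that shift can be made concrete. The numbers below simply restate the quote’s rough figures (a GDDR6-based card at ~75% of the performance for less than 50% of the cost of an HBM2-based card) as performance per dollar; they are illustrative assumptions, not vendor specifications.

```python
# Normalize the HBM2-based card to performance 1.0 at cost 1.0, and
# use the quote's rough ratios for the GDDR6-based alternative.
hbm2 = {"perf": 1.00, "cost": 1.00}
gddr6 = {"perf": 0.75, "cost": 0.50}

def perf_per_dollar(card):
    return card["perf"] / card["cost"]

# Under these assumptions, GDDR6 delivers ~1.5x the performance per
# dollar -- which is why cost-sensitive design starts keep shifting
# down the memory hierarchy (HBM -> GDDR6 -> LPDDR5/DDR5).
ratio = perf_per_dollar(gddr6) / perf_per_dollar(hbm2)
print(round(ratio, 2))  # 1.5
```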

The application space is certainly increasing. “Industrial automation is one sector where a significant amount of processing is already being done at the edge, and this will only continue into 2022 as factories increasingly rely on robots with machine vision to boost productivity and increase safety,” adds Mythic’s Henry. “While autonomous drones are another popular use case for edge-AI, a lot of people might be surprised to hear that in 2022 we’ll start to see drones being used to detect potential wildfire outbreaks. Finally, the automotive industry is another segment that will continue to see an increase in edge AI processing, whether it’s for assisted driving features in consumer vehicle applications or safety monitoring applications for commercial applications.”

Many aspects have to be balanced. “Several analysts have predicted AI/ML will drive a good portion of the semiconductor growth over the next decade,” says Schirrmeister. “Ubiquitous hyperconnectivity requires a balance of processing from sensors through devices, far, middle, and near edges, to data centers, in turn driving significantly different latency, power/thermal, performance, scale, and cost requirements at every point of computing. As a result, specialized designs that address various design requirements will be needed to drive domain-specific semiconductor growth, often then assembled at the packaging level using heterogeneous 3D-IC integration to allow further specialization.”

And expect to see development move into new areas. “AI/ML is at the heart of automation, not just in running the tests, but in how we use the data to make informed decisions,” says a Keysight spokesperson. “It is far more efficient to move the algorithm to the data rather than move terabytes of data into the cloud, so we expect to see some advances that will help gain insight faster on data in movement.”

Open source
Another trend that is gaining momentum is open-source hardware. “In 2022 we’ll see a massive increase in companies investing their time and energy in developing open-source tooling and collaborating with each other,” says CHIPS Alliance’s Mains. “This will allow organizations of all sizes to take advantage of IP that is community-proven and develop new recipes for differentiated and customized products. It’s exciting to see how the barriers to entry in silicon design are coming down as open-source hardware makes it easier than ever for companies to innovate, scale, and meet the computing needs of today and tomorrow.”

Simon Rance, vice president of marketing for ClioSoft, is in full agreement. “Open-source IP and tools will continue to gain popularity and adoption. This is because open-source semiconductor IP and tools are financially accessible, customizable, and secure for any size company. With open-source IP, the IP is available for others to use, modify or distribute as they wish without asking permission from, or paying royalties to, those who initially developed it. It is also very beneficial for companies making products with low margins such as IoT. For companies of all sizes, reducing or eliminating the time spent iterating on license agreements and negotiating royalties will continue to help improve product time to market.”

Time to market is becoming increasingly important. “The need for proven IP becomes even more important in 2022 as time-to-market pressures escalate,” says Bipul Talukdar, director of applications engineering in North America for SmartDV. “Third-party vendors who are able to provide highly configurable or customizable design IP for better area and performance, along with a quick customization capability, will be in high demand as chip designers look for ways to alleviate the pressure. The RISC-V architecture creates an ecosystem of companies offering turnkey open-source core productization services, as well as partnerships between productized open-source core providers, FPGA houses, and design IP houses.”

Some see open-source hardware as a threat, but that need not be the case. “Open-source hardware is a great idea,” says Imperas’ Davidmann. “The reason why open-source projects work is because people invest huge amounts of money to make them work. Commercial companies have to invest heavily, because open-source needs the resources to do it. It is not a hobby. It has to be extremely high quality and robust. We couldn’t operate our business without open-source software. We use the GNU tools and Linux, the website stuff, and it’s fantastic. Hardware developers will end up benefiting from being able to use hardware like that. It’s an important evolution in the electronic product area.”

Advances in EDA
The EDA industry re-invests more of its income into R&D than any other technology segment. That investment in the future will certainly continue in a variety of ways. “Noteworthy for 2022 is the milestone of a new SystemVerilog draft,” says Michiel Ligthart, president and COO for Verific Design Automation. “A committee of dedicated industry volunteers is working diligently on fixes and improvements to the standard, and is scheduled to complete the work by August. From there, it will still be another year before the actual standard gets ratified by the IEEE, but the effort is worth recognizing.”

With a growing focus on verification, we can expect to see a lot of changes there. “We believe the coming three years, starting in 2022, will see an intense increase in formal verification adoption,” says Ashish Darbari, founder and CEO of Axiomise. “Simulation-based verification is slowing down delivery schedules and leaking bugs. A multi-pronged approach of training engineers in life-transforming formal verification skills and executing projects with predictable sign-off methods is required. This drive for formal methods will be necessary for all kinds of designs, but especially for anything that ends up in functional safety and security domains, including AI and deep learning hardware that specializes in statistical and probabilistic computations.”

Will ML transform the industry? “In the past, EDA tools were designed for people,” says ClioSoft’s Rance. “Their primary function was to help engineers complete a design with minimal loss of time or labor. Now, AI is entering into the industry, and it will change the way we work with electronics design. As AI plays a larger role in EDA tools, they’re going to be more powerful and comprehensive without having to be more complicated to use. Designers will spend less time on manual tasks and more time on creative work such as conceptualization and innovation.”

The adoption of AI/ML in EDA is certainly accelerating. “Over the past few years, we have seen AI/ML in functional verification, enabling more intelligent regression compression and improving formal proof optimization,” says Schirrmeister. “Use models will become much more refined with further regression compression, better targeting of regressions, and advanced bug hunting and coverage closure. AI/ML also will become more pervasive throughout the top-to-bottom design flows, extending beyond the high-impact verification and digital implementation areas. We will find adoption of more AI/ML enabled statistical analog simulation. AI/ML will further extend its application to semiconductor library characterization and prediction of unknown yield-limiting hotspots in design for manufacturing. We already have reported significant research in applying AI/ML to printed circuit board design and system design for signal integrity analysis. AI/ML is already pervasive. There is a lot more to come.”

Increasingly complex interdependencies in systems are making heuristic algorithm development less feasible. “AI/ML architectures demand innovation around two axes for EDA,” says Siemens’ Sawicki. “First, the regularity of these designs puts a premium on techniques that can leverage that regularity for greater design efficiency — whether that be designing compilers to much more quickly move these designs into verification, or hierarchical test techniques that can greatly increase the efficiency of both test and applying production test on the manufacturing line. Second are new techniques for architectural exploration that enable an optimal power-curve operating architecture, particularly for edge-based AI chips. High-level synthesis has come into play, as it allows designers to do architectural exploration with a very fast and productive path to RTL, and to move into traditional parts of the design flow.”

Increasing compute needs are also driving cloud adoption. “The cloud is an opportunity for growth for customers’ design needs,” says Brad Griffin, product management group director for multi-physics system analysis at Cadence. “Industries such as banking moved to the cloud years ago, but the EDA industry has been slower than others to adopt this strategy. A move, however, is afoot. NXP recently announced that it is moving most of its EDA workloads to the cloud, and we believe this forward-looking approach will soon be widely adopted by other companies looking to improve the efficiency of their engineering operations.”
