Semicon West Day One/Two

Reporter’s Notebook: Roadmap; Al Gore; panel; cyber standards.


For years, the semiconductor and equipment industry has congregated at the annual Semicon West trade show in San Francisco.

It’s an event to get an update on the latest equipment, test and packaging technologies. It’s also a good way to meet people you haven’t seen in a year, if not longer, and to get a pulse on the industry.

Needless to say, Semicon is a virtual event this year. Virtual events have their place, but they are no substitute for meeting in person.

Still, at the virtual Semicon event, one can get an idea of what the industry is thinking and where it’s heading. It’s impossible to write up everything happening at this year’s Semicon West, so I will highlight just a few of the events.

Pre-Semicon/Day 1
Chip roadmaps/products
Semicon activities actually started last week, when Imec held its annual event, which was also virtual. As before, Imec gave several presentations outlining where the technology is heading.

From a system point of view, Imec splits hardware into three categories — data center, mobile/handhelds, and IoT/edge. “For all of these market segments, dimensional scaling of logic, memory and packaging will be needed,” said Sri Samavedam, senior vice president of CMOS technologies at Imec, in a presentation during the event. “But that is not sufficient to provide the PPA value. We expect new materials will be needed to improve the functionality or an improvement in performance will be needed. In addition to new materials, new architectures will also show up.”

The new architectures include traditional system-on-a-chip (SoC) products, AI accelerators, and 2.5D/3D technologies. The good news is that transistor density scaling continues at each node. The problem is that the performance improvements have been slowing.

That doesn’t seem to be stopping the industry from scaling. At the event, Imec presented a slide on its latest logic roadmap. On the transistor front, chipmakers are currently shipping finFET transistors, which will extend to the 3nm and/or 2nm foundry node. Then, at 3nm or 2nm, chipmakers are expected to migrate to the nanosheet FET.

The nanosheet FET could extend beyond 2nm. At 2nm, Imec is developing what it calls a forksheet FET. Then, at 1nm, the R&D organization is developing a complementary FET (CFET). According to Imec’s roadmap, a CFET with 2D channel materials could appear at the sub-1nm node.
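For readers who want that progression at a glance, here is a minimal Python sketch that encodes the roadmap as described above. The node-to-device mapping is a simplification of Imec’s slide, and the exact crossover points (3nm versus 2nm for nanosheets, for example) will vary by chipmaker.

```python
# Minimal sketch of Imec's logic device roadmap as described above.
# The mapping is a simplification; actual transition nodes vary by chipmaker.
imec_logic_roadmap = {
    "5nm":     "finFET",
    "3nm":     "finFET or nanosheet FET (transition point varies)",
    "2nm":     "nanosheet FET / forksheet FET",
    "1nm":     "complementary FET (CFET)",
    "sub-1nm": "CFET with 2D channel materials",
}

def device_for_node(node: str) -> str:
    """Return the expected transistor architecture for a given foundry node."""
    return imec_logic_roadmap.get(node, "unknown node")

if __name__ == "__main__":
    for node, device in imec_logic_roadmap.items():
        print(f"{node:>7}: {device}")
```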

Meanwhile, Semicon West officially started Monday. Several companies had product or corporate announcements. For example, Coventor, a Lam Research Company, announced the availability of CoventorMP 1.3 — the newest version of its MEMS design automation platform. CoventorMP 1.3 addresses the need to design complex MEMS devices in consumer, automotive, aerospace, industrial, and IoT applications.

KLA rolled out the eSL10 e-beam patterned-wafer defect inspection system. The new system is designed to accelerate the time-to-market for high-performance logic and memory chips. The system detects defects that cannot be routinely captured by optical or other e-beam defect inspection platforms.

Day 2
Sobering keynote
Al Gore, the former vice president of the United States, on Tuesday kicked off Semicon West with a sobering keynote on the threat posed by climate change. Gore is the co-founder and chairman of Generation Investment Management, and the founder and chairman of The Climate Reality Project. He is also a senior partner at Kleiner Perkins Caufield & Byers and a member of Apple’s board.

“Nineteen of the 20 hottest years ever measured with instruments have been in the last 20 years. Last year was the second hottest ever measured. And this year, according to scientists, barely halfway through the year, is almost certain to be the hottest. Of course, the ocean temperatures virtually every year are hotter than the previous ones,” Gore said. “We’re still putting 152 million tons of man-made heat-trapping global warming pollution into the thin shell of the atmosphere surrounding our planet every 24 hours. The molecules on average stay up there about 100 years.”

The situation has improved slightly in the short term. The Covid-19 pandemic prompted governments to implement stay-at-home orders for the masses, which in turn has created a temporary decrease in emissions.

Still, there is a major problem, especially when life returns to normal. “All of that extra heat energy is disrupting the earth’s climate balance. It’s disrupting the water cycle, evaporating a lot more water vapor off the oceans and atmospheric rivers that are on average 25 times the size of the Mississippi River,” Gore said.

What’s more, the global economy suffered $2.5 trillion in damage from climate-related extreme weather conditions in the last decade, compared to $1 trillion in the previous decades, according to Gore.

The trends are expected to continue. “We have to make policy changes that speed up the introduction of new technologies that help us solve this crisis,” Gore declared.

The issues aren’t limited to the weather. It’s widely documented that data centers consume an inordinate amount of energy. The ongoing shift towards AI and cloud computing will accelerate those trends. “One MIT study last year found that training large AI models can result in emissions nearly five times the lifetime emissions of the average American car,” he said.

Amid the doom and gloom, there is hope. Gore was quick to point out that chip advances have enabled smaller and more power-efficient devices. The costs of solar panels and electric vehicles also continue to fall, he added.

AI, Moore’s Law panel
Following Gore’s address on Tuesday, there was a panel at Semicon, entitled: “Bending the Climate Curve: Enabling Sustainable Growth of Big Data, AI, and Cloud Computing.” The panelists addressed a number of issues, such as climate change, memory, packaging, and scaling.

Climate change is a major concern. “There is a general recognition that digitalization can play a key role in climate change mitigation. Some examples include renewable energy integration, smart buildings, advanced manufacturing processes and de-materialization just to name a few,” said Eric Masanet, the Duncan and Suzanne Mellichamp Chair in Sustainability Science for Emerging Technologies at the University of California at Santa Barbara. “But it’s also known that digitalization brings with it some environmental challenges. For one, IT device manufacturing can be very energy and resource intensive.”

Cisco projects that there will be nearly 30 billion connected devices worldwide by 2023, according to Masanet. “A recent study estimated that the combined energy footprint of data centers, networks and all of those connected devices account for about 6% to 7% of global electricity use. And that may rise as demand for data in these systems grows rapidly,” Masanet said.

Demand for data is exploding. IDC projects that the amount of data created worldwide will increase fivefold by 2025.

The IC industry continues to attack the problem. In data centers and other applications, for example, vendors over the years have increased processor speeds while also making major efficiency gains.

However, Moore’s Law is slowing down. “This means that future efficiency gains might not be enough to offset future demand growth, and that could lead to a sharp rise in energy use and emissions globally,” Masanet said.
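Masanet’s warning is easy to illustrate with a back-of-the-envelope model. The growth rates below are illustrative assumptions, not figures cited on the panel; the point is simply that when demand for compute grows faster than efficiency improves, total energy use compounds upward.

```python
# Back-of-the-envelope model of Masanet's point: total energy use rises
# whenever demand growth outpaces efficiency gains. The growth rates are
# illustrative assumptions, not figures cited on the panel.
def projected_energy(base_energy: float,
                     demand_growth: float,
                     efficiency_gain: float,
                     years: int) -> float:
    """Energy use after `years` of compound demand growth and efficiency gains."""
    energy = base_energy
    for _ in range(years):
        energy *= (1 + demand_growth) / (1 + efficiency_gain)
    return energy

if __name__ == "__main__":
    base = 100.0  # arbitrary units of data-center energy use today
    # Efficiency keeps pace with demand -> energy stays roughly flat.
    print(round(projected_energy(base, 0.25, 0.25, 10), 1))  # 100.0
    # Efficiency gains slow while demand keeps growing -> energy climbs sharply.
    print(round(projected_energy(base, 0.25, 0.10, 10), 1))  # ~359
```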

Still, chip scaling is continuing. Vendors are moving from 7nm to 5nm, with 3nm in R&D. But at each node the benefits are diminishing, and costs are going through the roof.

So in response, OEMs as well as the semiconductor industry are looking at new approaches. 2.5D/3D architectures using various chip-packaging approaches are one path.

“Whether there’s a 2X improvement in devices or a 10X improvement in devices, we can be pretty sure that there’s not any more of an infinite runway of such improvements. And that means that you have to approach the problem differently,” said Rob Aitken, fellow and director of technology for R&D at Arm, during the panel. “Typically, what’s been done up to this point has been primarily through better algorithms, better optimized micro architectures and so on. What we’re thinking, though, is going to be a real boom over the next 5 to 10 years. It is going to be moving to more 3D-based solutions. And by that I mean die stacking-type solutions, the way we currently see in NAND flash, for example, or the way that we’ve seen in systems, where there are high bandwidth memory stacks. Those are going to move into first the infrastructure domain, where you’ll see high performance processing coupled to very localized memory. The vertical stacking essentially allows you to get more connectivity bandwidth and it allows you to get that bandwidth at lower capacitance for lower power use and also a lower delay, which means improved performance. So, the 3D aspect of Moore’s Law has not yet begun really in earnest. But when it does, we’ll start to see some very serious gains in performance.”
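Aitken’s point about capacitance rests on two textbook relationships: dynamic switching power scales with C·V²·f, and wire delay scales roughly with R·C. The sketch below uses purely illustrative numbers, not measurements of any real stacked-memory interface, to show why short vertical connections with less capacitance and resistance cut both power and delay.

```python
# Illustrative numbers only -- not measurements of any real 3D interface.
# Dynamic switching power scales with C*V^2*f, and wire delay scales roughly
# with R*C, so the shorter, lower-capacitance connections in a die stack
# reduce both power and delay compared with long off-package traces.
def dynamic_power(activity: float, cap_f: float, volts: float, freq_hz: float) -> float:
    """Dynamic power (watts) of one wire switching with the given activity factor."""
    return activity * cap_f * volts ** 2 * freq_hz

def rc_delay(res_ohms: float, cap_f: float) -> float:
    """First-order RC delay (seconds) of a wire."""
    return res_ohms * cap_f

if __name__ == "__main__":
    links = {
        "off-package": {"C": 2e-12, "R": 50.0},    # 2 pF, 50 ohms (assumed)
        "3D stacked":  {"C": 0.2e-12, "R": 10.0},  # 0.2 pF, 10 ohms (assumed)
    }
    for name, link in links.items():
        p = dynamic_power(0.2, link["C"], 0.8, 2e9)  # 20% activity, 0.8 V, 2 GHz
        d = rc_delay(link["R"], link["C"])
        print(f"{name:>11}: {p * 1e3:.2f} mW/wire, RC delay {d * 1e12:.0f} ps")
```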

Energy efficiency and power consumption are still critical in systems. Here, the industry will need to advance the current technologies or look for new ways to solve the problem.

In this area, the first step is “to improve our memory systems,” Aitken said. “Moving data around is incredibly inefficient and bringing the compute closer to the memory is the path forward, whether that’s through die stacking as I mentioned earlier, or whether it’s through moving AI to the edge, or whether it’s through improved memory structures such as MRAM. Each of these things have something to contribute. And this will require re-architecting our systems a little bit. Persistent memory is different than non-persistent memory. There are security implications to where data is and how it’s accessed and so on.”

Others agreed. “Moving the data around is one of our most expensive things. And even moving things over cellular networks is expensive,” said Cliff Young, a software engineer at Google, during the panel. “In addition to that, if I could wave a magic wand, I wish we could return to Dennard scaling. I don’t know whether there are technologies around that have that virtuous property to do that, where the devices keep getting better every year for the same power cost. But that would be ideal, if anything like that could be found.”

There are other solutions as well. For example, AI processing is mainly conducted in the cloud. But many are looking to move some of the processing to the edge, which will supposedly reduce the energy consumption in systems. “(This involves) special purpose chips that are meant for edge AI acceleration, things around sub-hundred milliwatts. That will have a major impact on our energy consumption,” said Moe Tanabian, general manager of Azure Edge Devices at Microsoft, during the panel.

Let’s not forget software. In a system, the software needs to run more efficiently to improve system performance. “There needs to be a much better collaboration between manufacturing hardware and software. This is where we’re going to gain probably the most efficiencies, especially in the data center,” said Samantha Alt, a machine learning engineer at Intel, during the panel.

Clearly, there is no one technology that can do everything. Plus, companies are all moving in completely different directions. “We’re all independent. So, we need to drive co-optimization within this ecosystem,” said Ellie Yieh, corporate vice president for advanced product technology development at Applied Materials.

For example, AI chip companies will need to take a closer look at new devices and materials. Then, the materials and device vendors need to look more closely at the systems. “That’s what we should do to continue to drive the industry forward,” Yieh added.

Fab tool forecast
At a virtual press event, meanwhile, SEMI presented its latest forecast. Global sales of semiconductor manufacturing equipment by original equipment manufacturers are projected to increase 6% to $63.2 billion in 2020, up from $59.6 billion in 2019, according to SEMI. In 2021, the equipment market is expected to reach a record high of $70 billion.

The wafer fab equipment segment is expected to rise 5% in 2020, followed by 13% growth in 2021, according to the trade group, driven by a recovery in memory spending, leading-edge investments, and China. The assembly and packaging equipment segment is forecast to grow 10% to $3.2 billion in 2020 and 8% to $3.4 billion in 2021, driven by an advanced packaging capacity buildup, according to SEMI.
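As a quick sanity check, the headline figures hang together. The sketch below simply recomputes the implied year-over-year growth from the dollar amounts SEMI quoted.

```python
# Recompute year-over-year growth from the dollar figures SEMI quoted above.
def yoy_growth_pct(new: float, old: float) -> float:
    """Year-over-year growth, in percent."""
    return (new / old - 1) * 100

if __name__ == "__main__":
    # Total equipment sales, in billions of dollars: 2019 -> 2020 -> 2021
    print(f"2020 growth: {yoy_growth_pct(63.2, 59.6):.1f}%")  # ~6%, matching SEMI's forecast
    print(f"2021 growth: {yoy_growth_pct(70.0, 63.2):.1f}%")  # ~11%, implied by the record $70 billion forecast
```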

“Regionally, China, Taiwan and Korea are expected to lead the pack in spending in 2020,” according to SEMI. “Robust spending in China in the foundry and memory sectors is expected to vault the region to the top in total semiconductor equipment spending in 2020 and 2021.”

New standards
Several trends are emerging during the Covid-19 pandemic. Among the takeaways is the need to accelerate the development of standards with cybersecurity in mind. With that need underscored by the pandemic, two draft standards are underway in the fab tool sector.

First, Intel and Cimetrix are leading SEMI Draft Document 6566–Specification for Malware-Free Equipment Integration. This defines the protocols for pre-shipment scans of equipment as well as various types of ongoing support, including file transfers, maintenance patches, and component replacement.

Second, TSMC and Industrial Technology Research Institute (ITRI) are leading SEMI Draft Document 6506–Specification for Cybersecurity of Fab Equipment. This defines a common, minimum set of security requirements for fab equipment.
