GenAI + Semiconductors + Humanity

AI will change the semiconductor startup ecosystem in myriad ways.

Silicon Catalyst held its 2024 Semiconductor Industry Forum in Mountain View, CA, at the Computer History Museum on November 13th. Richard Curtin, managing partner of Silicon Catalyst, opened the event by thanking David House, vice chair of the Board at the Computer History Museum and a longtime Intel executive, and the CHM staff for hosting the event.

Richard talked about the start of semiconductors and startups in Silicon Valley, when several engineers left Shockley to start their own company and then went on to start many more.

Robert Noyce’s pitch to Sherman Fairchild motivated Fairchild to create the Fairchild Semiconductor division. Noyce advocated the use of silicon as a substrate since the materials cost would amount to little more than sand and a few fine wires, with most of the cost in manufacturing. Noyce also expressed his belief that semiconductors would herald the start of disposable appliances: with electronic components so cheap, appliances would not be repaired but discarded when they wore out.

Richard said that the key ingredient is passionate founders and CEOs with incredible educational backgrounds and the drive to convince someone to give them money, and that this sounds like Silicon Catalyst CEOs as well.

Pete Rodriguez, CEO of Silicon Catalyst, then took the stage to talk about the mission of Silicon Catalyst to help startups. Pete introduced the evening’s moderator, David French, CEO of SigmaSense and Si Catalyst board member. David then introduced and welcomed the panelists: Navin Chadda, managing partner, Mayfield; Chloe Ma, VP of IoT Line of Business / GTM China, Arm; and Jay Dawani, CEO and co-founder, Lemurian Labs.

Left to right, moderator David French, panelists Chloe Ma, Navin Chadda and Jay Dawani.

David said that semiconductors are everywhere and now we’re in the age of AI, and then asked Navin what he sees as the most important potential semiconductor breakthrough, particularly in AI.

Navin said he is glad that silicon has come back to Silicon Valley, largely due to AI. We’re experiencing a 10x change in user interface and a 10x change in cognition that is going to impact every industry. One shouldn’t be afraid of AI; it will be a teammate. AI and humans will work together to affect the way we live, work and play. Software engineering already has copilots, and this will happen in silicon engineering next: the copilot acts as a buddy that takes on mundane work and frees up time for more creative work.

Yield enhancement is a big deal in the semiconductor industry, and AI will be used for advanced process control. AI will bring efficiency and productivity; we’ll go on to do greater things with an AI buddy. Navin said that from a Mayfield perspective, he hasn’t seen a better time to be investing in semiconductors and he thinks this is a golden era for the semiconductor industry. There was a saying that software is eating the world. It’s already eaten. So now attention is turning back to hardware, to infrastructure and basic innovation in physics, chemistry and biology. The most valuable companies today are semiconductor companies, infrastructure companies and cloud providers. They are the new system vendors.

Navin Chadda: Software has already eaten the world and it’s turning back to hardware.

David’s next question was for Chloe Ma of Arm. He asked, “With Arm at the core of every app these days, what product areas are most exciting to you, and in particular AI?” In response to David’s earlier joke about going to ChatGPT to get the questions for the panel, Chloe jokingly said that she went to ChatGPT to get the answers for David’s question.

Chloe said that Arm grew its roots in processors and microcontrollers and rose to stardom in the smartphone era. Arm has recently made a lot of progress in the cloud and in AI cloud infrastructure. She’s really excited about the Grace Blackwell superchip from Nvidia; the Grace CPU is based on Arm.

We talk about GPUs having such high performance for AI computing, but at the same time we’re hitting the memory wall. The GPU’s compute engines are like a car engine and the data is like the fuel: you have to get the fuel to the engine. The memory wall keeps the data from reaching the compute engines fast enough.

Arm enabled Nvidia to create the Grace Hopper superchip, in which the CPU and GPU are connected by NVLink chip-to-chip. The GPU can access the LPDDR5X memory that the CPU manages, and the CPU can likewise access the HBM3 memory that the GPU manages. The link between the CPU and GPU provides 900 GB/s of bandwidth and is cache coherent, which is what actually gets the data into the compute engines.
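To put the memory wall into perspective, here is a rough, back-of-envelope sketch of how much compute a link can feed when every operand has to cross it. Only the 900 GB/s link bandwidth comes from the discussion above; the peak throughput and bytes-moved-per-FLOP figures are hypothetical placeholders chosen purely for illustration.

```python
# Back-of-envelope "memory wall" arithmetic: how much compute a link can
# sustain if data must stream across it for every operation.

LINK_BANDWIDTH_GB_S = 900     # CPU-GPU link bandwidth (GB/s), quoted above
PEAK_TFLOPS = 1000.0          # hypothetical accelerator peak, in TFLOP/s
BYTES_PER_FLOP = 2.0          # hypothetical traffic: bytes moved per FLOP

# FLOP/s the link alone can feed
link_fed_tflops = (LINK_BANDWIDTH_GB_S * 1e9 / BYTES_PER_FLOP) / 1e12
utilization = link_fed_tflops / PEAK_TFLOPS

print(f"Link-fed compute: {link_fed_tflops:.2f} TFLOP/s "
      f"({utilization:.3%} of a {PEAK_TFLOPS:.0f} TFLOP/s peak)")
```

With these illustrative numbers the link feeds well under one percent of peak; whenever operands must cross a link, the link bandwidth rather than the peak FLOPS sets the ceiling, which is why keeping hot data in local HBM, with coherent access to the larger CPU-attached memory pool for everything else, matters so much.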

Chloe said that she’s excited about smart glasses that can see what the user sees and hear what the user hears. Chloe thinks that the AI for glasses is still somewhat dumb in terms of dealing with the physical world; roughly, it has the intelligence of a cat. It would be nice to have glasses that tell you who someone is when they walk up and start talking to you.

Personally, she’s excited about robotics. If AI takes over white-collar jobs, she wants robots to take care of the physical labor tasks too. Chloe said that when she gets old, she would like to have robots to take care of her.

Chloe Ma: The memory wall creates a challenge of getting the data “fuel” to the compute engines.

David’s next question was for Jay. He asked him, “What emerging applications do you find most exciting?”

Jay pointed to the acceleration of science; basically all of science fiction is becoming a reality. He cited applications in biology and the potential to map every person and provide hyper-personalized medicine: not just treating a disease you have now and extending your life a little, but providing a healthier and longer life.

There are fundamental innovations happening in architecture, and Jay is involved in a couple of efforts to create the next architecture beyond LLMs. These architectures can create their own world model, embed different kinds of modalities together, search over that embedded space and think about all the possible things. With a human in the loop, you can experiment with those possibilities and turn them into reality.

Science comes down to the speed of thought or the cost of compute, which is getting better and better. That’s exciting and he said that he couldn’t think of anything better than that.

Jay Dawani: Basically everything that you’ve ever seen in Sci-Fi is basically about to become a reality.

David said that small companies can bring genius and energy. There’s been a lot of consolidation in the semi industry and we’re seeing some renaissance in semiconductors. Rick Lazansky created Silicon Catalyst to help even out ups and downs in the economy. David asked Chloe, “Arm is everywhere, how is Arm working to create a thriving ecosystem for startups and innovation?”

Chloe said that Arm is working with Silicon Catalyst through its Flexible Access program. The usual Arm business model is an upfront fee plus royalty, which she noted is not super friendly to startups; some don’t have money for the upfront licensing fee. So Arm Flexible Access was created, with a low annual subscription fee that can be $0 for low-revenue companies. Only once a company goes to tapeout does it pay a manufacturing (license) fee, and once it goes to production it pays a royalty.

Traditionally, CPU, interconnect and interrupt controller IP were included. Arm has since developed a Compute Subsystem (CSS). Accelerator companies like Rebellions AI (Korea) have some interesting power-reduction ideas; Arm gave them the compute subsystem so they can plug in their accelerator, and also worked with Samsung on the foundry side.

David asked Jay how he feels about government, academia and financial institutions supporting semiconductor startups.

Jay said that governments have played important roles: ARPANET, GPS and radar were all government-funded projects. Government must continue to contribute; the DARPA Urban Challenge for self-driving cars is one example. Incentives need to be better aligned. Jay thinks we’re going to see a very positive future.

David asked Navin, “Mayfield has been associated with providing capital to entrepreneurs over the years. Is the startup ecosystem facing a new golden era?”

Navin replied that with AI, all boats are going to rise. Mayfield is looking for opportunities up and down the tech stack; there’s no point in funding the twentieth GPU company. The first layer is the semiconductor and hardware layer, but on top of that you need the new operating system, which is models. All of this is startup territory.

The next layer up from the compute (the GPU) and the model (the operating system) is the data: there will be public data and private data, and there are lots of problems in that area. Put these things together, and maybe the cloud providers package all of this and provide it. If all of that infrastructure is there, then there’s a need for new kinds of dev tools that will enable intelligent SaaS businesses. Up and down the stack there’s going to be so much innovation.

For Mayfield, cooling and power are huge, huge issues. Fans don’t work for edge-based applications. Better power supplies are needed, and they have made an investment in an MIT-based project using GaN for a new kind of power supply. They are also looking at liquid and air cooling.

Everything is going to move away from copper to optics: 400G, 800G, 1.6T, 3.2T. Mayfield is also looking at coherent DSPs, where there’s a need for a second source.

RISC-V has potential. There’s a need for new optical transducers, and a new switch company was just funded. Networking plus software plus IP is important. Jay believes I/O is the bottleneck. Once you move up the stack it’s wide open in terms of opportunity. Software companies want to move to a usage-based business model. It’s hard to displace Arm and Nvidia; look around the edges and find entry points.

VCs have to support these endeavors. Up and down the tech stack there are opportunities: use the platforms that have been created, and build around the cloud infrastructure; fabless semiconductor companies need to be infrastructure-less. Jay sees opportunities up and down the stack and encourages entrepreneurs to innovate around the edges and move up the stack.

Chloe asked, why try to change the underlying hardware? Once we get through this training wave, inference is going to happen everywhere. The general feeling is that AI is going to change everything and chips will need to be redesigned for AI.

David asked Jay if he had any thoughts on investing in AI. Jay replied that he also agreed that interesting things happen around the boundaries. If internet communication becomes free and AI becomes free, then we need to adjust to that new reality. The new fundamental unit of compute is the data center. We need to think about this and find solutions, innovating toward a post-kernel world with a unifying model that runs on anything from a single computer to a data center.

Navin said that in thinking around the edges you can have a prepared mind, but as an entrepreneur you also need an open mind. We haven’t yet seen the full range of possibilities for AI. For example, in the drug creation space, companies can now go forward with 10 candidates because they know the other 90 won’t work. New markets will be created around things that haven’t been thought about yet. Reimagine the world together.

David closed on a positive note by stating that the concerns about the negative impact on the workforce are overblown because more opportunities will be created than taken away. There are more good times to come.

A full video of the panel session, including the question-and-answer period that followed, is available on the Silicon Catalyst website here.


