2025: So Many Possibilities

This will be an incredible year for innovation, driven by AI and for AI, and pushing the limits of fundamental physics.

The stage is set for a year of innovation in the chip industry unlike anything seen for decades, but what makes this period of advancement truly unique is the need to focus on physics and real design skills.

Planar scaling of SoCs enabled design and verification tools and methodologies to mature on a relatively linear path, but the last few years have created an environment for more radical changes than at any time since the EDA industry came into existence. In the past, the focus generally involved new process technology, which sometimes created ripple effects throughout the flow. Designs were incremental in nature, borrowing as much from previous designs as possible.

But scaling no longer provides the big improvements in power, performance, and area that it once did. The future increasingly involves heterogeneous and vertical scaling — referred to here generally as 3D-ICs, but including 2.5D as well — which has been proven inside some of the most advanced data centers. Alongside those designs, AI will act as an enabler, impacting tools, methodologies, and flows from within. In addition, power and thermal will rise to become primary optimization targets, and multi-physics no longer will be confined to mixed-signal design components.

There also is change coming from the demand side. AI demands a massive increase in compute power. The industry is no longer satisfied with incremental improvements in performance, focusing instead on fundamental architectural changes to achieve orders-of-magnitude gains. Increases in compute power also require commensurate increases in memory performance and communications bandwidth, but chips and packages are constrained by the amount of heat they can dissipate.

Semiconductor Engineering believes that 2025 could be one of the most exciting years in decades. New tools, methodologies, and flows will emerge, driven by the insatiable demand for compute power within the limits of power and thermal. Design teams will undergo significant restructuring to handle the expanding flows between systems and silicon, as well as growing concerns about safety and security, including data security.

Driven by the data center
The data center is at the heart of many of the innovations today, driven by an insatiable need for compute power associated with AI.

AI innovation cycles are surpassing typical design iteration times. “The push toward generative AI solutions means that traditional computing has to re-invent itself and produce exponential throughput,” says Nilesh Kamdar, general manager for the design and verification business unit at Keysight Technologies. “Traditional electronic solutions are running out of steam, and the first breakthrough on the horizon is silicon photonics and more optical communications. Some level of optical communications is already prevalent today, but with the advent of silicon photonics, it will start replacing short-haul transport. In 2025, photonics solutions will become mainstream and drive investments and hiring in this space. Semiconductor foundries will innovate with newer process variants and help drive the ecosystem forward.”

Other forms of communications also are being considered. “AI/ML workloads are pushing the limits on data rates with trillions of calculations processed every second,” says David Kuo, associate vice president for product marketing and business development at Point2 Technology. “Communications bandwidth and interconnects must keep pace to support the growth. This requires an evolutionary shift from copper and optical technologies to new forms of communications, such as using mmWave RF signals to transmit and receive data over plastic dielectric waveguides.”

In the past, design teams were not overly concerned about power in data center chips. “Today, the conversation revolves around the unexpected impact of data centers on power consumption, which has surpassed projections related to cell phone and battery life,” says Rich Goldman, director at Ansys. “The discussion reveals a shift in focus toward the substantial power needs of data centers and the potential necessity for new nuclear power plants to meet these demands.”

That will have a big impact on both the chips being designed for data centers and the supply chain. “Rumors of the big IP vendors tiptoeing into selling chiplets, and moving upstream into selling silicon, have already been circulating in the press,” says Steve Roddy, chief marketing officer at Quadric. “2025 is likely the year when one or more IP vendors formally announce a move into chiplets. If chiplets succeed in disaggregating the SoC into a system of chiplets, we can expect to see CPU subsystem chiplets, connectivity chiplets, GPU processing chiplets, and AI/ML subsystem chiplets. The market segment most likely to be first in this evolution is the big data center computing segment. Closed, proprietary chiplet systems are already shipping from the big semiconductor companies in that segment. As standardization efforts such as UCIe gain traction, the first place we will see this impact IP vendors is in the data center, with higher volume segments like automotive and eventually mobile handsets to follow in future years.”

While data center consolidation has been happening over the past few years, that could change. “Deployment flexibility is becoming critical,” says Jeff Wittich, chief product officer at Ampere Computing. “As AI workloads expand into diverse environments — on-premises, edge, and air-gapped hosting facilities — latency-sensitive applications will demand infrastructure closer to users, deployed in existing data centers and PoPs (points of presence). Moreover, inference is no longer a standalone workload. Supporting tasks like retrieval-augmented generation (RAG) and app integration will require robust, general-purpose compute alongside AI-specialized resources, emphasizing efficiency and scalability.”

We also could see quantum computing become a commercial compute platform. “The final frontier after photonics is quantum computing,” says Keysight’s Kamdar. “This is an exciting area of research, and we already have quantum computers that can handle more than a thousand qubits. With the pace of research and innovation, 10,000 qubits are just a few years away. Quantum research will spread to more countries, especially Asia, as no region wants to cede any computing advantage to another.”

Powered by AI
A number of tools have been enhanced by AI in the past couple of years, but so far few tools or methodologies have been fundamentally altered by AI. That is likely to change in 2025. “We can expect to see AI embedded into tools such as placement, routing, and optimization,” says Andy Nightingale, vice president of product management and marketing at Arteris. “This will reduce manual iterations. We also can expect to see initial adoption of generative AI for design exploration, system architecture suggestions, and managing IP reuse. Within verification, AI will prioritize corner-case testing, accelerate bug detection, and analyze large data sets for functional and formal verification.”

2025 is likely to be the year of AI agents. “Highly specialized AI agents could come together and analyze vast amounts of information spanning software architectures, workloads, manufacturing rules, data flows, timing, and other parameters,” says Stelios Diamantidis, distinguished architect and executive director of Synopsys’ Center for GenAI. “This AI-to-AI collaboration would help identify previously unseen patterns and correlations, develop new solutions for persistent challenges, and offer detailed recommendations for optimizing chip design and performance.”

AI is likely to spread to more tool areas, as well. “In the engineering and design space, AI/ML solutions will move from the digital to the analog and make a bigger impact on RF/Analog designers,” says Kamdar. “Generative AI will impact the design community, where ML-based synthesis solutions will help create new and unique designs. Businesses will hire data specialists and assign chief data officers to focus on the fuel that drives all AI/ML work — data. The impact on productivity will improve across all functions due to advancements in AI/ML.”

Chatbots have been shown to help designer productivity. “While the past year focused on chatbot use cases, largely using public data, the future lies in applying generative AI to private, secure datasets to create even more valuable tools,” says Ampere’s Wittich. “Enterprises in sectors like finance, insurance, and e-commerce are poised to adopt these technologies to extract meaningful insights from proprietary data.”

This will start to become a competitive differentiator. “Speed to innovation is the winning formula,” says Sarmad Khemmoro, senior vice president for technical strategy, electronics design, and simulation at Altair. “As AI chip demand continues to surge, semiconductor companies will realize the critical role emerging technologies play in the design process. By integrating AI with simulation software, engineers can test new concepts and make design decisions up to 1,000 times faster than traditional methods, dramatically speeding time to market and cutting costs. This approach will be key to producing high-performance chips more efficiently and staying competitive in the rapidly evolving semiconductor industry.”

Changes in licensing will be required, however. “Enterprises will deploy AI in two distinct ways: to automate highly constrained tasks with well-structured outputs, and to provide collaborative tools for open-ended tasks — both aimed at improving employee efficiency,” says Adam Tilton, co-founder and CEO of Driver. “However, the pricing models will reflect these different use cases: consumption-based pricing for structured outputs versus per-seat licensing for collaborative tools.”

It’s often said that data is the new oil, but the industry is just beginning to realize that containing and protecting data is more difficult than storing oil. Data has to be constantly verified and cleaned. “The evolving terrain of AI agents reinforces the need for transparency,” says Synopsys’ Diamantidis. “In other words, we need a clear view of each AI agent. How are they developed and trained? What are their operating objectives? How are they interacting with other AI agents? What data sets are they leveraging?”

Data sovereignty and security will heavily influence AI deployment strategies in 2025. “Enterprises are increasingly aware of the value of their proprietary datasets, treating them as competitive assets,” says Wittich. “This shift will mean that AI inference workloads run not only on public hyperscale clouds, but also in more secure environments like private clouds, on-premises data centers, or privately hosted facilities. The risk of data breaches and tampering with AI algorithms underscores the need for secure, isolated infrastructure. As enterprises compete on AI-driven innovation, the ability to safeguard intellectual property and sensitive information will become a cornerstone of success. Furthermore, this trend will expand the role of enterprise-owned compute resources, creating a more decentralized and secure AI ecosystem.”

It requires creative solutions when tools and data come from different places. “EDA vendors have been looking at the best way to train models and isolate proprietary data,” says Paul Graykowski, product marketing director in Cadence’s System Verification Group. “Solutions are forthcoming. While we aren’t going to see a complex SoC designed and verified by GenAI just yet, some of the more mundane work of documentation, coding templates, and scripts for automation is coming. AI technologies will become the force multiplier needed to verify the next generation of chips.”

Today, little thought is given to the cost effectiveness of using AI. “Companies will succeed by building AI products rather than just wrappers around LLMs,” says Driver’s Tilton. “This means things like hybrid tech stacks that include traditional software processing, algorithms, and then surgical use of LLMs. AI solutions need to demonstrate concrete metrics like cost savings, productivity gains, or revenue growth that justify their implementation costs.”

There also will be a shakeout in NPUs. “In the boom years of 1998 to 2001 we saw upward of 50 different RISC CPU architectures and 25-plus DSP architectures in the industry,” says Quadric’s Roddy. “Much like watching a nature documentary about the booms and inevitable busts in populations of species, so too will the overpopulation of NPUs lead to a thinning of the herd. Companies that thought building a matrix accelerator was a source of differentiation will come to learn that licensing an IP block is cheaper and better than reinventing what is already ready-made. And the realities of competition are such that the market cannot sustain 10 or 15 licensing companies. We’ve already seen the peak of population in 2024, and a number of weaker NPU IP companies have closed. Look for that trend to accelerate in 2025, even as the volume of transactions increases, as companies shut down internal NPU developments.”

Physical constraints
Until recently, there were few physical constraints on what could be done. Many designs now are up against the reticle limit, and while power has been a consideration for quite some time, thermal is becoming a limiter for many designs.

“Power always has been a soft goal,” says Marc Swinnen, director of product marketing at Ansys. “A design has to meet a certain frequency, and if it doesn’t, then you go back and delay the product until you meet the frequency. But power always has been a matter of, ‘We did the best we could.’ Now it’s becoming, ‘You must design for power.’ You can’t take a design that consumes a lot of power and turn it into a low-power design. It has to be something that you do from the start. Companies are thinking about thermal much earlier, and they’re seeing it as a core limiting ceiling on what you could design. To that end, prototyping is becoming very important. That caught us by surprise when we first started addressing the 3D-IC market.”

It requires new types of tools. “This is an electro-thermal co-design problem,” says Arteris’ Nightingale. “It requires tools that address co-simulation of power, heat dissipation, and structural mechanics. This is especially important for managing heat and power delivery in stacked architectures. In the extreme, it requires real-time thermal simulations integrated with design and verification workflows, focusing on compact and active cooling solutions.”
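The feedback loop at the heart of that co-design problem can be illustrated with a minimal sketch in Python, using purely illustrative numbers rather than any calibrated power or thermal model: dissipated power raises die temperature, higher temperature increases leakage power, and the two models are iterated until they agree on an operating point.

# Minimal sketch of the electro-thermal feedback loop: power heats the die,
# heat raises leakage, and the two are iterated to a consistent operating point.
# All constants are illustrative placeholders, not calibrated to any process or package.

def leakage_power(temp_c, leak_at_25c=5.0, doubling_deg=20.0):
    """Leakage power (W), assumed here to roughly double every `doubling_deg` degrees C."""
    return leak_at_25c * 2 ** ((temp_c - 25.0) / doubling_deg)

def die_temperature(total_power_w, ambient_c=25.0, theta_ja=0.2):
    """Steady-state junction temperature (C) from a single lumped thermal resistance (C/W)."""
    return ambient_c + theta_ja * total_power_w

def cosimulate(dynamic_power_w=150.0, max_iterations=50):
    """Alternate the power and thermal models until they agree on an operating point."""
    temp = 25.0
    for _ in range(max_iterations):
        total = dynamic_power_w + leakage_power(temp)
        new_temp = die_temperature(total)
        if abs(new_temp - temp) < 0.01:  # models agree; consistent operating point found
            temp = new_temp
            break
        temp = new_temp
    return total, temp

power, temp = cosimulate()
print(f"converged operating point: {power:.1f} W at {temp:.1f} degC")

A production electro-thermal flow couples field solvers and detailed power models across the package and board, but the same fixed-point character is why thermal has to be considered from the start rather than checked at sign-off.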

AI’s insatiable demand for electricity in data centers is stressing power grids and geographic power constraints. “To avoid bringing new non-renewable energy sources online or prolonging their life in the short-term, hardware optimization will play a pivotal role in reducing power requirements,” says Wittich. “Replacing older, power-hungry systems with modern efficient processors can dramatically cut energy use, making existing infrastructure more sustainable. This efficiency shift is critical to balancing the need for more energy with responsible environmental stewardship.”

3D-IC technology (including 2.5D) already is being used to tackle some of these issues. “Chiplet and 3D-IC solutions will continue to become more mainstream,” says Kamdar. “You can expect additional packaging houses to join the chiplet ecosystem and help standardize many aspects of design and collaboration. Advanced packaging techniques, including heterogeneous integration (HI), will give a technological and business advantage to system companies, which will continue to attract investment in this space. The design solutions for creating 3D-IC/HI designs will mature, with system designers being able to do upfront design and tradeoffs more easily.”

This could have a significant impact on the IP marketplace. “Synopsys and Cadence have market dominance in the physical IP space,” says Quadric’s Roddy. “This stemmed from the engineering-intensive nature of porting complex high-speed analog interfaces to each new process variant from each fab. But what happens when chiplets become more mainstream? No longer does an SoC design team need to have all the IOs in the same process. If you can leverage chiplets and 3D-IC packaging to use existing interface IP in a 5nm chiplet, there’s far less need to port all that physical IP. Existing physical IP players will likely see an uptick in license revenues as more IP gets re-used, but a reduction in demand for NRE porting fees. And aspiring new physical IP players will be emboldened to enter the market knowing that their handiwork can have a much longer useful lifespan. Don’t expect to see a change of the top three ranking in 2025, but the seeds of future changes will be planted in 2025.”

Within EDA
New technology nodes continue to add design pressure. “For highly scaled transistors, such as at the 2nm node, effects such as random dopant fluctuation have a large impact on transistor characteristics such as Vt or mobility,” says Shawn Thomas, head of advanced logic nodes and power business at Atomera. “Variations in Vt can cause differences in switching speeds, which is compounded in the GAA structure. Each channel in a GAA transistor could have a different Vt, which then aggregate to the overall Vt of the device. This results in increased mismatch at the circuit level and shrinks the design margin that circuit designers have to work with (i.e. more conservative designs to account for increased variability). Variations in Vt also can result in increased leakage of the transistor, thereby increasing the off-state power consumption of the transistor and subsequent circuit element.”
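A toy Monte Carlo sketch in Python gives a feel for how that per-channel variation propagates to the device and to off-state leakage. The nominal Vt, per-channel sigma, and subthreshold slope below are assumptions chosen for illustration, not process data, and the device Vt is approximated simply as the mean of the channel Vts.

# Toy Monte Carlo of per-channel Vt variation in a stacked nanosheet (GAA) device.
# Nominal Vt, per-channel sigma, and subthreshold slope are illustrative assumptions,
# not process data; the device Vt is approximated as the mean of the channel Vts.
import random
import statistics

NOMINAL_VT = 0.30   # V, assumed nominal threshold voltage
SIGMA_VT   = 0.020  # V, assumed per-channel random variation (e.g., dopant fluctuation)
CHANNELS   = 3      # nanosheet channels stacked in one device
SS         = 0.075  # V/decade, assumed subthreshold slope used for the leakage estimate

def device_vt():
    """Effective device Vt, approximated as the mean of independently varying channel Vts."""
    return statistics.mean(random.gauss(NOMINAL_VT, SIGMA_VT) for _ in range(CHANNELS))

def relative_leakage(vt):
    """Off-state leakage relative to nominal: every SS volts below nominal Vt is ~10x more current."""
    return 10 ** ((NOMINAL_VT - vt) / SS)

samples = [device_vt() for _ in range(100_000)]
print(f"device-level Vt sigma ~ {statistics.stdev(samples) * 1000:.1f} mV "
      f"(per-channel sigma {SIGMA_VT * 1000:.0f} mV)")
print(f"average leakage vs. nominal ~ {statistics.mean(relative_leakage(v) for v in samples):.2f}x")

Even with this simplification, the exponential subthreshold dependence means the remaining device-level spread skews average leakage upward, which is the margin circuit designers have to budget for.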

New tools are required for advanced packaging technologies. “There needs to be a shakeout in packaging technologies,” says Ansys’ Swinnen. “There’s a gazillion different ways of assembling multiple chips and doing the bonding, but not all of them can have enough investment to bring them to full production viability across a general market. What’s driving that is everybody wants to be able to sell chiplets, and everybody working on 3D needs chiplets, so there’s a strong push to standardize. The industry has its interests aligned to make sure that standards are developed for that. It will take more than a year to get there, but then you will see advances in that area.”

While standards are being prepared, collaboration becomes important. “Semiconductor companies will increasingly place a greater emphasis on forging strong partnerships with system companies,” says Altair’s Khemmoro. “These collaborations are vital, as many chip manufacturers lack a complete understanding of how their products are integrated into final devices. In the years ahead this teamwork will be even more important, especially as systems designers face relentless pressure to create smaller, more efficient products. Staying attuned to these changes and adapting accordingly will be essential for maintaining competitiveness and driving innovation forward.”

A lot of buzz associated with AI in EDA has been in the area of verification. “Over the last few years, we’ve seen a lot of development in the AI space in verification, which has helped optimize regressions and narrow down the scope of debugging verification failures,” says Cadence’s Graykowski. “As the industry has had some time to prototype and refine these solutions, I anticipate we’ll get a more optimized feature set and see additional adoptions bringing this technology to mainstream verification flows. The areas likely to see the most advancement in the coming year go beyond regression optimization to help close coverage gaps and wring out more bugs in less time. A lot of potential exists for automating flows to triage and narrow down the scope of debugging and identify the likely source of the failure. The verification engineer will utilize the technology to sift through the volume of data, allowing them to keep pace with the size of today’s designs. I also expect GenAI to see more applications in the coming year.”
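One way to picture the triage step Graykowski describes, stripped down to a plain string-similarity heuristic rather than a trained model, is grouping failing regression tests by their log signatures so an engineer debugs one representative per cluster. The Python sketch below uses invented test names and messages and is a simplified stand-in, not any vendor's flow.

# Simplified stand-in for AI-assisted failure triage: group failing regression tests by
# similar log signatures so one representative per cluster is debugged first.
# Test names and messages are invented; a production flow would use far richer signals.
from difflib import SequenceMatcher

failures = {
    "test_axi_burst_017": "UVM_ERROR @ 1540ns: scoreboard mismatch on write beat 3",
    "test_axi_burst_042": "UVM_ERROR @ 2210ns: scoreboard mismatch on write beat 7",
    "test_reset_seq_003": "UVM_FATAL @ 80ns: watchdog timeout waiting for reset deassert",
}

def similar(a, b, threshold=0.7):
    """Crude textual similarity between two failure messages."""
    return SequenceMatcher(None, a, b).ratio() >= threshold

clusters = []  # each cluster is a list of (test, message) pairs sharing a signature
for test, msg in failures.items():
    for cluster in clusters:
        if similar(msg, cluster[0][1]):
            cluster.append((test, msg))
            break
    else:
        clusters.append([(test, msg)])

for i, cluster in enumerate(clusters, 1):
    print(f"cluster {i}: {len(cluster)} failure(s), debug {cluster[0][0]} first")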

The design process also will be enhanced by AI. “Printed circuit board (PCB) design requires multiple disciplines and skill sets,” says Khemmoro. “As a result, companies are increasingly adopting automation across the entire workflow, from initial requirements and logical design to fabrication and assembly. By converging AI with simulation in the design process, design decisions are accelerated, and development cycles are significantly shortened. To enhance this process even further, monitoring systems are now being employed in the field for industries with long product lifespans, such as automotive, aerospace, and defense, to track PCB performance after the fact, as well as de-rate components to relay potential issues directly to development teams. These insights then can be incorporated into the PCB design process for further efficiency and reliability.”

Design teams are having to span an increasing number of disciplines. “The scope of many designs is expanding, and that means simulation must also expand to cover optical, fluidic, and mechanical effects, especially in complex packaging,” says Arteris’ Nightingale. “Some markets add other requirements, such as new methodologies for hardware-level security, including cryptographic verification and attack resistance, and compliance at the chip level to ISO 21434 for cybersecurity risk management. And reliability requires predictive tools for aging, electromigration, and other long-term failure modes.”

Conclusion
Collectively, these are just a few of the areas in which the industry will see significant change in 2025. The pace at which some of these technologies are being prototyped and released into the industry is accelerating. This is unprecedented, and it speaks to the importance of staying ahead of the curve. The entire industry is in the throes of a huge makeover.

Related Reading
Design And Verification Issues In 2024
What have you been reading over the past year? This often indicates the problems you most desperately need solved.
Startup Challenges In A Changing EDA World
Without innovation, it may not be possible to fully utilize technological advances.


