Key Challenges In Scaling AI Clusters


AI is evolving at an unprecedented pace, driving an urgent need for more powerful and efficient data centers. In response, nations and companies are ramping up investments in AI infrastructure. According to Forbes, AI spending from the Big Tech sector will exceed $250 billion in 2025, with the bulk going toward infrastructure. By 2029, global investments in AI infrastructure, including dat... » read more

What Scares Chip Engineers About Generative AI


Experts At The Table: LLMs and other generative AI programs are a long way away from being able to design entire chips on their own from scratch, but the emergence of the tech has still raised some genuine concerns. Semiconductor Engineering sat down with a panel of experts, which included Rod Metcalfe, product management group director at Cadence; Syrus Ziai, vice-president of engineering at E... » read more

Advanced Packaging Evolution: Chiplet And Silicon Photonics-CPO


As we enter the AI era, the demand for enhanced connectivity in cloud services and AI computing continues to surge. With Moore’s Law slowing down, the increasing data rate requirements are surpassing the advancements of any single semiconductor technology. This shift underscores the importance of heterogeneous integration (HI) as a crucial solution for alleviating bandwidth bottlenecks. Tod... » read more

Memory Wall Problem Grows With LLMs


The growing imbalance between the amount of data that needs to be processed to train large language models (LLMs) and the inability to move that data back and forth fast enough between memories and processors has set off a massive global search for a better and more energy- and cost-efficient solution. Much of this is evident in the numbers. The GPU market is forecast to reach $190 billion in ... » read more

EUV’s Future Looks Even Brighter


The rapidly increasing demand for advanced-node chips to support everything-AI is putting pressure on the industry's ability to keep pace. The need for cutting-edge semiconductors is accelerating in applications ranging from hyperscale data centers powering large language models to edge AI in smartphones, IoT devices, and autonomous systems. But manufacturing those chips relies heavily on ... » read more

Normalization Keeps AI Numbers In Check


AI training and inference are all about running data through models — typically to make some kind of decision. But the paths that the calculations take aren’t always straightforward, and as a model processes its inputs, those calculations may go astray. Normalization is a process that can keep data in bounds, improving both training and inference. Foregoing normalization can result in at... » read more
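The idea of keeping values in bounds can be illustrated with a minimal layer-normalization sketch. This is an assumption-laden example, not code from the article: it simply rescales a vector of activations to zero mean and unit variance so that no single value dominates downstream calculations.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Rescale activations to zero mean and unit variance.

    eps guards against division by zero when the variance is tiny.
    This is a simplified sketch; production layer norm also applies
    learned scale and shift parameters.
    """
    mean = x.mean()
    var = x.var()
    return (x - mean) / np.sqrt(var + eps)

# Hypothetical activations that have drifted far out of range
acts = np.array([2.0, 400.0, -3.0, 0.5])
normed = layer_norm(acts)
print(normed.mean(), normed.var())  # near 0.0 and near 1.0
```

After normalization the values stay in a predictable range regardless of how large the raw activations grew, which is why skipping it can let calculations "go astray" as the teaser describes.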

Optimizing DFT With AI And BiST


Experts at the Table: Semiconductor Engineering sat down to explore how AI impacts design for testability, with Jeorge Hurtarte, senior director of product marketing in the Semiconductor Test Group at Teradyne; Sri Ganta, director of test products at Synopsys; Dave Armstrong, principal test strategist at Advantest; and Lee Harrison, director of Tessent automotive IC solutions at Siemens EDA. Wh... » read more

Chip Industry Week In Review


The chip industry is well on its way to hitting $1 trillion in revenue by the end of the decade. Several analyst firms released 2024 annual results and 2025 predictions: Worldwide semiconductor revenue reached $626 billion in 2024, an 18% increase versus 2023, according to a preliminary Gartner report. Memory revenue grew about 70% in 2024 versus 2023. The firm forecasts that HBM will make up 19%... » read more

Transforming Industrial IoT With Edge AI And AR


The Internet of Things (IoT) has evolved significantly from its early days of centralized cloud processing. Initially, IoT applications relied heavily on cloud-based data processing, where data from various devices was collected, processed, and analyzed in the cloud before insights were sent back to the devices. While effective, this approach has limitations, particularly in environments requir... » read more

Using AI In Semiconductor Inspection


AI is exceptionally good at spotting anomalies in semiconductor inspection. The challenge is training different models for different inspection tools and topographies, and knowing which model to use at any particular time. Different textures in backgrounds are difficult for traditional algorithms, for example. But once machine learning models are trained properly, they have proven effective in ... » read more
