How AI Is Transforming System Design


Experts At The Table: ChatGPT and other LLMs have attracted most of the attention in recent years, but other forms of AI have long been incorporated into design workflows. The technology has become so common that many designers may not even realize it’s a part of the tools they use every day. But its adoption is spreading deeper into tools and methodologies. Semiconductor Engineering sat down... » read more

Yield Management Embraces Expanding Role


Competitive pressures, shrinking time-to-market windows, and increased customization are collectively changing the dynamics and demands for yield management systems, shifting left from the fab to the design flow and right to assembly, packaging, and in-field analysis. The basic role of yield management systems is still expediting new product introductions, reducing scrap, and delivering grea... » read more

LLMs Show Promise In Secure IC Design


The introduction of large language models into the EDA flow could significantly reduce the time, effort, and cost of designing secure chips and systems, but it also could open the door to more sophisticated attacks. It's still early days for the use of LLMs in chip and system design. The technology is just beginning to be implemented, and there are numerous technical challenges that must b... » read more

Why It’s So Hard To Secure AI Chips


Demand for high-performance chips designed specifically for AI applications is spiking, driven by massive interest in generative AI at the edge and in the data center, but the rapid growth in this sector also is raising concerns about the security of these devices and the data they process. Generative AI — whether it's OpenAI’s ChatGPT, Anthropic’s Claude, or xAI’s Grok — sifts thr... » read more

Is The Transformer Era Over?


Transformer networks date back to the seminal publication of "Attention Is All You Need" by Google researchers in June 2017. And while transformers quickly gained traction within the ML research community, and in particular demonstrated superlative results in vision applications (the ViT paper), transformer networks were definitely not a topic of trendy conversation ar... » read more
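
For readers who haven't met the mechanism the paper's title refers to, below is a minimal sketch of scaled dot-product attention, the core transformer operation, in plain NumPy. The function name, shapes, and toy values are illustrative choices for this post, not drawn from the paper.

```python
# Minimal sketch of scaled dot-product attention ("Attention Is All
# You Need", 2017). Shapes and values are toy illustrations.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d_k) arrays; returns (seq_len, d_k)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise token similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted mix of values

# Toy self-attention: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(X, X, X).shape)  # (4, 8)
```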

Dramatic Changes Ahead For Chips And Systems


Early this year, most people had never heard of generative AI. Now the entire world is racing to capitalize on it, and that's just the beginning. New markets, such as spatial computing, quantum computing, 6G, smart infrastructure, sustainability, and many more are accelerating the need to process more data faster, more efficiently, and with much more domain specificity. Compared to the days ... » read more

2023: A Good Year For Semiconductors


Looking back, 2023 had more than its fair share of surprises, but who were the winners and losers? The good news is that by the end of the year, almost everyone was happy. That is not how we exited 2022, when there was overcapacity, inventories had built up in many parts of the industry, and few sectors — apart from data centers — were seeing much growth. The supposed new leaders we... » read more

Vision Transformers Change The AI Acceleration Rules


Transformers were first introduced by the team at Google Brain in 2017 in the paper "Attention Is All You Need." Since their introduction, transformers have inspired a flurry of investment and research that has produced some of the most impactful model architectures and AI products to date, including ChatGPT (short for Chat Generative Pre-trained Transformer). Transformers a... » read more
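
One reason vision transformers change the acceleration math is the input stage: instead of sliding convolutions, a ViT slices the image into fixed-size patches and projects each one into a token, after which the workload is dominated by matrix multiplication and attention. The sketch below shows that patch-embedding step; the 32x32 image, 8x8 patches, 16-dimensional embedding, and random projection are all assumptions chosen for illustration.

```python
# Hedged sketch of ViT-style patch embedding: image -> patch tokens.
# All sizes are illustrative, not from any particular ViT config.
import numpy as np

def patchify(img, patch):
    """img: (H, W, C) array. Returns (num_patches, patch*patch*C)."""
    H, W, C = img.shape
    return (img.reshape(H // patch, patch, W // patch, patch, C)
               .transpose(0, 2, 1, 3, 4)         # group patch pixels together
               .reshape(-1, patch * patch * C))  # one flattened row per patch

rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32, 3))    # toy 32x32 RGB image
tokens = patchify(img, 8)                 # (16, 192): 16 patch tokens
W_embed = rng.standard_normal((192, 16))  # assumed learned projection
embedded = tokens @ W_embed               # (16, 16) token embeddings
print(embedded.shape)
```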

Energy Usage In Layers Of Computing (SLAC)


A technical paper titled “Energy Estimates Across Layers of Computing: From Devices to Large-Scale Applications in Machine Learning for Natural Language Processing, Scientific Computing, and Cryptocurrency Mining” was published by researchers at SLAC National Accelerator Laboratory and Stanford University. Abstract: "Estimates of energy usage in layers of computing from devices to algorithms have bee... » read more
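
To give a flavor of what a cross-layer estimate looks like, here is a back-of-envelope sketch in the spirit of the paper: total energy is roughly operation count times device-level energy per operation, scaled up by facility overhead. Every number below is an assumption picked for illustration, not a value taken from the paper.

```python
# Illustrative cross-layer energy estimate: devices -> application.
# ALL constants are assumptions for this sketch, not measured values.
ops_total       = 3.0e23   # assumed total FLOPs for a large NLP training run
energy_per_op_j = 1.0e-11  # assumed ~10 pJ per FLOP at the device level
pue             = 1.4      # assumed datacenter power usage effectiveness

device_energy_j   = ops_total * energy_per_op_j   # device-layer estimate
facility_energy_j = device_energy_j * pue         # add facility overhead

J_PER_KWH = 3.6e6
print(f"device level:   {device_energy_j / J_PER_KWH:.2e} kWh")
print(f"facility level: {facility_energy_j / J_PER_KWH:.2e} kWh")
```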

A Performance Study Of LLMs On Multiple AI Accelerators And GPUs


A technical paper titled “A Comprehensive Performance Study of Large Language Models on Novel AI Accelerators” was published by researchers at Argonne National Laboratory, State University of New York, and University of Illinois. Abstract: "Artificial intelligence (AI) methods have become critical in scientific applications to help accelerate scientific discovery. Large language models (L... » read more
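
The excerpt doesn't describe the study's methodology, but benchmarks of this kind generally reduce to timing a fixed workload and reporting achieved throughput. Below is a minimal sketch of that pattern with a NumPy matmul standing in for the model; the proxy kernel, sizes, and iteration count are assumptions, not the paper's actual benchmark suite.

```python
# Minimal throughput micro-benchmark pattern: time a fixed workload,
# report achieved rate. The matmul is a stand-in for a real LLM kernel.
import time
import numpy as np

def measure_gflops(n=1024, iters=10):
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    a @ b                                  # warm-up run, excluded from timing
    t0 = time.perf_counter()
    for _ in range(iters):
        a @ b
    dt = time.perf_counter() - t0
    flops = 2 * n**3 * iters               # ~2*n^3 FLOPs per n x n matmul
    return flops / dt / 1e9

print(f"{measure_gflops():.1f} GFLOP/s (CPU, NumPy proxy workload)")
```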
