Why It’s So Hard To Secure AI Chips


Demand for high-performance chips designed specifically for AI applications is spiking, driven by massive interest in generative AI at the edge and in the data center, but the rapid growth in this sector is also raising concerns about the security of these devices and the data they process. Generative AI — whether it's OpenAI’s ChatGPT, Anthropic’s Claude, or xAI’s Grok — sifts thr... » read more

Is The Transformer Era Over?


The idea of transformer networks has existed since the seminal publication of the "Attention Is All You Need" paper by Google researchers in June 2017. And while transformers quickly gained traction within the ML research community, demonstrating superlative results in vision applications in particular (the ViT paper), transformer networks were definitely not a topic of trendy conversation ar... » read more

Dramatic Changes Ahead For Chips And Systems


Early this year, most people had never heard of generative AI. Now the entire world is racing to capitalize on it, and that's just the beginning. New markets, such as spatial computing, quantum computing, 6G, smart infrastructure, sustainability, and many more, are accelerating the need to process more data faster, more efficiently, and with much more domain specificity. Compared to the days ... » read more

2023: A Good Year For Semiconductors


Looking back, 2023 has had more than its fair share of surprises, but who were the winners and losers? The good news is that by the end of the year, almost everyone was happy. That is not how we exited 2022, when there was overcapacity, inventories had built up in many parts of the industry, and few sectors — apart from data centers — were seeing much growth. The supposed new leaders we... » read more

Vision Transformers Change The AI Acceleration Rules


Transformers were first introduced by the team at Google Brain in 2017 in their paper, "Attention Is All You Need." Since their introduction, transformers have inspired a flurry of investment and research that has produced some of the most impactful model architectures and AI products to date, including ChatGPT (short for Chat Generative Pre-trained Transformer). Transformers a... » read more
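At the core of every transformer is scaled dot-product attention, the operation the "Attention Is All You Need" paper is named for. As a rough sketch (not code from this post, with illustrative names and toy shapes), here is the mechanism in plain NumPy:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # token-pair similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted mix of values

# Toy self-attention: 4 tokens, 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```

Note that the scores matrix grows quadratically with sequence length, which is why attention dominates the compute and memory budget that AI accelerators must be designed around.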

Energy Usage In Layers Of Computing (SLAC)


A technical paper titled “Energy Estimates Across Layers of Computing: From Devices to Large-Scale Applications in Machine Learning for Natural Language Processing, Scientific Computing, and Cryptocurrency Mining” was published by researchers at SLAC National Accelerator Laboratory and Stanford University. Abstract: "Estimates of energy usage in layers of computing from devices to algorithms have bee... » read more

A Performance Evaluation Of LLMs On Multiple AI Accelerators And GPUs


A technical paper titled “A Comprehensive Performance Study of Large Language Models on Novel AI Accelerators” was published by researchers at Argonne National Laboratory, State University of New York, and University of Illinois. Abstract: "Artificial intelligence (AI) methods have become critical in scientific applications to help accelerate scientific discovery. Large language models (L... » read more

How Much AI Is Really Needed?


Tensor Core GPUs have created a generative AI model gold rush. Whether it’s helping students with math homework, planning a vacation, or learning to prepare a six-course meal, generative AI is ready with answers. But that's only one aspect of AI, and not every application requires it. AI — now an all-inclusive term, referring to the process of using algorithms to learn, predict, and make... » read more

Need To Share Data Widens In IC Manufacturing


Experts at the Table: Semiconductor Engineering sat down to discuss issues in smart manufacturing of chips, including data management and grounding, chiplets, and standards, with Mujtaba Hamid, general manager for product management for secure cloud environments at Microsoft; Vijaykishan Narayanan, vice president and general manager of India engineering and operations at proteanTecs; KT Moore,... » read more

Issues and Opportunities in Using LLMs for Hardware Design


A technical paper titled "Chip-Chat: Challenges and Opportunities in Conversational Hardware Design" was published by researchers at NYU and University of New South Wales. Abstract "Modern hardware design starts with specifications provided in natural language. These are then translated by hardware engineers into appropriate Hardware Description Languages (HDLs) such as Verilog before syn... » read more
