Verification Experts Vs. Generalists


Experts At The Table: As chips and systems become more complicated, more verification tasks get abstracted. So do we need more specialists who are experts in specific tasks, or do we need more generalists who know how to use the tools but don't necessarily have the depth of understanding? Or do we need some way to balance both? Semiconductor Engineering sat down with a panel of experts, includi... » read more

Improving Verification Methodologies


Methodology improvements and automation are becoming pivotal for keeping pace with the growing complexity and breadth of the tasks assigned to verification teams, helping to compensate for lagging speed improvements in the tools. The problem with the tools is that many of them still run on a single processor core. Functional simulation, for example, cannot make use of an unlimited number of c... » read more
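As a rough illustration of why a largely serial tool caps the benefit of adding cores, the minimal Python sketch below applies Amdahl's law with hypothetical serial fractions; the numbers are not drawn from the article or from any particular simulator.

# Minimal sketch of Amdahl's law: the speedup achievable when only part of a
# workload (e.g., the event-driven kernel of a functional simulator) can be
# parallelized. The serial fractions below are hypothetical.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup for a given serial fraction and core count."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

if __name__ == "__main__":
    for serial in (0.05, 0.20, 0.50):          # hypothetical serial fractions
        for cores in (4, 16, 64, 1024):
            s = amdahl_speedup(serial, cores)
            print(f"serial={serial:.0%} cores={cores:5d} max speedup={s:6.1f}x")

Even a 20% serial fraction limits the speedup to 5x no matter how many cores are added, which is why methodology and automation gains matter as much as raw tool throughput.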

AI’s Rapid Growth: The Crucial Role Of High Bandwidth Memory


System efficiency is dictated by the performance of crucial components. For AI hardware systems, memory subsystem performance is the single most critical component. In this blog post, we will provide an overview of the AI model landscape and the impact of HBM memory subsystems on effective system performance. AI models have grown from a few billion parameters in the early '90s to today’... » read more
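To make the bandwidth dependence concrete, the back-of-the-envelope sketch below estimates the memory bandwidth needed just to stream a model's weights once per generated token during inference; the parameter counts, FP16 precision, and token rate are hypothetical examples, not figures from the post.

# Rough estimate of memory bandwidth needed to read every weight once per
# generated token, a common lower bound for memory-bound LLM inference.
# Model sizes, precision, and token rate below are hypothetical.

def required_bandwidth_gb_s(params_billions: float,
                            bytes_per_param: float,
                            tokens_per_second: float) -> float:
    """Bytes of weights streamed per second, expressed in GB/s."""
    bytes_per_token = params_billions * 1e9 * bytes_per_param
    return bytes_per_token * tokens_per_second / 1e9

if __name__ == "__main__":
    for params in (7, 70, 400):     # hypothetical parameter counts, in billions
        bw = required_bandwidth_gb_s(params,
                                     bytes_per_param=2.0,     # FP16 weights
                                     tokens_per_second=50.0)  # target rate
        print(f"{params:>4}B params @ 50 tok/s (FP16): ~{bw:,.0f} GB/s")

Estimates like these show why large models quickly exceed what conventional DRAM interfaces can deliver and push designs toward HBM.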

Lines Blurring Between Supercomputing And HPC


Supercomputers and high-performance computers are becoming increasingly difficult to differentiate due to the proliferation of AI, which is driving huge performance increases in commercial and scientific applications and raising similar challenges for both. While the goals of supercomputing and high-performance computing (HPC) have always been similar — blazing fast processing — the mark... » read more

What Scares Chip Engineers About Generative AI


Experts At The Table: LLMs and other generative AI programs are a long way from being able to design entire chips from scratch on their own, but the emergence of the technology has still raised some genuine concerns. Semiconductor Engineering sat down with a panel of experts, which included Rod Metcalfe, product management group director at Cadence; Syrus Ziai, vice-president of engineering at E... » read more

Thermal Analysis Of 3D Stacking And BEOL Technologies With Functional Partitioning Of Many-Core RISC-V SoC


Thermal challenges in 3D-IC designs can pose a significant risk to meeting performance specifications. While the pace of Moore’s Law has slowed in recent years, system technology co-optimization (STCO) promises to mitigate technology scaling bottlenecks with system architecture tuning based on emerging technology offerings, including 3D technology. This white paper analyzes the impact of mat... » read more
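As a simplified illustration of why material choices in a die stack matter thermally, the sketch below sums one-dimensional thermal resistances through a stack and converts dissipated power into a temperature rise; the layer thicknesses and conductivities are hypothetical and ignore lateral heat spreading, unlike the detailed analysis in the white paper.

# Simplified 1-D thermal-resistance model of a die stack: temperature rise is
# power times the sum of layer resistances, thickness / (conductivity * area).
# Layer values below are hypothetical.

LAYERS = [
    # (name, thickness_m, thermal_conductivity_W_per_mK)
    ("top die silicon",    50e-6,  150.0),
    ("bond/underfill",      5e-6,    0.5),
    ("bottom die silicon", 100e-6, 150.0),
]

def temperature_rise(power_w: float, area_m2: float) -> float:
    """Steady-state temperature rise across the stack for uniform heat flow."""
    r_total = sum(t / (k * area_m2) for _, t, k in LAYERS)
    return power_w * r_total

if __name__ == "__main__":
    area = 1e-4                      # 1 cm^2 die area, hypothetical
    for power in (5.0, 20.0, 50.0):  # watts
        print(f"{power:4.0f} W -> delta-T ~ {temperature_rise(power, area):6.2f} K")

Even in this crude model, the low-conductivity bonding layer dominates the stack's thermal resistance, which is the kind of sensitivity a full STCO analysis quantifies.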

Blog Review: Feb. 19


Cadence's Ravi Vora explains the AMBA Local Translation Interface protocol, which defines the point-to-point protocol between an I/O device and the Translation Buffer Unit of an Arm System Memory Management Unit. Siemens' Stephen V. Chavez provides a checklist for ensuring the quality and functionality of a PCB at every stage, from design through fabrication, assembly, and testing, with a fo... » read more

ADAS Adds Complexity To Automotive Sensor Fusion


Sensor fusion is becoming increasingly popular and more complex in automotive designs, integrating multiple types of sensors into a single chip or package and intelligently routing data to wherever it is needed. The primary goal is to bring together information from cameras, radar, lidar, and other sensors in order to provide a detailed view of what's happening inside and outside of a vehicl... » read more
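As a minimal illustration of the fusion idea, the sketch below combines distance estimates from two sensors by weighting each with the inverse of its measurement variance, the static special case of a Kalman update; the sensor names and noise figures are hypothetical, and production ADAS stacks use far more elaborate filters.

# Minimal sensor-fusion sketch: combine two noisy range estimates (e.g., from
# camera and radar) using inverse-variance weighting. Values are hypothetical.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Return the fused estimate and its variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

if __name__ == "__main__":
    camera_range, camera_var = 42.0, 4.00   # meters, m^2 (hypothetical)
    radar_range, radar_var = 40.5, 0.25
    dist, var = fuse(camera_range, camera_var, radar_range, radar_var)
    print(f"fused range ~ {dist:.2f} m (variance {var:.3f} m^2)")

The fused estimate leans toward the lower-noise radar reading while still incorporating the camera, which is the basic payoff of combining heterogeneous sensors.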

Signal Integrity Plays Increasingly Critical Role In Chiplet Design


Maintaining the quality and reliability of electrical signals as they travel through interconnects is proving to be much more challenging with chiplets and advanced packaging than in monolithic SoCs and PCBs. Signal integrity is a fundamental requirement for all chips and systems, but it becomes more difficult with chiplets due to reflections, loss, crosstalk, process variation, and various ... » read more
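To make one of those mechanisms tangible, the sketch below computes the reflection coefficient and return loss at an impedance discontinuity, Gamma = (Z_load - Z_ref) / (Z_load + Z_ref); the impedance values are hypothetical and are not taken from any chiplet interface specification.

# Sketch of signal reflection at an impedance discontinuity: the reflection
# coefficient and the resulting return loss. Impedance values are hypothetical.

import math

def reflection_coefficient(z_ref: float, z_load: float) -> float:
    """Fraction of the incident wave reflected at the discontinuity."""
    return (z_load - z_ref) / (z_load + z_ref)

def return_loss_db(gamma: float) -> float:
    """Return loss in dB for a given reflection coefficient."""
    return -20.0 * math.log10(abs(gamma))

if __name__ == "__main__":
    z_ref = 50.0                       # reference impedance, ohms
    for z_load in (45.0, 55.0, 65.0):  # hypothetical mismatched loads
        g = reflection_coefficient(z_ref, z_load)
        print(f"Z_load={z_load:5.1f} ohm: Gamma={g:+.3f}, "
              f"return loss={return_loss_db(g):5.1f} dB")

Small impedance mismatches that were tolerable on a monolithic die or PCB can add up across multiple die-to-die crossings, which is why signal integrity budgets tighten in chiplet-based designs.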

Normalization Keeps AI Numbers In Check


AI training and inference are all about running data through models — typically to make some kind of decision. But the paths that the calculations take aren’t always straightforward, and as a model processes its inputs, those calculations may go astray. Normalization is a process that can keep data in bounds, improving both training and inference. Forgoing normalization can result in at... » read more
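As a concrete example of keeping values in bounds, the sketch below implements layer normalization over a vector of activations, one common normalization scheme; the sample activation values are made up for illustration.

# Minimal layer-normalization sketch: shift activations to zero mean and unit
# variance so downstream layers see values in a predictable range.
# Sample activations are made up.

import math

def layer_norm(x, eps: float = 1e-5):
    """Normalize a list of activations to zero mean and unit variance."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    scale = 1.0 / math.sqrt(var + eps)
    return [(v - mean) * scale for v in x]

if __name__ == "__main__":
    activations = [0.2, 7.5, -3.1, 120.0]   # one value has drifted far out of range
    print([round(v, 3) for v in layer_norm(activations)])

After normalization the outlier no longer dominates the scale of the vector, which is what keeps subsequent calculations from drifting astray during training and inference.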
