Integrating Data From Design, Manufacturing, And The Field


Chip design is starting to include more options to ensure chips behave reliably in the field, boosting the ability to tweak both hardware and software as chips age. The basic problem is that as dimensions become smaller, and as more features are added into devices — especially with heterogeneous assemblies of chiplets running some type of AI — the potential for thermally induced structur... » read more

Extracting Parasitic Impedance Of Semiconductor Power Modules


As key components in energy conversion systems, power semiconductor devices are widely used in various applications, e.g., electric vehicles, renewable energy conversion, and uninterruptible power supplies. The trend for power converter design is always toward higher power density. Power modules that integrate multiple semiconductor devices can meet this demand. They also reduce the compl... » read more
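
The excerpt doesn't describe the extraction method itself, but as an illustration of the quantity involved, here is a minimal Python sketch of how parasitic loop inductance might be backed out of a measured small-signal impedance, assuming a simple series R-L model; the function name and measurement values are hypothetical:

```python
import math

def parasitic_inductance(z_magnitude_ohms, resistance_ohms, freq_hz):
    """Estimate parasitic loop inductance from a measured impedance.

    Assumes the module's power loop behaves as a series R-L, so
    |Z|^2 = R^2 + (2*pi*f*L)^2  =>  L = sqrt(|Z|^2 - R^2) / (2*pi*f).
    """
    reactance = math.sqrt(z_magnitude_ohms**2 - resistance_ohms**2)
    return reactance / (2 * math.pi * freq_hz)

# Hypothetical measurement: |Z| = 0.08 ohm, ESR = 5 mohm, at 10 MHz.
l_parasitic = parasitic_inductance(0.08, 0.005, 10e6)
print(f"Estimated loop inductance: {l_parasitic * 1e9:.2f} nH")  # ~1.27 nH
```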

Blog Review: Mar. 12


Cadence's P. Saisrinivas explains the relationship between drive strength and cell delay and why it is key to choose the appropriate drive strength to meet timing constraints while minimizing power and area. Siemens' Daniel Berger and Dirk Hartmann tackle the readout problem of accurately measuring the state of a quantum system after it has undergone a quantum computation, which becomes incr... » read more
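
As a rough illustration of the drive-strength trade-off described above, here is a minimal sketch using a first-order RC delay model. The cell names, library values, and cost figures are hypothetical for illustration, not Cadence's actual characterization data:

```python
# Hypothetical cell library: each drive strength has an intrinsic delay (ps),
# an effective drive resistance (kohm), and a relative area/power cost.
CELL_LIBRARY = {
    "X1": {"intrinsic_ps": 20.0, "r_drive_kohm": 4.0, "cost": 1.0},
    "X2": {"intrinsic_ps": 22.0, "r_drive_kohm": 2.0, "cost": 1.8},
    "X4": {"intrinsic_ps": 25.0, "r_drive_kohm": 1.0, "cost": 3.2},
}

def cell_delay_ps(cell, c_load_ff):
    # First-order RC model: delay = intrinsic + R_drive * C_load.
    # (kohm * fF = ps, so the units work out directly.)
    return cell["intrinsic_ps"] + cell["r_drive_kohm"] * c_load_ff

def pick_drive_strength(c_load_ff, required_delay_ps):
    # Choose the cheapest drive strength that still meets timing.
    feasible = [
        (lib["cost"], name)
        for name, lib in CELL_LIBRARY.items()
        if cell_delay_ps(lib, c_load_ff) <= required_delay_ps
    ]
    return min(feasible)[1] if feasible else None

print(pick_drive_strength(c_load_ff=30.0, required_delay_ps=90.0))  # "X2"
```

The point of the model: a stronger cell drives a given load faster but costs more area and power, so the right choice is the weakest cell that still meets the timing constraint.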

Improving Verification Methodologies


Methodology improvements and automation are becoming pivotal for keeping pace with the growing complexity and breadth of the tasks assigned to verification teams, helping to compensate for lagging speed improvements in the tools. The problem with the tools is that many of them still run on single processor cores. Functional simulation, for example, cannot make use of an unlimited number of c... » read more
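
The core-count limit the excerpt describes is a classic Amdahl's law effect: if only part of a simulation run parallelizes, total speedup saturates no matter how many cores are added. A minimal sketch (the 60% parallel fraction is an assumed value for illustration):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup when only part of a job parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# If only 60% of a simulation run parallelizes, extra cores help less and less.
for cores in (2, 8, 64, 1024):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
# Speedup plateaus near 1 / (1 - 0.6) = 2.5x regardless of core count.
```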

Multi-Die Design Complicates Data Management


The continued unbundling of SoCs into multi-die packages is increasing the complexity of those designs and the amount of design data that needs to be managed, stored, sorted, and analyzed. Simulations and test runs are generating increasing amounts of information. That raises questions about which data needs to be saved and for how long. During the design process, engineers now must wrestle ... » read more
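
One way to frame the which-data-and-for-how-long question is as an explicit retention policy per artifact type. The sketch below is purely illustrative; the artifact names, retention periods, and RetentionRule type are hypothetical, not any vendor's data-management API:

```python
from dataclasses import dataclass

@dataclass
class RetentionRule:
    artifact: str       # kind of design data
    keep_days: int      # how long to retain the full data
    keep_summary: bool  # whether a compact summary is kept afterward

# Hypothetical policy: signoff results live for years, bulky intermediate
# runs are purged quickly.
POLICY = [
    RetentionRule("signoff_simulation_results", keep_days=3650, keep_summary=True),
    RetentionRule("regression_waveforms", keep_days=30, keep_summary=True),
    RetentionRule("intermediate_extraction_runs", keep_days=7, keep_summary=False),
]

def should_purge(rule: RetentionRule, age_days: int) -> bool:
    return age_days > rule.keep_days

for rule in POLICY:
    print(rule.artifact, "-> purge after day", rule.keep_days)
```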

Blog Review: Feb. 19


Cadence's Ravi Vora explains the AMBA Local Translation Interface protocol, which defines the point-to-point protocol between an I/O device and the Translation Buffer Unit of an Arm System Memory Management Unit. Siemens' Stephen V. Chavez provides a checklist for ensuring the quality and functionality of a PCB at every stage, from design through fabrication, assembly, and testing, with a fo... » read more

Chip Industry Week In Review


Worldwide silicon wafer shipments declined nearly 2.7% to 12,266 million square inches in 2024, with wafer revenue contracting 6.5% to $11.5 billion, according to the SEMI Silicon Manufacturers Group. CSIS released a new report, “Critical Minerals and the Future of the U.S. Economy,” with detailed analysis and policy recommendations for building a secure mineral supply chain for semicond... » read more

Signal Integrity Plays Increasingly Critical Role In Chiplet Design


Maintaining the quality and reliability of electrical signals as they travel through interconnects is proving to be much more challenging with chiplets and advanced packaging than in monolithic SoCs and PCBs. Signal integrity is a fundamental requirement for all chips and systems, but it becomes more difficult with chiplets due to reflections, loss, crosstalk, process variation, and various ... » read more
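
Reflections, the first effect listed, follow directly from impedance discontinuities along the interconnect. A minimal sketch of the standard voltage reflection coefficient, with hypothetical impedance values for a chiplet interconnect:

```python
def reflection_coefficient(z_load_ohms, z0_ohms):
    """Voltage reflection coefficient at an impedance discontinuity:
    Gamma = (Z_load - Z0) / (Z_load + Z0)."""
    return (z_load_ohms - z0_ohms) / (z_load_ohms + z0_ohms)

# Hypothetical case: a 50-ohm trace driving a 65-ohm bump/pad transition.
gamma = reflection_coefficient(65.0, 50.0)
print(f"Reflected fraction of the incident wave: {gamma:.3f}")  # ~0.130
```

Even this modest mismatch reflects about 13% of the incident wave, which is why impedance control across die-to-die transitions matters so much in advanced packaging.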

How SPDM Can Drive Digital Transformation


While simulation has proven to help companies develop better products faster and more efficiently, it also produces copious amounts of data. Simulation process data management (SPDM) solutions further accelerate and improve the approach to product development and serve as the cornerstone for implementing and optimizing the digital thread in modern product development. Ansys subject matter ex... » read more

Normalization Keeps AI Numbers In Check


AI training and inference are all about running data through models — typically to make some kind of decision. But the paths that the calculations take aren’t always straightforward, and as a model processes its inputs, those calculations may go astray. Normalization is a process that can keep data in bounds, improving both training and inference. Forgoing normalization can result in at... » read more
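
As an illustration of the idea, here is a minimal layer-normalization sketch in NumPy. This is one common normalization variant, not necessarily the specific technique the article covers:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Layer normalization: rescale each sample to zero mean, unit variance.
    Keeps activations in a stable range as they pass through a model."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# Activations drifting to large magnitudes are pulled back into range.
x = np.array([[250.0, -90.0, 1200.0, 3.0]])
print(layer_norm(x))  # values now roughly in [-1, 2]
```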
