Blog Review: June 28

AI and verification noise; aero learns from auto; multibody dynamics simulation; optimizing manufacturing processes.


In a podcast, Siemens’ Spencer Acain discusses the role of AI and machine learning in IC verification and how it can cut through noise by analyzing signals in the diagnosis data to identify the true root cause of a failure.

Synopsys’ Ian Land and Ron DiGiuseppe find that designers of aerospace microelectronics are applying lessons and technologies learned from the automotive sector and highlight three key pillars of high-reliability semiconductor design applicable across both industries.

Cadence’s Vinod Khera points to the challenges in finding and fixing rare high-sigma design failures and how machine learning and advanced statistical techniques can enable earlier, more accurate, and faster yield estimation than traditional brute-force Monte Carlo simulation.
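
To see why brute-force Monte Carlo struggles with high-sigma failures, consider that a ~4-sigma event occurs roughly 3 times in 100,000 trials, so plain sampling needs millions of runs to observe even a handful of failures. The toy sketch below (not Cadence's method; the threshold and shift values are illustrative assumptions) contrasts brute force with importance sampling, one of the standard statistical tricks for rare-event estimation:

```python
import math
import random

# Toy illustration: estimate the rare failure probability
# p = P(X > 4) for X ~ N(0, 1), i.e. a ~4-sigma event.
# True value is about 3.17e-5, so brute-force Monte Carlo
# rarely sees any failures at modest sample counts.

random.seed(0)
THRESHOLD = 4.0  # illustrative failure threshold, in sigma

def brute_force(n):
    # Draw n standard-normal samples and count threshold crossings.
    fails = sum(1 for _ in range(n) if random.gauss(0, 1) > THRESHOLD)
    return fails / n

def importance_sampling(n, shift=4.0):
    # Sample from N(shift, 1) so "failures" are common, then reweight
    # each failing sample by the likelihood ratio N(0,1)/N(shift,1).
    total = 0.0
    for _ in range(n):
        x = random.gauss(shift, 1)
        if x > THRESHOLD:
            total += math.exp(-x * x / 2) / math.exp(-(x - shift) ** 2 / 2)
    return total / n

true_p = 0.5 * math.erfc(THRESHOLD / math.sqrt(2))  # ~3.17e-5
print(f"true p               : {true_p:.2e}")
print(f"brute force, 10k     : {brute_force(10_000):.2e}")   # often 0.0
print(f"importance samp, 10k : {importance_sampling(10_000):.2e}")
```

With 10,000 samples, brute force typically reports zero failures, while importance sampling already lands near the true probability — the same gap in efficiency that motivates ML-guided and statistical approaches over exhaustive simulation.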

Ansys’ Devashish Sarkar explains how multibody dynamics simulation is used to digitally model systems with many interconnected parts, sometimes involving hundreds of bodies in constantly changing states of motion.
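
As a greatly simplified stand-in for what such a solver does (this is not Ansys' implementation — real multibody dynamics handles rigid bodies, joints, and constraints), the sketch below advances a chain of point masses coupled by springs, with every body's state updated each time step:

```python
# Toy multibody-style loop: a 1-D chain of point masses coupled by
# springs, integrated with semi-implicit Euler. All parameters are
# illustrative assumptions.

N = 5           # number of bodies in the chain
K = 50.0        # spring stiffness (N/m)
M = 1.0         # mass of each body (kg)
REST = 1.0      # rest length between neighbors (m)
DT = 0.001      # time step (s)

# Start slightly stretched so the chain oscillates.
pos = [i * REST * 1.1 for i in range(N)]
vel = [0.0] * N

def step(pos, vel):
    forces = [0.0] * N
    for i in range(N - 1):
        # Hooke's law on each neighboring pair of bodies.
        f = K * ((pos[i + 1] - pos[i]) - REST)
        forces[i] += f
        forces[i + 1] -= f
    for i in range(N):
        vel[i] += DT * forces[i] / M   # update velocity first...
        pos[i] += DT * vel[i]          # ...then position (semi-implicit)

for _ in range(2000):  # simulate 2 seconds of motion
    step(pos, vel)
print(pos)
```

Scaling this idea to hundreds of rigid bodies with joints and contact is what makes industrial multibody solvers nontrivial: the force assembly and constraint handling dominate, not the integration loop itself.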

Keysight’s Renee Morad points to some common questions around fully autonomous vehicles to provide a clearer picture of where the industry is headed and when new technologies are expected to come to fruition.

Renesas’ Hirofumi Ohta introduces a deep neural network (DNN) simulator tool for debugging analysis and accuracy improvement of models on the company’s R-Car SoC.

Arm’s Saurabh Pradhan checks out what’s new in Cortex-X4, the company’s new high-performance CPU, while Manish Pandey introduces the latest Cortex-A cores and Dan Wilson shares additions to the company’s GPU range.

The ESD Alliance’s Bob Smith chats with Silvaco’s Babak A. Taheri about the challenges of tuning and optimizing semiconductor manufacturing processes and how AI and digital twins might be applied to reduce time and costs.

IBM’s Abu Sebastian, Abbas Rahimi, and Geethan Karunaratne introduce neuro-vector-symbolic architectures implemented using memristive analog in-memory computing hardware to create a ‘resonator network’ that can iteratively solve a particular factorization problem where factors assume holographic distributed representations.

Marvell’s Suhas Nayak argues that DSP-based optical connectivity is critical for meeting bandwidth requirements and ensuring seamless data transfer within AI clusters.

Intel’s Jennifer Huffstetler presents four principles for writing energy and carbon-efficient software.

And don’t miss the blogs featured in the latest Systems & Design newsletter:

Technology Editor Brian Bailey warns that some changes to software can damage hardware.

Siemens’ Om Prakash outlines a new way to securely transfer data from one device to another and shows how the integrity of data is maintained through an encryption process.

NI’s Joey Tun shows how a company’s semiconductor testing strategy can be a differentiator in the shifting automotive ecosystem.

Expedera’s Paul Karazuba shows what to consider when choosing an accelerator in a constantly changing AI landscape.

Renesas’ Graeme Clark explains why noise becomes a much bigger problem as transistor sizes shrink and switching frequency increases.

Synopsys’ Robert Ruiz zeroes in on how to find the root causes of simulation regression failures.

Cadence’s Reela Samuel looks at how generative AI in EDA tools could boost design space exploration.

Codasip’s Tora Fridholm finds that a ‘digital ear’ that can detect and measure body signals more precisely would be a boon for medical devices.

Keysight’s Hwee Yng Yeo drills down into emerging battery cell chemistries and why they require test systems with the flexibility and readiness to adapt to changes.
