Blog Review: April 1

Memory for high performance AI; legacy vulnerabilities; efficient PCB design.

Rambus’ Steven Woo takes an in-depth look at memory for high performance AI applications and explores some of the primary differences between HBM and GDDR6.

Synopsys’ Taylor Armerding warns of the risks of legacy vulnerabilities, software flaws that were found but never fixed, or never discovered in the first place, and outlines key steps for finding and addressing them.

Mentor’s Shivani Joshi points to several efficient PCB design techniques that can help you reach your design goals the first time around.

Cadence’s Paul McLellan listens in as noted cryptographers discuss data privacy, security, and the state of encryption at the recent RSA Conference.

In a video, VLSI Research’s Andrea Lati and Dan Hutcheson discuss COVID-19’s impact on IC sales, prices, and unit shipments, some positive signs from China, and why work-from-home could mean more demand than expected.

Verification blogger Tudor Timi explains why progressively adding constraints to tests to reduce their randomness makes for a good constrained random test suite, and how to implement that approach.

Arm’s Geoffroy Vallee digs into the Message Passing Interface (MPI) for high performance computing workloads and what developers need to know before trying to create and run MPI applications in containers.

Ansys’ Theresa Duncan explains why understanding coefficient of thermal expansion (CTE) mismatch and other CTE effects is important for preventing solder joint failure.

SEMI’s Serena Brischetto talks with Franz Bozsak of medtech company Sensome about using technology to detect and treat strokes faster with better results.

For more good reading, check out the blogs featured in last week’s Systems & Design newsletter:

Editor in Chief Ed Sperling finds the overall chip industry still healthy, with some caveats.

EDA Technology Editor Brian Bailey questions whether more end-user involvement requires a different approach to standardization.

Synopsys’ Manoz Palaparthi examines why photonic ICs, while promising for a range of networking applications, present unique verification challenges.

Cadence’s Frank Schirrmeister digs into how the digital-twin concept translates to different industries.

OneSpin’s Tom Anderson explains why formal verification thoroughly verifies RISC-V processor designs.

Mentor’s Michael White demonstrates how leveraging massive amounts of compute resources reduces time-to-market.

Valtrix Systems’ Shubhodeep Roy Choudhury shows how to set up and manage regressions in a lightweight verification tool.

Imagination’s Benny Har-Even looks at how GPUs are reaching beyond 3D graphics processing to power a range of new markets.
