Blog Review: March 20

Formal verification and AI chips; DFM in PCBs; 1600G Ethernet; AI in physics simulation.

Synopsys’ Kiran Vittal delves into AI chips, including the expansion of chip design beyond traditional semiconductor companies, adoption of RISC-V, and the use of formal equivalence checking to verify complex AI datapaths.
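
To make the datapath idea concrete, below is a minimal sketch of what equivalence checking establishes, brute-forced over a tiny 4-bit multiply-accumulate. Commercial formal tools prove the same property symbolically for full-width datapaths; the function names and widths here are purely illustrative.

```python
# Illustrative only: datapath equivalence checking shows that an optimized
# implementation matches a reference (spec) model for every possible input.
# Formal tools do this symbolically; here we brute-force a 4-bit MAC just to
# make the concept concrete.

WIDTH = 4
MASK = (1 << (2 * WIDTH)) - 1  # 8-bit accumulator

def mac_spec(a: int, b: int, acc: int) -> int:
    """Reference model: straightforward multiply-accumulate."""
    return (acc + a * b) & MASK

def mac_impl(a: int, b: int, acc: int) -> int:
    """'Optimized' implementation: shift-and-add multiplier."""
    product = 0
    for i in range(WIDTH):
        if (b >> i) & 1:
            product += a << i
    return (acc + product) & MASK

# Exhaustive check over the small operand space.
for a in range(1 << WIDTH):
    for b in range(1 << WIDTH):
        for acc in range(1 << (2 * WIDTH)):
            assert mac_spec(a, b, acc) == mac_impl(a, b, acc), (a, b, acc)
print("spec and implementation agree for all 4-bit inputs")
```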

Siemens’ Patrick McGoff points to a survey suggesting that projects deploying design for manufacturing within a PCB design flow are more likely to be completed on time, on quality, and on budget.

Cadence’s Krunal Patel finds that 1600G Ethernet brings performance and scalability improvements through new data encoding techniques, enhanced switching and routing architectures, and additional physical layer considerations.

Ansys’ Mazen El Hout explores how generative AI applied to 3D physics leverages previously generated simulation results from physics-based solvers to train the AI models and deliver faster predictions.
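
As a rough sketch of that pattern (not Ansys’ implementation), the snippet below stands in a cheap analytic function for the physics solver, fits a least-squares surrogate to previously generated “simulation” results, and then queries the surrogate for fast predictions on new design points.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_solver(params):
    # Stand-in for a physics-based solver (e.g., peak temperature vs. two design parameters).
    x, y = params[:, 0], params[:, 1]
    return 300 + 40 * x**2 + 25 * np.sin(3 * y)

# 1. Reuse previously generated simulation results as training data.
train_params = rng.uniform(-1, 1, size=(200, 2))
train_results = expensive_solver(train_params)

# 2. Fit a cheap surrogate model (least squares on hand-picked features).
def features(p):
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y, np.sin(3 * y)])

coeffs, *_ = np.linalg.lstsq(features(train_params), train_results, rcond=None)

# 3. Fast predictions for new design points, checked against the solver.
test_params = rng.uniform(-1, 1, size=(5, 2))
predicted = features(test_params) @ coeffs
reference = expensive_solver(test_params)
print(np.round(predicted, 1))
print(np.round(reference, 1))
```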

Keysight’s Andrew Herrera explains how to simulate telecom signals with an arbitrary waveform generator, synthesizing a complex analog or digital signal by first generating a suitable digital data sequence, and notes several different signal synthesis techniques.
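
That flow can be sketched in a few lines of Python; the QPSK mapping, pulse shaping, and DAC resolution below are illustrative choices, not a Keysight workflow.

```python
import numpy as np

rng = np.random.default_rng(1)
n_symbols = 256
oversample = 8                      # samples per symbol

# 1. Digital data sequence -> QPSK symbols (2 bits per symbol).
bits = rng.integers(0, 2, size=2 * n_symbols)
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

# 2. Upsample and pulse-shape into complex baseband samples.
upsampled = np.zeros(n_symbols * oversample, dtype=complex)
upsampled[::oversample] = symbols
pulse = np.hanning(4 * oversample)  # crude pulse-shaping filter
baseband = np.convolve(upsampled, pulse, mode="same")

# 3. Quantize to a 14-bit DAC range before downloading I/Q to the AWG.
scale = (2**13 - 1) / np.max(np.abs(baseband))
i_samples = np.round(baseband.real * scale).astype(np.int16)
q_samples = np.round(baseband.imag * scale).astype(np.int16)
print(i_samples[:8], q_samples[:8])
```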

Arm’s Marc Meunier checks out next-generation firewalls and why performance optimization is important for gaining their added security benefits without incurring performance penalties.

SEMI’s Jaegwan Shim shares highlights from the recent SEMICON Korea, including how new packaging and memory technology innovations are key to driving the increased processing speeds demanded by AI and other data-intensive applications.

Memory analyst Jim Handy explains what stranded memory is and why hyperscale data centers are embracing CXL as a way to solve the problem.
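
A back-of-the-envelope illustration with hypothetical numbers shows what “stranded” means: DRAM left idle behind fully allocated CPUs that no other server can reach, which a CXL-attached pool could instead lend to memory-hungry hosts.

```python
# Hypothetical numbers, for illustration only.
servers = [
    # (cores_used, cores_total, dram_used_gb, dram_total_gb)
    (64, 64, 180, 512),   # CPU-bound: cores exhausted, DRAM left idle
    (64, 64, 240, 512),
    (48, 64, 500, 512),   # memory-bound: wants more DRAM than it has locally
]

stranded_gb = sum(total - used
                  for cores_used, cores_total, used, total in servers
                  if cores_used >= cores_total)
print(f"DRAM stranded behind fully allocated CPUs: {stranded_gb} GB")
# With a shared CXL memory pool, that ~600 GB could be loaned to the
# memory-bound host instead of sitting idle where it cannot be used.
```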

Lithography expert Chris Mack shares some highlights from the SPIE Advanced Lithography and Patterning Symposium, including efforts to prevent image fading due to mask 3D effects and the use of DSA rectification combined with EUV.

Plus, check out the blogs featured in the latest Low Power-High Performance newsletter:

Fraunhofer IIS/EAS’s Andy Heinig finds that different markets require additional chiplet standards that address more than just interfaces.

Siemens’ Keith Felton and Todd Burkholder contend that instead of stalling out, Moore’s Law will be revitalized and turbocharged by 3D-ICs.

Synopsys’ Taruna Reddy looks at measuring how injected faults propagate through a design and how long they remain in the system to verify hardware security.
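
A toy model, not the Synopsys flow, illustrates the measurement: flip one bit in a small pipeline, then count how many cycles the corruption persists and whether it ever reaches an observable output.

```python
import copy

def step(state, data_in):
    """One clock of a 3-stage pipeline with a parity flag on the last stage."""
    new = {"s1": data_in & 0xFF, "s2": state["s1"], "s3": state["s2"]}
    new["parity_out"] = bin(new["s3"]).count("1") & 1
    return new

inputs = [0x3C, 0xA5, 0x0F, 0x11, 0x22, 0x33]
golden = {"s1": 0, "s2": 0, "s3": 0, "parity_out": 0}
faulty = copy.deepcopy(golden)
faulty["s2"] ^= 0x01                      # inject a single-bit fault

lifetime, reached_output = 0, False
for cycle, din in enumerate(inputs):
    golden, faulty = step(golden, din), step(faulty, din)
    diff = [k for k in golden if golden[k] != faulty[k]]
    if diff:
        lifetime = cycle + 1
        reached_output |= "parity_out" in diff
print(f"fault observable for {lifetime} cycle(s); reached an output: {reached_output}")
```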

Mixel’s Mahmoud ElBanna and ams OSRAM’s Brian Lenkowski show how event detection enables cameras to switch between high-resolution, high-frame-rate, and low-power modes.
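
A conceptual sketch of that switching logic, with hypothetical thresholds and mode names rather than the actual Mixel/ams OSRAM design:

```python
LOW_POWER, HIGH_RES, HIGH_FRAME_RATE = "low-power", "high-resolution", "high-frame-rate"

def choose_mode(events_per_ms: float) -> str:
    """Pick a camera mode from the activity reported by an event-based sensor."""
    if events_per_ms < 5:        # scene essentially static: stay cheap
        return LOW_POWER
    if events_per_ms < 200:      # moderate activity: capture spatial detail
        return HIGH_RES
    return HIGH_FRAME_RATE       # fast motion: prioritize temporal resolution

for activity in [0.4, 12, 90, 850, 3]:
    print(f"{activity:7.1f} events/ms -> {choose_mode(activity)}")
```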

Quadric’s Steve Roddy questions why companies fixate on older technologies when newer ML networks far outperform them.

Rambus’ Emma-Jane Crozier explains why choosing the right memory for AI inference is a balance of bandwidth, capacity, power, and form factor.
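
A rough, roofline-style calculation with illustrative numbers shows one side of that balance: once the weights fit in memory, token generation is often bandwidth-bound, so tokens per second are capped at roughly bandwidth divided by model size.

```python
model_params = 7e9          # hypothetical 7B-parameter model
bytes_per_param = 2         # FP16/BF16 weights
weight_bytes = model_params * bytes_per_param   # ~14 GB -> sets the capacity floor

# Illustrative bandwidth points, not vendor specs.
for name, bandwidth_gb_s in [("mobile-class DRAM", 120),
                             ("graphics-class DRAM", 600),
                             ("HBM-class stack(s)", 3000)]:
    tokens_per_s = bandwidth_gb_s * 1e9 / weight_bytes
    print(f"{name:22s} ~{tokens_per_s:6.1f} tokens/s upper bound")
```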

Keysight’s Jenn Mullen lays out why quantum error detection, suppression, and correction strategies are critical to realizing fault-tolerant quantum computers.

Cadence’s Dharini Subashchandran and Shyam Sharma explain the differences in structure, functionality, and use cases among types of flash memory.

Arm’s Prakash Mohapatra examines the shift from discrete ECUs to zonal controllers in emerging EE architectures.


