Blog Review: Feb. 28

Portable stimulus; HLS and machine learning; NVM and IoT; 5G; test-driven development.


Mentor’s Matthew Ballance explains just what the portable stimulus standard makes portable.

Cadence’s Dave Pursley considers why high-level synthesis is a good fit for cutting-edge machine learning designs.

Synopsys’ Melissa Kirschner notes that the growing number of IoT devices means new opportunities for one-time programmable NVM.

Applied’s Mike Rosa considers the pros and cons of 5G and why the ultimate solution will be a combination of 5G and 4G.

Rambus’ Aharon Etengoff takes a closer look at NRZ and PAM4 signaling, their signal integrity challenges, and when one is better to use than the other.

Intel’s Ron Wilson digs into test-driven development for hardware, its role in reuse, and why it’s a methodology worth exploring.

Lam Research’s Pat Lord notes that while the market is booming for automotive electronics, the segment presents challenges to manufacturing.

Coventor’s Chris Welham takes a look at the causes of noise in MEMS condenser microphones and the challenges of modeling them.

Arm’s Dipesh Patel sees big opportunities for IoT in the utilities, logistics, and smart buildings markets, but integration challenges remain.

Lithography blogger Chris Mack shares highlights from the first day of SPIE, including the biggest challenges still facing EUV.

Silicon Labs’ Kevin Smith considers spurs in clock phase noise measurements and how to use them for test purposes.

Verification blogger Gaurav Jalan warns against becoming too focused on new verification flows and methodologies without keeping the end goal of catching bugs in mind.

Cadence’s Dimitry Pavlovsky digs into atomic transactions and the changes in signaling in the latest AMBA 5 ACE/AXI specification.

Synopsys’ Fred Bals and Larry Trowell discuss why the video game industry takes security seriously and what other industries can learn from its multi-pronged approach.

Mentor’s Jeff Miller checks out some of the sensor-driven technology used to protect Alpine skiers.

And don’t miss the highlighted blogs from last week’s System-Level Design newsletter:

Editor in Chief Ed Sperling argues that the whole tech industry needs to start thinking differently about what it creates.

Mentor’s Matthew Hogan explains why successful verification requires more than just DRC and LVS rule decks.

Synopsys’ Shekhar Kapoor digs into machine learning and how it can help meet PPA challenges and improve ECO optimization productivity.

NetSpeed’s Rajesh Ramanujam highlights the importance of cultivating the right company culture.

ArterisIP’s Kurt Shuler contends that more sophisticated automotive chips require a more advanced methodology to stitch together IP.

Aldec’s Vatsal Choksi looks at communication between the sequence, sequencer, and driver.

eSilicon’s Mike Gianfagna examines what’s inside the package, what the goals are, and how new technology is evolving.

OneSpin’s Tom Anderson shows why tool safety compliance matters, and how vendors can make the process easier.

Cadence’s Frank Schirrmeister argues that a combination of verification engines is required to successfully develop a new class of chips.

Technology Editor Brian Bailey notes that while EDA exists because it was too expensive for semiconductor companies to develop and maintain their own tools, those companies must continue to be part of the innovation cycle.
