Blog Review: June 10

Benchmarking AI; jitter basics; mainstream AR.

Cadence’s Paul McLellan considers the issues around benchmarking neural networks running on different hardware and challenges in comparing designs.

Mentor’s Shivani Joshi points to a few of the different types of jitter and some key factors to review when trying to limit jitter.

Synopsys’ Fred Bals notes that while the National Vulnerability Database is a good source for information on publicly disclosed vulnerabilities in open source software, the average 27-day reporting time means it shouldn’t be your only resource.

Arm’s Chris Szabo identifies several challenges to AR devices becoming mainstream, including the necessity of integrating multiple different compute elements, and points to ways to identify performance bottlenecks.

Ansys’ Krista Loeffler warns that the increased amount of software and connectivity in vehicles has created numerous openings for cyberattacks and points to a new standard for system-level security.

In a video, VLSI Research’s Dan Hutcheson chats with Tom Sonderman of Skywater Technology about the foundry’s work making parts for some of the critical devices needed to do DNA sequencing and diagnose COVID-19, plus new sensors on the horizon.

SEMI’s Gity Samadi shares three projects that FlexTech has funded with ITN Energy Systems that use thin, flexible ceramic sheets as both a substrate for functional devices and as an integral part of the packaging of paper-thin flexible hybrid electronics products.

Plus, check out the blogs featured in the recent Auto, Security & Pervasive Computing and Test, Measurement & Analytics newsletters:

Editor in Chief Ed Sperling finds that it’s both good and bad that the edge is vast, vague, and highly specialized.

Rambus’ Scott Best describes the main ways attackers try to monitor or affect the correct operation of a chip and how to protect against them.

Mentor’s Lee Harrison summarizes how to align IC test and functional safety metrics for ISO 26262.

Maxim Integrated’s Reno Rossetti outlines the benefits of using a two-phase buck converter, which include ripple voltage reduction, fewer input capacitors, and better efficiency.

ClioSoft’s Amit Varde explains why keeping track of changes made during the ECO phase helps to avoid miscommunication and unnecessary modifications.

Arteris IP’s Kurt Shuler warns that the risks of designing some IP in-house are higher cost of development and being late to market.

Synopsys’ Neelabja Dutta advocates for using equivalence checking between the RTL design and a C/C++ model of the design’s functionality to verify complex datapaths.

Cadence’s Paul McLellan talks about how the rapid growth of machine learning and AI means many new design starts and hundreds of fabless chip startups.

Flex Logix’s Geoff Tate explains how a key interconnect innovation helped make eFPGA economical across nodes.

Editor in Chief Ed Sperling points to the daunting implications of more data everywhere.

yieldHUB’s Marie Ryan looks at how companies have adapted to work from home and the impact on teams and schedules.

FormFactor’s Thomas Funke explains how high performance image sensors and quantum computing are driving the need for test and measurement tools that can operate in extreme low temperatures.


