Top Stories
Roaring ’20s For The Chip Industry
New markets, different architectures, and continued virtual work environments all point to positive and sustained growth.
Big Changes In Verification
High-quality and efficient verification requires a focus on details.
Taming Non-Predictable Systems
The non-predictable behavior of most systems is not inherently bad, so long as it is understood and bounded, but that is becoming a bigger challenge.
Blogs
Technology Editor Brian Bailey observes that an invention no longer being practical for many applications does not mean it wasn’t a good invention in its time, in Von Neumann Upset.
Siemens’s Geir Eide explains why a bus-based scan data distribution architecture enables true bottom-up DFT flows, in Have It All With No-Compromise DFT.
Synopsys’s Kiran Vittal looks at why equivalence checking provides a powerful method for verifying the most complex AI datapaths, even across widely differing levels of abstraction, in AI And ML Applications Require Advanced Datapath Verification.
Vtool’s Hagai Arbel digs into what needs to change in verification to improve the rate of first silicon success, in Bug Escapes And The Definition Of Done.
Cadence’s Frank Schirrmeister finds that although a lot has changed technologically in the last 20 years, underlying themes hold steady, in Hyperscaling The 21st Century Engineer.
Codasip’s Roddy Urquhart warns that whether an ISA is commercially licensed or open, its continued stability and availability are major concerns for licensees, in ISA Ownership Matters: A Tale of Three ISAs.
OneSpin’s Rob van Blommestein explains how to detect critical bugs in an open-source core for high-volume chips with formal verification, in RISC-V Becoming Less Risky With The Right Verification.