(Artificially) Intelligent Verification


Functional verification produces a lot of data, but does that make it suitable for Artificial Intelligence (AI) or Machine Learning (ML)? Experts weigh in about where and how AI can help and what the industry could do to improve the benefits. "It's not necessarily the quantity," says Harry Foster, chief scientist for verification at Mentor, a Siemens Business. "It's the quality that matter... » read more
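To make the quality-over-quantity point concrete, here is a minimal sketch in Python (with invented record fields, not any vendor's actual flow) of why raw regression output usually needs curating before it is fed to a model: the same failure signature can be logged thousands of times, so volume alone adds little signal.

# Hypothetical illustration: deduplicate failure records and drop
# entries missing the label a model would train on. Field names
# ("test", "failure_signature") are invented for this example.
def curate(records):
    seen, curated = set(), []
    for rec in records:
        sig = (rec.get("test"), rec.get("failure_signature"))
        if None in sig or sig in seen:
            continue  # unlabeled or duplicate record adds no new signal
        seen.add(sig)
        curated.append(rec)
    return curated

raw = [
    {"test": "axi_burst", "failure_signature": "rsp_timeout"},
    {"test": "axi_burst", "failure_signature": "rsp_timeout"},  # duplicate
    {"test": "pcie_link", "failure_signature": None},           # unlabeled
]
print(len(curate(raw)))  # 1 usable record out of 3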

An Eye For An AI


AI comes in multiple forms and flavors. The challenge is choosing the right one for the right purpose, and recognizing that just because AI can be applied to a particular process or problem doesn't mean it should be. While AI has been billed as an ideal solution for just about every problem, there are three primary requirements for a successful application. First, there needs to be sufficient q... » read more

FPGA Prototyping Complexity Rising


Multi-FPGA prototyping of ASIC and SoC designs allows verification teams to achieve the highest clock rates among emulation techniques, but setting up the design for prototyping is complicated and challenging. This is where machine learning and other new approaches are beginning to help. The underlying problem is that designs are becoming so large and complex that they have to be partitioned... » read more
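As a rough sketch of the partitioning problem described here (not how any commercial prototyping tool actually works), a design can be modeled as a graph of modules and nets, with the goal of splitting it across FPGAs while minimizing the signals that cross between them, since those cross slow board traces. A greedy toy version in Python, with invented module names:

import math

# Toy heuristic: place each module on the FPGA where most of its
# already-placed neighbors live, subject to a simple capacity limit.
def greedy_bipartition(nets, modules):
    cap = math.ceil(len(modules) / 2)   # each FPGA holds at most half
    side, counts = {}, [0, 0]
    for m in modules:
        score = [0, 0]
        for a, b in nets:
            other = b if a == m else a if b == m else None
            if other in side:
                score[side[other]] += 1
        choices = [i for i in (0, 1) if counts[i] < cap]
        best = max(choices, key=lambda i: (score[i], -counts[i]))
        side[m] = best
        counts[best] += 1
    cut = sum(1 for a, b in nets if side[a] != side[b])
    return side, cut

nets = [("cpu", "cache"), ("cpu", "noc"), ("noc", "ddr"), ("cache", "noc")]
assignment, cut = greedy_bipartition(nets, ["cpu", "cache", "noc", "ddr"])
print(assignment, "cut nets:", cut)

Real partitioners also weigh pin counts, timing, and per-FPGA resource types, which is where machine-learning-guided approaches are beginning to help.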

What Will The Next-Gen Verification Flow Look Like?


Semiconductor Engineering sat down to discuss what's ahead for verification with Daniel Schostak, Arm fellow and verification architect; Ty Garibay, vice president of hardware engineering at Mythic; Balachandran Rajendran, CTO at Dell EMC; Saad Godil, director of applied deep learning research at Nvidia; and Nasr Ullah, senior director of performance architecture at SiFive. What follows are exc... » read more

Rising Packaging Complexity


Synopsys’ Rita Horner looks at the design side of advanced packaging, including how tools are chosen today, what considerations are needed for integrating IP while maintaining low latency and low power, why this is more complex in some ways than even the most advanced planar chip designs, and what’s still missing from the tool flow. » read more

Week In Review: Auto, Security, Pervasive Computing


Security
Ninety-one percent of commercial applications contain outdated or abandoned open-source components, a security threat, says Synopsys in its recently released 2020 Open Source Security and Risk Analysis (OSSRA) report. In the fifth annual edition of the report, Synopsys’ research team in its Cybersecurity Research Center (CyRC) found that 99% of the 1,250 commercial codebases revie... » read more
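The kind of audit behind such figures can be approximated with a simple version check. A hedged sketch in Python, with a made-up project manifest and a made-up table of latest releases (a real scan would query package registries and vulnerability databases):

def as_tuple(v):
    # naive semantic-version parse; real tools handle far messier cases
    return tuple(int(x) for x in v.split("."))

latest = {"requests": "2.23.0", "flask": "1.1.2"}  # invented "latest" data

def outdated(pins):
    """Flag pinned dependencies that lag the known latest release."""
    return [(name, have, latest[name]) for name, have in pins.items()
            if name in latest and as_tuple(have) < as_tuple(latest[name])]

pins = {"requests": "2.9.1", "flask": "1.1.2"}  # invented project manifest
for name, have, want in outdated(pins):
    print(f"{name}: pinned {have}, latest {want}")  # flags requests only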

Layout Generators For Artificial Intelligence Hardware Design


Artificial intelligence (AI) is a powerful tool that offers great convenience in many areas of life. In addition to improving Internet searches and online shopping, it enables driver assistance systems that can save lives, for example. AI in its various forms is the essential tool for such applications, and it can be expected to follow a development path similar to that of microelectronics. Although AI... » read more

An Inside Look At Testing’s Leading Edge


Mike Slessor, president and CEO of FormFactor, sat down with Semiconductor Engineering to discuss testing of AI and 5G chips, and why getting power into a chip for testing is becoming more difficult at each new node. SE: How does test change with AI chips, where you've got massive numbers of accelerators and processors developed at 7 and 5nm? Slessor: A lot of the AI stuff that we've been... » read more

Which Chip Interconnect Protocol Is Better?


Semiconductor Engineering sat down to discuss the pros and cons of the Compute Express Link (CXL) and the Cache Coherent Interconnect for Accelerators (CCIX) with Kurt Shuler, vice president of marketing at Arteris IP; Richard Solomon, technical marketing manager for PCI Express controller IP at Synopsys; and Jitendra Mohan, CEO of Astera Labs. What follows are excerpts of that conversation... » read more

Inference Moves To The Network


Machine-learning inference started out as a data-center activity, but tremendous effort is being put into inference at the edge. At this point, the “edge” is not a well-defined concept, and future inference capabilities will reside not only at the extremes of the data center and a data-gathering device, but at multiple points in between. “Inference isn't a function that has to resid... » read more
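One way to reason about where inference should reside is as a latency trade-off: each tier farther from the device adds network round-trip time but may cut compute time. A back-of-the-envelope sketch in Python, with all tier names and numbers invented purely for illustration:

tiers = {
    # name: (round-trip network ms, inference compute ms) -- invented
    "device":      (0,  80),   # no network hop, slow local accelerator
    "gateway":     (5,  30),
    "regional":    (20, 10),
    "data_center": (60,  5),
}

def best_tier(budget_ms):
    """Pick the fastest tier that fits the latency budget, if any."""
    viable = {name: net + compute
              for name, (net, compute) in tiers.items()
              if net + compute <= budget_ms}
    return min(viable, key=viable.get) if viable else None

print(best_tier(100))  # "regional" (30 ms total) under a 100 ms budget

In practice the decision also folds in bandwidth cost, privacy, and model size, which is why inference is expected to land at multiple points between device and data center.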
