Using AI And Bugs To Find Other Bugs


Debug is starting to be rethought and retooled as chips become more complex and more tightly integrated into packages or other systems, particularly in safety- and mission-critical applications where chips are expected to function for significantly longer. Today, the predominant bug-finding approaches use the ubiquitous constrained-random/coverage-driven verification technology, or formal verification techn... » read more
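As a point of reference, constrained-random verification generates legal-but-randomized stimulus and measures progress with functional coverage. The sketch below is purely illustrative: the transaction fields, constraints, and coverage bins are invented, and real flows use SystemVerilog/UVM rather than Python.

```python
import random

# Illustrative sketch (not any specific tool's API): constrained-random
# stimulus generation with a simple functional-coverage tracker.
OPCODES = ["READ", "WRITE", "FLUSH"]

def random_transaction():
    # Constraints: addresses are word-aligned; FLUSH is rare (~10%).
    opcode = random.choices(OPCODES, weights=[45, 45, 10])[0]
    addr = random.randrange(0, 0x1000, 4)  # word-aligned address
    return opcode, addr

covered = set()
for _ in range(1000):
    op, addr = random_transaction()
    # Coverage bin: (opcode, address region) pairs; 4 regions of 0x400 each.
    covered.add((op, addr // 0x400))

total_bins = len(OPCODES) * 4
print(f"functional coverage: {100 * len(covered) / total_bins:.0f}%")
```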

The Expanding Universe Of MIPI Applications


It’s hard to imagine today, but there was a time when mobile phones had no cameras and displays were tiny monochrome LCDs capable of displaying a phone number and not much more. The iconic Nokia 3310, announced Sept. 1, 2000, had an 84 x 48 pixel monochrome display and went on to sell 126 million units worldwide. You may still have one in your junk drawer. By the time of the original iPhone... » read more

Blog Review: Oct. 14


Arm's Hongsup Shin explains a machine learning application that can determine which tests are most likely to find hardware bugs, improving efficiency and reducing the number of tests that need to be run. Synopsys' Pieter van der Wolf and Dmitry Zakharov take a look at the increasing need for low-power processors optimized for machine learning tasks as IoT, smart home, and wearable devices pr... » read more
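For readers curious what ML-driven test selection can look like, here is a hedged sketch that frames it as a ranking problem; the features, data, and model choice are hypothetical and not a description of Shin's actual approach.

```python
# Hedged sketch: rank regression tests by predicted bug-finding probability.
# Features and training data are hypothetical, not Arm's actual flow.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical per-test features: [runtime_hours, lines_touched, past_fail_rate]
X_history = rng.random((500, 3))
y_history = (X_history[:, 2] > 0.7).astype(int)  # 1 = test found a bug before

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_history, y_history)

# Score the candidate test list and run the most promising tests first.
X_candidates = rng.random((50, 3))
scores = model.predict_proba(X_candidates)[:, 1]
priority_order = np.argsort(scores)[::-1]
print("run tests in this order:", priority_order[:10])
```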

Reliability Over Time And Space


The demand for known good die is well understood as multi-chip packages are used in safety-critical and mission-critical applications, but that alone isn't sufficient. As chips are swapped in and out of packages to customize them for specific applications, it will be the entire module that needs to be simulated, verified, tested, and analyzed. This is more complicated than it sounds for s... » read more

One More Time: TOPS Do Not Predict Inference Throughput


Many times you’ll hear vendors talking about how many TOPS their chip has, implying that more TOPS means better inference performance. If you use TOPS to pick your AI inference chip, you will likely not be happy with what you get. Recently, Vivienne Sze, a professor at MIT, gave an excellent talk entitled “How to Evaluate Efficient Deep Neural Network Approaches.” Slides are also av... » read more
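The arithmetic behind this point is simple: peak TOPS only bounds throughput, and the achieved number depends on how well the memory system keeps the MACs busy. A back-of-the-envelope sketch, with hypothetical utilization figures:

```python
# Illustration with hypothetical numbers: two chips with identical peak TOPS
# deliver very different throughput once real-model utilization is counted.
def inferences_per_second(peak_tops, utilization, ops_per_inference):
    # peak_tops: trillions of ops/sec; ops_per_inference: ops per image
    return peak_tops * 1e12 * utilization / ops_per_inference

RESNET50_OPS = 7.7e9  # ~7.7 GOPs per ResNet-50 inference (2 * ~3.8 GMACs)

chip_a = inferences_per_second(100, 0.60, RESNET50_OPS)  # well-fed MAC array
chip_b = inferences_per_second(100, 0.15, RESNET50_OPS)  # memory-bound design

print(f"chip A: {chip_a:,.0f} inferences/s")
print(f"chip B: {chip_b:,.0f} inferences/s")  # same TOPS, 4x slower
```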

Good Vs. Bad Acquisitions


M&A activity is beginning to heat up across the semiconductor industry, fueled by high market caps, low interest rates, and a slew of startups with innovative technology and limited market reach. Some of these deals are gigantic, such as the pending acquisition of Arm by Nvidia, and the proposed purchase of Maxim Integrated by Analog Devices. Others are more modest, such as Arteris IP's ... » read more

Deals That Change The Chip Industry


Nvidia's pending $40 billion acquisition of Arm is expected to have a big impact on the chip world, but it will take years before the effects of this deal are fully understood. More such deals are expected over the next couple of years due to several factors — there is a fresh supply of startups with innovative technology, interest rates are low, and market caps and stock prices of buyers ... » read more

Nvidia To Buy Arm For $40B


Nvidia inked a deal with SoftBank to buy Arm for $40 billion, combining the No. 1 AI/ML GPU maker with the No. 1 processor IP company. Assuming the deal wins regulatory approval, the combination of these two companies will create a powerhouse in the AI/ML world. Nvidia's GPUs are the go-to platform for training algorithms, while Arm has a broad portfolio of AI/ML processor cores. Arm also ha... » read more

How ML Enables Cadence Digital Tools To Deliver Better PPA


Artificial intelligence (AI) and machine learning (ML) are emerging as powerful new ways to do old things more efficiently, which is the benchmark that any new and potentially disruptive technology must meet. In chip design, results are measured in many different ways, but common metrics are power (consumed), performance (provided), and area (required), collectively referred to as PPA. These me... » read more
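One way to picture what such an ML-driven flow optimizes is a single weighted objective over the three metrics. The function below is purely illustrative; the weights, normalization, and targets are invented, and this is not Cadence's actual cost formulation.

```python
# Illustrative only: a weighted PPA objective of the kind an ML-driven flow
# might minimize across trial implementations. All values are hypothetical.
def ppa_cost(power_mw, freq_mhz, area_um2, targets, weights=(1.0, 1.0, 1.0)):
    wp, wf, wa = weights
    # Normalize each metric against its target; lower total cost is better.
    return (wp * power_mw / targets["power_mw"]
            + wf * targets["freq_mhz"] / freq_mhz   # higher freq lowers cost
            + wa * area_um2 / targets["area_um2"])

targets = {"power_mw": 250.0, "freq_mhz": 2000.0, "area_um2": 1.0e6}
run_a = ppa_cost(240.0, 2100.0, 0.95e6, targets)
run_b = ppa_cost(270.0, 1950.0, 0.90e6, targets)
print(f"run A cost: {run_a:.3f}, run B cost: {run_b:.3f}")
```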

From Data Center To End Device: AI/ML Inferencing With GDDR6


Created to support 3D gaming on consoles and PCs, GDDR packs performance that makes it an ideal solution for AI/ML inferencing. As inferencing migrates from the heart of the data center to the network edge, and ultimately to a broad range of AI-powered IoT devices, GDDR memory’s combination of high bandwidth, low latency, power efficiency and suitability for high-volume applications will be i... » read more
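The bandwidth claim is easy to quantify. A GDDR6 device of that era ran at up to 16 Gb/s per pin over a 32-bit interface, which works out to 64 GB/s per device; the four-device configuration below is a hypothetical example.

```python
# Quick bandwidth arithmetic for a single GDDR6 device (typical figures:
# up to 16 Gb/s per pin, 32-bit wide interface).
pin_rate_gbps = 16      # data rate per pin, in Gb/s
device_width_bits = 32  # x32 interface

device_bw_gbs = pin_rate_gbps * device_width_bits / 8
print(f"per-device bandwidth: {device_bw_gbs:.0f} GB/s")  # 64 GB/s

# A hypothetical inference card with four devices reaches 256 GB/s.
print(f"4-device subsystem: {4 * device_bw_gbs:.0f} GB/s")
```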
