A Power-First Approach


It is becoming increasingly clear that heat will be the limiter for the future of semiconductors. Already, a large percentage of a chip is dark at any given time, because if everything operated simultaneously, the heat generated would exceed the ability of the chip and package to dissipate that energy. If we now start to contemplate stacking dies, where the ability to extract heat remains con... » read more

The End Of Closed EDA


In a previous life, I was a technologist for a large EDA company. One of my primary responsibilities in that position was talking to a lot of customers to identify their pain points and what new tools we could develop to ease their problems. You would think that would be an easy task, but it certainly was not. For example, if you ask a developer what their biggest frus... » read more

Learning How To Forget


There has been a lot of talk recently about the right to be forgotten, or data privacy rights. These require companies that hold data about us to remove it when properly requested. This might be data that was collected as we browse the Internet, or from online shopping. Or perhaps it is collected as we drive our cars past cameras, through GPS tracking of our cellphones, or in many other ways – some of... » read more

ML And UVM Share Same Flaws


A number of people must be scratching their heads over what UVM and machine learning (ML) have in common, such that they can be described as having the same flaws. In both cases, it is a flaw of omission in some sense. Let's start with ML, and in particular, object recognition. A decade ago, AlexNet, coupled with GPUs, managed to beat all of the object detection systems that relied on tradit... » read more

A New Breed Of EDA Required


While I was doing research for one of my stories this month, a couple of people said, in essence, that applying methodologies of the past to the designs of today can be problematic because there are fundamental differences in the architectures and workloads. While I completely agree, I don't think these statements go far enough. Designs of today generally have one of everything — one CPU, one accel... » read more

Is AI Improving A Broken Process?


Verification is fundamentally the comparison of two models, each derived independently, to find out whether they express any differing behaviors. One of those models represents the intended design, and the other is part of the testbench. In an ideal flow, the design model would be derived from the specification, and each stage of the design process would be adding other deta... » read more
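That framing is easy to see in miniature. The sketch below is my own illustration, not from the post: ref_model and dut_model are hypothetical stand-ins for the specification-derived testbench model and the design itself, driven with the same random stimulus and compared, which is essentially what a scoreboard does.

    import random

    def ref_model(a, b):
        # Reference model: written directly from the specification.
        return (a + b) & 0xFF  # 8-bit adder, result wraps around

    def dut_model(a, b):
        # Stand-in for the design under test; in practice this would be
        # the RTL running in a simulator behind a driver and monitor.
        return (a + b) % 256

    def compare(trials=1000):
        # Drive both models with identical stimulus and flag any case
        # where their behaviors differ.
        for _ in range(trials):
            a, b = random.randrange(256), random.randrange(256)
            assert ref_model(a, b) == dut_model(a, b), f"Mismatch on {a}+{b}"
        print(f"No differing behaviors found in {trials} trials")

    compare()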

The Industrial Revolution Is Over


One of the greatest impacts of the industrial revolution was that better communication allowed for greater specialization, and with that came better economics. There have been multiple waves of the industrial revolution, each triggered by some improvement in communications. The first wave was all about trains — raw materials and finished goods could be quickly and cheaply moved between cit... » read more

UCIe: Marketing Ruins It Again


You may have seen the press release and articles recently about a new standard called UCIe. It stands for Universal Chiplet Interconnect Express. The standard is a great idea and will certainly help the market for chiplet-based designs to advance. But the name — Argggh. More on that later. First, let's talk about what it is. You may notice the name looks similar to PCIe (Peripheral Compone... » read more

Does EDA Sell Fear?


I worked in the EDA industry for over 30 years, and a common lament I heard was that the EDA industry survived by selling fear: your new chip will fail if you do not buy the latest tool offering. There always seemed to be a natural dislike for the EDA industry, and many users thought it overcharged and was unable to innovate. I never quite understood the reasoning. A recent comment, ... » read more

Ethical Coverage


How many times have you heard statements such as, "The verification task quadruples when the design size doubles"? The implication is that every register bit added doubles the state space of the design. It gives the impression that complete verification is hopeless, and because of that, little progress has been made in coming up with real coverage metrics. When constrained rando... » read more
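To make the arithmetic behind that implication concrete (a back-of-the-envelope sketch of my own, not from the post), a design with n register bits has at most 2^n reachable states, so each added bit doubles the bound:

    def state_space_bound(register_bits):
        # Upper bound on distinct states: each register bit can be 0 or 1.
        return 2 ** register_bits

    for bits in (8, 16, 32, 64):
        print(f"{bits:2d} register bits -> up to {state_space_bound(bits):,} states")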
