Blog Review: Sept. 2


Arm's Pranay Prabhat highlights research into zero-power and low-power sensing devices and work toward designing a microcontroller that could work with DARPA N-ZERO sensors. Mentor's Shivani Joshi provides a primer on ODB++, the standard data exchange format used to generate PCB design data files for fabrication, assembly, and test. Cadence's Paul McLellan shares some highlights fr... » read more

Getting Particular About Partitioning


Partitioning could well be one of the most important and pervasive trends since the invention of computers. It has been around for almost as long, too. The idea dates back at least to the Manhattan Project during World War II, when computations were wrapped within computations. It continued from there with what we know as time-sharing, which rather crudely partitioned access by p... » read more

New Architectures, Much Faster Chips


The chip industry is making progress in multiple physical dimensions and with multiple architectural approaches, setting the stage for huge performance increases based on more modular and heterogeneous designs, new advanced packaging options, and continued scaling of digital logic for at least a couple more process nodes. A number of these changes have been discussed in recent conferences. I... » read more

Bridging The Gap Between Driven And Driverless Cars


Today, 91% of car accidents worldwide are caused by some form of human error. Moving to ADAS functions, such as Automatic Emergency Braking or Lane Keep Assist, and to autonomous vehicles (AVs) will significantly improve road safety and reduce the costs associated with accidents, such as vehicle and highway repair, police and ambulance response, and insurance. However, to be fully autonomous will take many years, if ... » read more

Week In Review: Design, Low Power


Tools & IP
Monozukuri unveiled its IC/package co-design tool, GENIO. GENIO integrates existing silicon and package EDA flows to enable full co-design and I/O optimization of complex multi-chip designs. It works across all existing EDA flows and combines floor planning, I/O planning, and end-to-end interconnect planning with cross-hierarchical pathfinding optimization.... » read more

Confusion Persists In Verification Terms


I find it amazing that an area of technology that attempts to show, beyond a reasonable doubt, that a design will work before it is constructed can be so bad at getting some basic things right. I am talking about verification terminology. I have been in this industry for over 40 years and it is not improving. In fact, it is getting worse. The number of calls I have with people where they hav... » read more

The Power Of Visualization


In the 1990s, the National Semiconductor Israeli site in Herzliya was responsible for the design and verification of the company’s flagship RISC processor. That was the place and the time when the concept of constrained-random, abstract, coverage-driven verification was born. Engineers realized that without random generation of stimulus opcodes, it would be very hard to fully verify new pr... » read more
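That methodology is usually expressed in hardware verification languages such as e or SystemVerilog, but the core loop is small enough to sketch in Python. Below is a minimal, hypothetical sketch, not taken from the post (the opcode list and constraint are invented): generate stimulus at random under a constraint, and count hits in coverage bins so the holes become visible.

```python
import random

OPCODES = ["ADD", "SUB", "MUL", "LOAD", "STORE", "BRANCH"]

def constrained_opcode(forbid=("MUL",)):
    """Pick a random opcode under a simple constraint (here: no MUL)."""
    return random.choice([op for op in OPCODES if op not in forbid])

coverage = {op: 0 for op in OPCODES}   # one functional-coverage bin per opcode
for _ in range(1000):                  # constrained-random stimulus loop
    coverage[constrained_opcode()] += 1

# Bins that were never hit point at behavior the stimulus never exercised.
print("uncovered bins:", [op for op, hits in coverage.items() if hits == 0])
```

The payoff of the coverage step is exactly what the report shows: the constraint silently excluded MUL, and the empty bin makes that visible instead of leaving it to luck.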

The Four Pillars Of Hyperscale Computing


In his keynote at CadenceLIVE Americas 2020, Facebook’s Vijay Rao, director, Technology and Strategy, described the four core elements the team considers when designing their data centers—compute, storage, memory, and networking. Wait a minute. Facebook? How did we get here? Wasn’t EDA supposed to be focused on chip design? As indicated in a previous blog, electronic value chains are defi... » read more

Understanding The Performance Of Processor IP Cores


Look at any processor IP and you will find that its vendor emphasizes PPA (performance, power, and area) numbers. In theory, these should provide a level playing field for comparing different processor IP cores, but in reality the situation is more complex. Let us consider performance. The first thing to think about is what aspect of performance you care about. Do you care more about the ... » read more
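To make the "which aspect of performance" question concrete, here is a small, hypothetical Python sketch. The scores and clocks are invented, and the per-MHz normalization mirrors how benchmark results such as CoreMark/MHz are often quoted; nothing here is vendor data.

```python
# Why a single performance number can mislead: two made-up cores.
cores = {
    # name: (normalized benchmark score per MHz, achievable clock in MHz)
    "core_A": (5.2,  800),   # wide microarchitecture, modest clock
    "core_B": (3.1, 1600),   # narrow microarchitecture, high clock
}

for name, (score_per_mhz, clock_mhz) in cores.items():
    throughput = score_per_mhz * clock_mhz   # total work per second
    print(f"{name}: {score_per_mhz:.1f} per MHz, "
          f"{throughput:.0f} total at {clock_mhz} MHz")

# core_A wins on per-clock efficiency, core_B on raw throughput; which
# "performance" matters depends on the workload and the power budget.
```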

Challenges In Using AI In Verification


Pressure to use AI/ML techniques in design and verification is growing as the amount of data generated from complex chips continues to explode, but how to begin building those capabilities into tools, flows and methodologies isn't always obvious. For starters, there is debate about whether the data needs to be better understood before those techniques are used, or whether it's best to figure... » read more
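One commonly proposed starting point is test prioritization: learn from past regression runs which tests are likely to expose bugs, so scarce simulation cycles go to the most promising tests first. The sketch below is hypothetical; the features, data, and numbers are invented for illustration, and scikit-learn stands in for whatever would actually be built into tools and flows.

```python
from sklearn.linear_model import LogisticRegression

# Each row describes one regression test (invented features):
# [lines changed near its targets, historical failure rate, new coverage bins hit]
X = [[120, 0.30, 5],
     [  3, 0.01, 0],
     [ 45, 0.10, 2],
     [200, 0.50, 9]]
y = [1, 0, 0, 1]   # 1 = this test exposed a bug in past regression runs

model = LogisticRegression().fit(X, y)

candidate = [[80, 0.20, 4]]   # feature vector for an unscheduled test
print("predicted bug-finding probability:",
      model.predict_proba(candidate)[0][1])
```

Even this toy version illustrates the debate in the teaser: the model is only as good as the regression data behind it, which is why some argue the data must be understood before the techniques are applied.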
