Debugging Embedded Applications


Debugging embedded designs is becoming increasingly difficult as the number of observed and possible interactions between hardware and software continues to grow, and as more features are crammed into chips, packages, and systems. But there also appear to be some advances on this front, involving a mix of techniques, including hardware trace, scan chain-based debug, and better simulation ... » read more
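
To make "hardware trace" concrete, here is a minimal sketch of the circular-buffer pattern that trace infrastructure generally relies on. This is purely illustrative Python, not any vendor's trace API; real on-chip trace units (Arm's ETM, for example) are silicon blocks that stream compressed packets, but the keep-only-the-most-recent-history idea is the same: the target records events continuously, and the debugger reads the surviving window out after a trigger.

from collections import deque

class TraceBuffer:
    """Illustrative circular trace buffer: retains only the most recent
    `capacity` events, the way on-chip trace RAM retains recent history."""

    def __init__(self, capacity=8):
        self.events = deque(maxlen=capacity)  # oldest entries are overwritten

    def record(self, timestamp, event):
        self.events.append((timestamp, event))

    def dump(self):
        # A debugger would read this out over JTAG/SWD after a trigger fires.
        for ts, ev in self.events:
            print(f"{ts:>6}: {ev}")

# Hypothetical usage: record more events than the buffer holds,
# then dump the surviving (most recent) history.
trace = TraceBuffer(capacity=4)
for t, ev in enumerate(["reset", "init", "isr_enter", "isr_exit", "fault"]):
    trace.record(t, ev)
trace.dump()  # "reset" has been overwritten; the last 4 events remain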

Dealing With Market Shifts


Back in the days when I was in EDA development, I was taken with the ideas of Clayton Christensen when he published "The Innovator's Dilemma." He successfully introduced the technology world to the idea of disruptive innovation. One of the key takeaways was that you should always be working to make your own successful products redundant, or someone else will do it for you. One tool I worked... » read more

What’s Next For Emulation


Emulation is now the cornerstone of verification for advanced chip designs, but how emulation will evolve to meet future demands involving increasingly dense, complex, and heterogeneous architectures isn't entirely clear. EDA companies have been investing heavily in emulation, increasing capacity, boosting performance, and adding new capabilities. Now the big question is how else they can le... » read more

Pushing The Limits Of Hardware-Assisted Verification


As semiconductor complexity continues to escalate, so does the reliance on hardware-assisted simulation, emulation, and prototyping. Since chip design first began, engineers have complained that their design goals exceed the capabilities of the tools. This is especially evident in verification and debug, which continue to dominate the design cycle. Big-iron tooling has enabled design teams to k... » read more

Scaling Simulation


Without functional simulation, the semiconductor industry would not be where it is today, but some people in the industry contend it hasn't received the attention and research it deserves, causing performance to stagnate. Others disagree, noting that design sizes have increased by orders of magnitude while design times have shrunk, and pointing to this as evidence that simulation remains a suitable tool for the job... » read more
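
As a rough illustration of what event-driven functional simulation does, and why its performance tracks design size and activity, consider this toy two-gate example. It is a hypothetical sketch, not any commercial simulator's engine: a gate is re-evaluated only when one of its inputs actually changes, and each new output value is scheduled one time unit later.

import heapq

# Toy netlist: output net -> (function, input nets). Unit gate delay assumed.
NETLIST = {
    "n1": (lambda a, b: a & b, ("a", "b")),  # AND gate
    "y":  (lambda n1: 1 - n1, ("n1",)),      # inverter, so y = NAND(a, b)
}

def simulate(stimuli, end_time=20):
    values = {"a": 0, "b": 0, "n1": 0, "y": 1}
    # fanout: input net -> gates that must re-evaluate when that net changes
    fanout = {}
    for out, (_, ins) in NETLIST.items():
        for net in ins:
            fanout.setdefault(net, []).append(out)
    events = list(stimuli)  # (time, net, value) tuples
    heapq.heapify(events)
    while events and events[0][0] <= end_time:
        t, net, val = heapq.heappop(events)
        if values[net] == val:
            continue  # no change means no work -- the event-driven shortcut
        values[net] = val
        print(f"t={t}: {net} -> {val}")
        for gate_out in fanout.get(net, []):
            fn, ins = NETLIST[gate_out]
            new = fn(*(values[i] for i in ins))
            heapq.heappush(events, (t + 1, gate_out, new))  # unit delay

simulate([(0, "a", 1), (5, "b", 1), (10, "a", 0)])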

Power Optimization: What’s Next?


Concerns about the power consumed by semiconductors have been on the rise for the past couple of decades, but what can we expect to see in terms of analysis and automation from EDA companies, and is the industry ready to make the investment? Ever since Dennard scaling stopped providing automatic power gains with each move to a smaller geometry, circa 2006, semiconductors have been increasing... » read more
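
For context, the "automatic power gains" refer to classical Dennard scaling. Using the standard dynamic-power model $P = \alpha C V^2 f$ (a textbook derivation, not taken from the article): shrinking device dimensions and supply voltage by a factor $\kappa$ gives

\[
P' = \alpha \,\frac{C}{\kappa}\left(\frac{V}{\kappa}\right)^{2}(\kappa f)
   = \frac{\alpha C V^{2} f}{\kappa^{2}} = \frac{P}{\kappa^{2}},
\qquad
\frac{P'}{A'} = \frac{P/\kappa^{2}}{A/\kappa^{2}} = \frac{P}{A},
\]

so power per transistor fell as fast as area and power density held constant. Once supply voltage could no longer scale, around 2006, that guarantee disappeared.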

Standards, Open Source, and Tools


Experts at the Table: Semiconductor Engineering discussed what open source verification means today and what it should evolve into with Jean-Marie Brunet, senior director for the Emulation Division at Siemens EDA; Ashish Darbari, CEO of Axiomise; Simon Davidmann, CEO of Imperas Software; Serge Leef, program manager in the Microsystems Technology Office at DARPA; Tao Liu, staff hardware engineer... » read more

Steep Spike For Chip Complexity And Unknowns


Cramming more and different kinds of processors and memories onto a die or into a package is causing the number of unknowns and the complexity of those designs to skyrocket. There are good reasons for combining all of these different devices into an SoC or advanced package. They increase functionality and can offer big improvements in performance and power that are no longer available just b... » read more

How Heterogeneous ICs Are Reshaping Design Teams


Experts at the Table: Semiconductor Engineering sat down to discuss the complex interactions developing between different engineering groups as designs become more heterogeneous, with Jean-Marie Brunet, senior director for the Emulation Division at Siemens EDA; Frank Schirrmeister, senior group director for solution marketing at Cadence; Maurizio Griva, R&D Manager at Reply; and Laurent Mai... » read more

Computing Where Data Resides


Computational storage is starting to gain traction as system architects come to grips with the rising performance, energy, and latency costs of moving large amounts of data between processors and hierarchical memory and storage. According to IDC, the global datasphere will grow from 45 zettabytes in 2019 to 175 zettabytes by 2025. But that data is essentially useless unless it is analyzed or some amou... » read more
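
As a sanity check on the IDC projection cited above (an illustrative calculation, not from the article), growing from 45 ZB in 2019 to 175 ZB in 2025 implies a compound annual growth rate of roughly

\[
\left(\frac{175}{45}\right)^{1/6} - 1 \approx 0.254,
\]

or about 25% per year over the six-year span.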
