Who Will Own Debug?

Understanding what is happening is a key first step in debug.

Recently, I had an interesting conversation with a verification leader at one of the world’s leading semiconductor companies. He has some 150 verification engineers in his organization, and the group has been exploring EDA solutions for many years.

While we’ve exchanged many ideas about EDA and innovation, one sentence that he said stays in my head:

Whoever owns debug will own the entire verification flow.

Where the problem lies

Ask most EDA and verification leaders and influencers, and they will tell you how important debugging is. It is said to consume 50% of the overall verification effort. Much more importantly, the insufficient debugging capability of the ASIC verification community as a whole is responsible, at least in part, for the 30% first-silicon-success rate, delayed tapeouts, and unpredictable time-to-market of new designs across the industry.

But what is debug, really?

Sometimes I join a colleague for a debug session; on other occasions, I pause my own debug session to step back and review what I have come up with so far. If I could illustrate what this debug process looks like and how the debug environment is built, the picture would often be pure chaos: nothing is in order, there is no systematic way of drawing conclusions, and no way to repeat yesterday’s success.

Debugging is understanding

When we call something debugging, we mean that we do not understand how a state progressed from A to B.

We see a failure – something that was not expected. Often, we do not even fully understand the failure itself. On other occasions, we do understand the failure, but we have no clue how we got there. That is when we launch a “debug process.” We are looking for the root cause.

But what if we first focused our efforts on:

  • Understanding how our own stimuli generation is built and what its weaknesses are.
  • Understanding our modeling, i.e., the scoreboard.
  • Understanding what the SoC scenario looks like when we are tasked with helping on system tests.
  • Understanding what each line in the log means.

If you say this is impossible, you are almost right. It is not possible without data-analysis tools. We could perhaps manage it, provided that the dataset, the vocabulary of terms, and the conditions were small enough for our brains to handle.

The only problem is scale. When it comes to today’s complex ASICs and their gigantic testbenches, the data is simply too big to fully understand, even with most data-analysis tools.

The understanding platform

Vtool’s Cogita helps you understand the data.

Using Cogita, we encourage you to understand what is happening before anything else.

The first barrier in understanding something is language. The second is the ability to ask the right questions.

Imagine you could absorb millions of log lines and gigabytes of FSDB or TRN waveform databases and get something meaningful and easily digestible.

Impossible, for a human – but that’s what EDA is all about, right?
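To make the idea concrete, here is a minimal sketch of one way a tool could compress millions of log lines into something digestible: collapse lines into templates by masking out the values that vary, then count how often each template occurs. This is only an illustration of the principle, not Cogita’s actual implementation, and the log lines below are invented examples.

```python
import re
from collections import Counter

def summarize_log(lines, top=5):
    """Collapse log lines into templates by masking numeric fields,
    then return the most frequent templates with their counts."""
    counts = Counter()
    for line in lines:
        # Mask hex and decimal values so 'addr=0x10' and 'addr=0x20'
        # fold into the same template.
        template = re.sub(r'0x[0-9a-fA-F]+|\d+', '<NUM>', line.strip())
        counts[template] += 1
    return counts.most_common(top)

# Hypothetical simulation log lines for illustration only.
log = [
    "UVM_INFO @ 100: write addr=0x10 data=0xFF",
    "UVM_INFO @ 200: write addr=0x20 data=0xAB",
    "UVM_ERROR @ 300: scoreboard mismatch exp=0x1 got=0x2",
]
for template, n in summarize_log(log):
    print(n, template)
```

Three raw lines reduce to two templates; at the scale of a real regression, the same idea turns an unreadable log into a short ranked summary of what the testbench actually said.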

So who will own debugging? Whoever creates the language that makes us understand.
