A Different View On Debugging

Raising the abstraction level plus machine learning helps reduce time to market.


The classic way to tame an engineering task that has grown too complex in size and detail is to raise the abstraction level of its design representation. This is how we plan cities, build aircraft, and design 500M-gate SoCs.

For example, no ASIC design could scale beyond a few thousand logic gates without shifting abstraction to the Register Transfer Level (RTL) and leveraging logic synthesis. Similarly, in software design, languages such as C and Java were an absolute requirement for anything but the simplest of programs.

Raising the abstraction of the debug process allows the engineer to inspect broader scenarios, consider larger aspects of verification execution, and arrive at correct assumptions faster.

This is why Vtool has taken an abstract, visual approach to debug. The approach draws on psychological studies of problem-solving and large-scale data analysis, and applies modern Machine Learning (ML) algorithms to examine large data sets in a cognitive manner.

This strategy may be applied to large-scale block verification, and is particularly tuned to complex UVM test benches. It also brings unique debug power to tracking down intricate corner cases in SoC verification runs. Focusing on log information, together with key signals, minimizes the large data dumps required for traditional signal-level analysis, which makes it ideal for emulation debug as well as large-scale simulation regressions. It can be used as a standalone debug capability for particularly complex blocks and SoCs, or in conjunction with traditional debugging environments, providing a front-end that greatly accelerates them.

Some of the key challenges in the debug process are:

  • Reducing the chance of accepting an incorrect assumption as fact.
  • Supporting the process of asking the right questions.
  • Validating assumptions or answering questions faster.
  • Revealing a possible path that could otherwise have been overlooked.

Solving these challenges has a number of other benefits:

  1. Accelerate debug, by up to an order of magnitude, for a broad range of complex bug types. Given that debug represents 25% of the entire development time of a semiconductor, this is a huge resource-saving and time-to-market advantage.
  2. Improve design and verification quality beyond what coverage assessment offers, through visibility into verification scenarios and a clear understanding of convoluted design code, easing team communication and cooperation.
  3. Extend debug for large-scale system verification on emulation, tracking down complex issues directly without the need for re-simulation, while working cohesively with existing debug and simulation environments.

New Debug Elements, Building On The Old
Part of the impetus behind Vtool’s Cogita tool was a behavioral analysis of engineering debug, undertaken to understand which aspects of design and verification could drive a better ergonomic experience. This study led to four clear areas of focus: Abstraction, Visualization, Classification, and Navigation.

Ultimately, the engineer should be able to ask a question and get an answer.

The goal is to unify data from a broad range of inputs, including simulation UVM logs, VIP logs, emulator logs, software messages, and waveform databases, and present them as one comprehensive high-level view. This gives the user a more abstract perspective on all of this information at once.
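As a rough illustration of this kind of unification (the log format, field names, and parser below are hypothetical, not Cogita's actual inputs), a sketch in Python that normalizes heterogeneous logs into one sorted event timeline:

```python
import re
from dataclasses import dataclass

@dataclass
class Event:
    time_ns: int   # timestamp normalized to a common unit
    source: str    # which log the event came from (sim, VIP, emulator, ...)
    severity: str  # e.g. UVM_INFO, UVM_ERROR
    text: str      # the message body

# Hypothetical pattern for UVM-style lines such as:
#   "UVM_ERROR @ 1200: [MON] bad crc"
UVM_LINE = re.compile(r"(UVM_\w+)\s+@\s+(\d+):\s*(.*)")

def parse_log(lines, source):
    """Turn one log's raw lines into normalized Event records."""
    events = []
    for line in lines:
        m = UVM_LINE.match(line)
        if m:
            events.append(Event(int(m.group(2)), source, m.group(1), m.group(3)))
    return events

def unify(*event_lists):
    """Merge events from every source into a single timeline sorted by time."""
    merged = [e for lst in event_lists for e in lst]
    merged.sort(key=lambda e: e.time_ns)
    return merged
```

Once everything is a single sorted list of events, higher-level views (timelines, filters, pattern searches) can be built over one uniform data model instead of per-format tooling.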

Our brains can process visual data, when presented correctly, hundreds of times faster than textual data. Simply put, we understand and process visual structures well. It is interesting that over 50% of the human brain is dedicated to visual processing, a fact we often neglect when creating engineering tools.

What makes all of this work, however, is a sophisticated search mechanism that fetches textual messages and presents them as colored bars along a timeline. Numerical patterns and sequences of events can be grasped easily, streamlining the debugging process. It is this rapid visualization of compound event sets and search results that enables users to efficiently spot the anomalous among the normal, something that is not apparent from inspecting individual signals or text.
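A toy version of the idea, rendering messages that match a search pattern as marks in time bins along an ASCII timeline (a real tool would draw colored bars in a GUI; the `(time, severity, text)` event format is an assumption, not Cogita's data model):

```python
def timeline_bar(events, t_start, t_end, width=40, pattern=""):
    """Render events matching `pattern` as a one-line ASCII timeline.

    events: iterable of (time, severity, text) tuples.
    Each character is one time bin: 'E' marks a bin containing an error,
    '#' any other matching message, '.' an empty bin.
    """
    bins = ["."] * width
    span = (t_end - t_start) / width
    for time, severity, text in events:
        if pattern and pattern not in text:
            continue                 # search filter: keep only matches
        if not t_start <= time < t_end:
            continue                 # outside the visible window
        i = int((time - t_start) / span)
        if "ERROR" in severity:
            bins[i] = "E"            # errors take priority in a bin
        elif bins[i] != "E":
            bins[i] = "#"
    return "".join(bins)
```

Even in this crude form, a repeating pattern of packets, and the bin where the pattern breaks, stands out at a glance in a way that scrolling raw text never does.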

Another powerful way to understand and debug complex systems is the process of classification. Classification in debug manifests itself in several different ways:

  • Selecting the relevant from the irrelevant data under inspection at any given moment. Looking at 10,000 signals or log messages at once is impossible, so focusing on a few specific items of related data at a time is critical.
  • Identifying patterns. ASICs operate in cycles, and some test cases inject packets or transactions over and over throughout a test. When debugging a system, one of our jobs (and it is a hard one) is to identify such patterns or cycles and attach cause and effect to them.
  • Classifying ‘good’ vs. ‘bad’ patterns. Why did this random test fail while the other 1,000 in the regression passed? Why was this packet dropped while the rest were processed correctly?

The solution involves built-in Machine Learning (ML) algorithms that prepare and process the input data (i.e., log files and waveforms). That processed data is then classified and screened to make it simple to comprehend. The algorithm’s role is to classify massive amounts of data and to direct the next inspection point, while making assumptions and drawing conclusions during root-cause analysis.
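As a stand-in for that kind of classification (Cogita's actual algorithms are not published; the message format and scoring below are invented purely for illustration), one crude approach is to count message tags per run and rank the tags whose counts in the failing run deviate most from the average passing run, pointing at a next inspection point:

```python
from collections import Counter

def featurize(log_lines):
    """Count occurrences of each bracketed message tag (e.g. the [ID]
    field of a UVM-style message) -- a crude feature vector for a run."""
    tags = Counter()
    for line in log_lines:
        for token in line.split():
            if token.startswith("[") and token.endswith("]"):
                tags[token] += 1
    return tags

def distinguishing_tags(passing_runs, failing_run, top=3):
    """Rank tags by how far the failing run's count deviates from the
    average count over the passing runs."""
    total = Counter()
    for run in passing_runs:
        total.update(featurize(run))
    n = len(passing_runs)
    fail = featurize(failing_run)
    all_tags = set(total) | set(fail)
    scored = sorted(all_tags,
                    key=lambda t: abs(fail[t] - total[t] / n),
                    reverse=True)
    return scored[:top]
```

The point is not the algorithm (a production tool would use far stronger models) but the workflow: reduce thousands of messages per run to a small ranked list of "what is different about the failing run," so the engineer's next question is chosen by the data rather than by guesswork.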

To step efficiently along the cause-and-effect chain described above, one needs a navigation system that operates along the required dimensions of design space and time concurrently. Without it, we quickly become lost in the details, and by the time we have answered one question along the assumption chain, we have already forgotten the assumption that led to it.
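One simple way to keep that chain from being lost (an illustrative sketch only, not how any particular tool implements navigation) is to record each question as a breadcrumb that stores the assumption together with the point in design space and time where it was made:

```python
class AssumptionTrail:
    """A stack of debugging breadcrumbs: each entry records an assumption
    plus the scope and time it was made at, so that after diving several
    questions deep, the engineer can surface back to the assumption that
    started the chain instead of reconstructing it from memory."""

    def __init__(self):
        self._trail = []

    def dive(self, assumption, scope, time_ns):
        """Record a new assumption before following it deeper."""
        self._trail.append((assumption, scope, time_ns))

    def surface(self):
        """Close out the current question and return to the prior context."""
        return self._trail.pop()

    def current(self):
        """The assumption currently under investigation, if any."""
        return self._trail[-1] if self._trail else None
```

A GUI version of this is essentially a back/forward history keyed on (scope, time) rather than on pages, which is what lets navigation move through design space and simulation time concurrently.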

All of these attributes are leveraged in effective debug environments today. However, opportunities exist to greatly improve upon them, as we will see.

Vtool’s Cogita is equipped with a powerful Graphical User Interface (GUI) that helps the user navigate quickly and effectively between the different views required for efficient debugging. Included in this mechanism are powerful search and filtering options that extract the specific data of interest from a mass of irrelevant detail.

The goal has been to explore the essence of debugging and introduce a new capability, built on the four major characteristics required of such a tool: Abstraction, Visualization, Classification, and Navigation.
