
A Fireside Chat With Imagination On Hardware-Assisted Development And Emulation

The key is not only to choose the right platform for the job, but also to understand what’s possible with each one.


During one of my trips to Europe I was able to sit down with Colin McKellar, senior director of hardware engineering at Imagination Technologies. He is my main contact for all things verification at Imagination.

Imagination’s product challenges include maintaining the quality of their IP within ever-shorter timelines and dealing with the integration of that IP into increasingly complex chips, all in a very competitive environment. The quality and efficiency of verification and debug are key components of meeting those challenges. As with many of our customers, there is no real golden reference point to measure against, so the team’s main objective becomes finding more bugs earlier in a project.

Contrary to what we sometimes hear, emulation is not considered the most cost-effective way to find defects. The focus for emulation really begins after unit tests are done and blocks are tied together: validating and stress-testing the system level, checking whether overall performance and power are as expected, and validating the software/hardware ecosystem. Imagination pushes the available technologies and methodologies to minimize finding what would be considered “early bugs” in emulation. As with most of our customers, Imagination uses a combination of dynamic engines, from virtual prototyping to RTL simulation, emulation, and FPGA-based prototyping, with formal verification methodologies added to the mix.

Imagination uses FPGA-based prototyping for smaller IP cores. For bigger cores targeted at, for example, high-end mobile applications, FPGA-based prototyping has been more difficult to apply, mostly because of the need for RTL/software debug. For that, the team considers the Full Vision capabilities of our emulation platform very useful and applies them extensively in their verification methodology. One key priority is optimization of PPA – power, performance, and area. Although they can make decent approximations using FPGA-, simulation-, and modeling-based techniques, dynamic power analysis (DPA) offers the sweet spot in terms of speed, accuracy, and turnaround time. With DPA, Imagination verifies that their cores sustain low energy consumption and that each IP core design is optimal with respect to its target market.
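For context, and not as a description of how DPA computes its numbers internally, the textbook relation behind any activity-based dynamic power estimate is

P_{\text{dyn}} \approx \alpha \, C_{\text{eff}} \, V_{DD}^{2} \, f

where the activity factor \alpha is exactly what only realistic workloads exercise, which is why measuring power while real software runs on the design matters so much.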

Use-model versatility of emulation is important to the company. Imagination makes extensive use of both in-circuit emulation (ICE) and transaction-based acceleration (TBA). They have also invested in an offsite, state-of-the-art data center to host emulation, data filers, the compute grid, etc. I was not able to actually see the data center in which the hardware resides because it is remote. Contrary to what is sometimes reported about this use model, ICE setups are perfectly accessible remotely, and both multi-user access and the ability to handle verification loads of flexible sizes, down to 4 million gates, are considered very useful.

Colin confirms the importance of what we call “turnaround time”, the time it takes to get answers from debug. It comprises the actual design compile and bring-up, the execution until a bug occurs, and the time it takes to debug. In the GPUs that Imagination licenses, they look for two main failure categories – lock-up of steady-state execution and image corruption. Besides the fast compile capabilities, Imagination makes particular use of the State Description Language (SDL) to improve debug efficiency. It allows them to use emulation like a logic analyzer, driven by a set of sophisticated internal SDL scripts. These scripts let Imagination engineers dynamically and simply define what they are looking for, triggering to capture problem cases and greatly minimizing the need to rewind and re-execute. They always execute with Full Vision, maximum trace steps, and sophisticated SDL triggers. They can then debug offline, and can even make small code manipulations without a re-compile using the force/release capabilities.
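To make the logic-analyzer analogy concrete, here is a minimal sketch of the idea in Python. It is not SDL, whose syntax is tool-specific, and not Imagination’s actual scripts: a user-supplied predicate plays the role of the trigger, and a window of trace samples around the hit is kept so debug can happen offline rather than by rewinding and re-executing. All field names are hypothetical.

# Conceptual sketch only (Python, not SDL): logic-analyzer-style triggering
# over a recorded trace. When the trigger predicate fires, a window of samples
# around the hit is captured for offline debug.
from collections import deque
from typing import Callable, Dict, Iterable, List

def capture_on_trigger(samples: Iterable[Dict],
                       trigger: Callable[[Dict], bool],
                       pre: int = 4,
                       post: int = 8) -> List[Dict]:
    """Keep `pre` samples before the first trigger hit and `post` samples after it."""
    history: deque = deque(maxlen=pre)   # rolling pre-trigger buffer
    captured: List[Dict] = []
    remaining = 0
    for sample in samples:
        if remaining:                    # still filling the post-trigger window
            captured.append(sample)
            remaining -= 1
            if remaining == 0:
                break
        elif trigger(sample):            # trigger condition fired
            captured.extend(history)
            captured.append(sample)
            remaining = post
        else:
            history.append(sample)
    return captured

# Toy trace: a busy core whose program counter stops advancing at cycle 10,
# a crude stand-in for the steady-state lock-up failure mode mentioned above.
trace = [{"cycle": c, "busy": True, "pc": min(c, 10)} for c in range(30)]
window = capture_on_trigger(trace, lambda s: s["busy"] and s["cycle"] > s["pc"] + 1)
print([s["cycle"] for s in window])      # cycles captured around the lock-up

The point of working this way is that the expensive emulation run happens once; everything after the trigger fires becomes an offline data problem.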

Going back to low power consumption – it is a key challenge for Imagination. Verifying the complex switching on and off of the various power domains in the context of software has proven particularly interesting. Imagination uses UPF/CPF for some of their cores and provides UPF/CPF descriptions supported by emulation; they are currently transitioning to UPF 2.x. For the dynamic effects, DPA allows Imagination to evaluate power consumption in the context of firmware and API-level drivers, and an in-house playback mechanism lets them replay the relevant chunks of industry-standard benchmarks. Low power is a key differentiator for Imagination, so they want accurate information as early as possible and a fast turnaround time when making design changes. This allows architecture decisions to be sanity-checked early in the development flow.
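As a simple illustration of the kind of sequencing check that matters here, consider the hedged, generic Python sketch below. It is not UPF/CPF and not Imagination’s flow: it just walks a merged log of power-control and software events and flags any access to a domain that is currently switched off. The event and domain names are made up.

# Illustrative sketch only (generic Python, not UPF/CPF): flag software accesses
# to a power domain that has been switched off, based on a merged event log.
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Event:
    time: int
    kind: str       # "power_on", "power_off", or "access"
    domain: str

def check_power_sequencing(events: List[Event]) -> List[str]:
    powered: Set[str] = set()          # domains currently switched on
    errors: List[str] = []
    for e in sorted(events, key=lambda ev: ev.time):
        if e.kind == "power_on":
            powered.add(e.domain)
        elif e.kind == "power_off":
            powered.discard(e.domain)
        elif e.kind == "access" and e.domain not in powered:
            errors.append(f"t={e.time}: access to '{e.domain}' while powered down")
    return errors

# A firmware-driven sequence with one deliberate ordering bug.
log = [Event(0, "power_on",  "gpu_cluster"),
       Event(5, "access",    "gpu_cluster"),
       Event(7, "power_off", "gpu_cluster"),
       Event(9, "access",    "gpu_cluster")]   # flagged: domain already off
print(check_power_sequencing(log))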

In closing, Colin and I talked again about the continuum of engines from virtual prototyping through RTL simulation, formal, emulation, and FPGA-based prototyping. Of course, emulation is only one of the verification methods Imagination uses. In an ideal world, users would apply the full spectrum of engines and methodologies at the right time, but Colin assures me that we in EDA have some more work to do here. The lack of standardization and interoperability of metrics/coverage, the limited standardization of SystemC/UVM co-simulation, and the cost of entry in terms of education, training, and infrastructure deployment could all stand improvement. We are on the right path, but there is still more work to do.


