Dealing With The Data Glut

Emulators can be used for power analysis, but first the file sizes must be broken down into manageable pieces.

By Ann Steffora Mutschler

Tools like emulation and simulation are an absolute necessity to design and verify today’s complex SoCs, but what happens when you want to do power analysis and the activity files the emulator produces are too massive for the power analysis tools to handle?

Even with an emulator, a five-minute mobile phone call could take three months to run. Understandably, this issue is causing pain for many design teams around the world that want, and need, to take advantage of what emulation can provide.

“In different modes of operation—video streaming, GPS, voice calls or Web browsing, for instance—the various scenarios can create simulation files that are huge,” said Vic Kulkarni, senior vice president and general manager of the RTL Business Unit at Apache Design. “That’s where the problem is in terms of how to handle and accurately estimate power when looking at gigabytes and terabytes of simulation DUT (design under test) files. People use emulators to start with, applying them not only for functional stimulus creation, but also for power scenarios in various modes of operation for the FSDB profile.”

To illustrate this problem further, Preeti Gupta, director of product management of the RTL Business Unit at Apache Design, noted: “We did receive a 500-gigabyte FSDB file from a customer—the format in which an emulator would save the simulation activity on disk—and this was at least a year back, so the file sizes are probably getting bigger. This was just to save one video frame of data.”

For power analysis, a couple of things are important, said Pete Hardee, low-power design solution marketing director at Cadence. “You’ve got to be able to accurately characterize the circuit and you’ve got to be able to apply to that characterization realistic activity that represents the system modes. Where those system modes are very predictable and you can boil things down to simulation testbenches, it’s not so bad. But where those system modes are a little less predictable, then running deep [emulation] cycles becomes the only realistic way to do that.”

Regular simulation testbenches are way too limited and slow when it comes to running real data.

“What the emulator does is to capture a lot of activity from executing the hardware,” said Hardee. “What it’s missing is the characterization piece. The design mapped to the emulator is very different than the design mapped to the silicon. How do we deal with that? By mapping the design to the target library at the same time using our RTL compiler synthesis engine. Then the problem is to be able to take the activity on the emulator box and be able to map that accurately to the design that you just mapped to the target silicon library. Being able to process that and match it all up, because there are going to be subtle differences, you’ve got to pick what we term the ‘synthesis invariant point’ to be able to capture the activity and map that activity onto the design characterized with the target silicon library. All of that is built into the power analysis, so this is not a straightforward thing to do, but the advantages are very, very clear when you do it in terms of just being able to process more data.”
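To make that flow concrete, here is a minimal sketch of the idea, not Cadence’s actual implementation: toggle rates captured on the emulator are matched to the target-library netlist only at synthesis-invariant points (registers and primary I/O whose names survive synthesis), everything else falls back to an estimated activity, and a standard alpha * C * V^2 * f dynamic-power figure is summed over the nets. All names, capacitances and rates below are invented.

```python
# Illustrative sketch (not Cadence's implementation): transfer emulator
# activity to a target-library netlist at synthesis-invariant points
# (registers and primary I/O, whose names survive synthesis), then
# compute a rough dynamic-power figure. All names and numbers are made up.

VDD = 0.9          # supply voltage in volts (assumed)
CLK_FREQ = 1.0e9   # clock frequency in Hz (assumed)

# Toggle rates captured on the emulator, keyed by RTL signal name.
emulator_activity = {"cpu/pc_reg": 0.20, "cpu/alu_out": 0.45, "io/rx_data": 0.05}

# Netlist nets after mapping to the target silicon library, with the
# switched capacitance (in farads) characterized from that library.
netlist_nets = {
    "cpu/pc_reg":  {"cap": 3.0e-15, "invariant": True},
    "cpu/alu_out": {"cap": 5.0e-15, "invariant": True},
    "io/rx_data":  {"cap": 2.0e-15, "invariant": True},
    "cpu/u42/n17": {"cap": 1.0e-15, "invariant": False},  # synthesis-created net
}

def dynamic_power(activity, nets, default_toggle=0.1):
    """Sum alpha * C * V^2 * f over all nets, matching emulator activity only
    at synthesis-invariant points; other nets fall back to a default estimate."""
    total = 0.0
    for name, net in nets.items():
        if net["invariant"] and name in activity:
            alpha = activity[name]          # measured on the emulator
        else:
            alpha = default_toggle          # propagated / estimated activity
        total += alpha * net["cap"] * VDD ** 2 * CLK_FREQ
    return total

print(f"Estimated dynamic power: {dynamic_power(emulator_activity, netlist_nets) * 1e3:.3f} mW")
```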

From the perspective of Lauro Rizzatti, senior director of marketing for emulation at Synopsys, there are two areas where emulation can be deployed: one is verifying fine-grained low-power designs based on multiple power/voltage islands; the other is power estimation.

“This power island approach to reducing power consumption is sort of an extension—but a significant extension—of what the industry was doing maybe 10 years ago or even before, which was clock gating. In the old days, and still today, one way to reduce power, switching power, dynamic power, is turning off clocks when they are not used on an instant-by-instant basis. By extension, it’s not just about clocks, so why not include entire sections of the design—the islands—and have multiple independent power domains controlled either through hardware or the embedded software? This started to be experimented with at least five years ago, and at the time power domains were in the single digits—maybe two, four, six. Today domains are in the hundreds, which is mind-boggling. And, of course, switching them on and off means you take the risk that things might not work properly during the transition, which can compromise the functionality and integrity of the design. Testing that process of switching domains on and off is very, very important.”

Hundreds of power domains are exactly why the resulting files can’t be swallowed in one gulp by power analysis tools from the likes of Apache and Calypto.

“Performance, capacity and usability all become important when managing this data, not just for the power analysis itself but also for debug,” said Kulkarni. “If you have hotspots, what happens to those? That’s a problem we are looking at.”

Overall, to deal with the massive file sizes, Apache’s approach is to identify the power-critical signals, which speeds up emulation runs and shrinks DUT sizes (15 to 200x smaller). “At the same time, when you are selecting these power-critical signals, you already lose accuracy versus the full FSDB stimulus. So even if you create such large stimulus files, we can reduce that size and at the same time keep it within +/- 3%,” Kulkarni added.
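As a rough illustration of the kind of reduction Kulkarni describes, and not Apache’s actual algorithm, the sketch below ranks signals by their estimated contribution to dynamic power and keeps only the smallest set that covers most of the total; every signal name and number is invented.

```python
# Illustrative sketch, not Apache's algorithm: keep only the signals that
# dominate the power estimate so the activity file shrinks dramatically
# while the total stays within a few percent. All data below is invented.

def select_power_critical(signals, coverage=0.97):
    """signals: {name: (toggle_rate, capacitance_fF)}.
    Returns the smallest set of signals whose summed contribution reaches
    `coverage` of the total estimated dynamic power."""
    contrib = {n: rate * cap for n, (rate, cap) in signals.items()}
    total = sum(contrib.values())
    kept, running = [], 0.0
    for name, value in sorted(contrib.items(), key=lambda kv: kv[1], reverse=True):
        kept.append(name)
        running += value
        if running >= coverage * total:
            break
    return kept, running / total

signals = {
    "core/alu_out":  (0.45, 5.0),
    "core/pc_reg":   (0.20, 3.0),
    "bus/wdata":     (0.30, 4.0),
    "dbg/trace_bit": (0.01, 0.5),
    "io/spare_net":  (0.005, 0.2),
}

kept, accuracy = select_power_critical(signals)
print(f"kept {len(kept)}/{len(signals)} signals, covering {accuracy:.1%} of estimated power")
```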

Once the file sizes are manageable, power analysis tools can process them.

Jim Kenney, director of marketing for the emulation division at Mentor Graphics, said a lot of customers want to do power analysis with the emulator but they haven’t gotten to it yet. However, with interest extremely high, this won’t be the case for long.

He noted that Mentor also approaches power analysis with emulation along two paths: power-aware verification and power estimation. The power-aware piece verifies that all of the power islands work properly, can be powered down and can be powered back up, whether they are supposed to retain their state or whether you expect something to be corrupted, which would mean it has to be re-initialized. It also verifies all the control logic that runs the power islands, as well as the isolation logic: “When you power these things down you want to make sure that they don’t drag anything with them. So there’s isolation logic that allows the voltage on these to go to zero without impinging on the rest of the design.”

“In the emulator this is driven by the UPF—the same UPF you use in simulation. What we are seeing is customers wanting to use it to compare to simulation, where you are just basically verifying the functionality of these blocks. In emulation they want to run their power control software,” Kenney said.
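To give a flavor of what power-aware verification is checking, here is a toy model rather than real UPF semantics or an emulator flow: a domain is powered down, its isolated output must clamp to a safe value so nothing downstream is dragged along, and a retention register must restore its state on power-up. Everything in it is hypothetical.

```python
# Toy model of power-aware checks, not real UPF semantics: power a domain
# down, verify its isolated output is clamped so downstream logic is not
# dragged along, and verify a retention register restores its state.

class PowerDomain:
    def __init__(self, isolation_clamp=0):
        self.powered = True
        self.retained_reg = None      # value saved by a retention register
        self.reg = 0                  # a state register inside the domain
        self.isolation_clamp = isolation_clamp

    def output(self):
        # When the domain is off, isolation logic drives the clamp value
        # instead of an unknown, floating voltage.
        return self.reg if self.powered else self.isolation_clamp

    def power_down(self, retain=True):
        self.retained_reg = self.reg if retain else None
        self.powered = False
        self.reg = None               # contents are lost while unpowered

    def power_up(self):
        self.powered = True
        # Restore from retention if available, otherwise re-initialize.
        self.reg = self.retained_reg if self.retained_reg is not None else 0

dom = PowerDomain()
dom.reg = 42
dom.power_down(retain=True)
assert dom.output() == 0, "isolation failed: powered-down output not clamped"
dom.power_up()
assert dom.reg == 42, "retention failed: state not restored on power-up"
print("isolation and retention checks passed")
```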

Power estimation is a very different thing, however. “For power estimation, what the emulator provides is a very robust environment for running software, running real functional operations of the chip and then keeping track of the level of activity that happens—activity being mostly associated with the power consumption. What the emulator can do is output switching activity for consumption by power analysis tools from Apache or Calypto.”
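The hand-off Kenney describes boils down to exporting per-signal toggle counts over an observation window. The snippet below writes them in a deliberately simplified, made-up text format; real flows use FSDB or SAIF files, whose syntax is not reproduced here.

```python
# Hypothetical, simplified dump of emulator switching activity for a power
# tool. Real flows use FSDB or SAIF; this invented format just shows what
# information is being handed over: signal, toggle count, observation window.

def write_activity(path, window_ns, toggles):
    with open(path, "w") as f:
        f.write(f"WINDOW_NS {window_ns}\n")
        for signal, count in sorted(toggles.items()):
            f.write(f"TOGGLES {signal} {count}\n")

write_activity("frame0.activity", 16_000_000,
               {"core/alu_out": 7_200_000, "bus/wdata": 4_800_000})
```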

The trick with the emulator, Kenney said, echoing the comments above, is that it can output massive amounts of data these tools can’t consume. “Our focus at this point is coming up with clever ways to get enough information for an accurate power estimation without overloading the tools: things like doing sparse sampling for peak activities and then saying, ‘Give me the toggle information for this limited time where I have a big peak that I’m concerned about.’” Mentor and Apache have formed a partnership in this area to address this.
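A minimal sketch of the sparse-sampling idea Kenney outlines, not Mentor’s implementation: scan a cheap, coarse per-window activity profile, pick the windows with the biggest peaks, and only request detailed toggle data for those. The profile data is invented.

```python
# Illustrative sketch of sparse sampling, not Mentor's implementation:
# use a cheap, coarse activity profile to find peak windows, then ask for
# expensive detailed toggle data only in those windows. Data is invented.

def peak_windows(coarse_profile, top_n=2):
    """coarse_profile: list of (window_start_ns, total_toggles).
    Returns the top_n windows with the highest activity."""
    return sorted(coarse_profile, key=lambda w: w[1], reverse=True)[:top_n]

# Coarse profile: one total toggle count per 1 ms window of emulation.
coarse = [(0, 1.2e6), (1_000_000, 9.8e6), (2_000_000, 1.5e6),
          (3_000_000, 8.9e6), (4_000_000, 1.1e6)]

for start_ns, toggles in peak_windows(coarse):
    # In a real flow this is where the window would be replayed on the
    # emulator with full toggle dumping enabled.
    print(f"request detailed toggles for window starting at {start_ns} ns "
          f"({toggles:.1e} coarse toggles)")
```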


