The Double Whammy

The confluence of low power design and analog/mixed-signal content is causing huge headaches in full-chip verification.


By Ann Steffora Mutschler
Given that every SoC at 40nm and below has some mixed-signal content, and that power awareness is a top priority no matter what the target application, design teams and verification engineers are grappling with tremendous challenges just to get a chip to yield.

“For verification engineers and for designers, this is a double whammy,” noted Piyush Sancheti, vice president of product marketing at Atrenta. “If you ask a digital design or digital verification team, they will tell you that low-power design and the introduction of analog/mixed-signal components on what used to be a simple digital chip is a significant verification challenge. For verification engineers what this means is your finite state machines or your control logic just got that much more complicated. If you go from two domains to 20 domains, your verification complexity just increased by an order of magnitude.”
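Sancheti's point about domain count is easy to quantify: if each power domain can be switched on or off independently, the number of domain-state combinations the control logic can reach grows exponentially, even before counting the transitions between those states. A minimal sketch:

```python
# Each independently switchable power domain can be on or off, so the
# number of domain-state combinations grows as 2**n. Transitions between
# states (what actually has to be verified) grow even faster.
def domain_states(n_domains: int) -> int:
    return 2 ** n_domains

print(domain_states(2))   # 4 combinations for 2 domains
print(domain_states(20))  # 1,048,576 combinations for 20 domains
```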

Analog designers are witnessing the same thing. The real challenge on the analog side is designing analog circuits on low-power digital SoCs.

“Analog low-power techniques were free-form,” said Hany Elhak, senior product marketing manager at Synopsys. “Now, when you have an analog circuit on a digital SoC, it has to be on advanced nodes, so designers need to deal with leakage, which is typically not a big problem if you’re designing an analog circuit on a 0.18-micron analog process. Another issue is that the analog/mixed-signal block on the digital SoC has to comply with the digital methodology. But today the digital methodology for low power is very complicated, with techniques for multiple power domains, power gating and body biasing, and the analog block just has to cope with that. On the analog side, this still needs to be done manually. In the digital world, there are UPF files that describe all the power domains and the power-on and power-off procedures. Analog blocks cannot understand UPF files. The analog designer needs to design the block so it complies with whatever power domain the UPF specifies, and he still needs to verify that. But he doesn’t have all the nice place-and-route and formal verification capabilities that digital designers enjoy.”
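For readers unfamiliar with the format, a UPF description of a switchable power domain might look like the fragment below. This is an illustrative sketch only: the instance and net names are invented, and the exact command syntax is defined by the IEEE 1801 (UPF) standard.

```tcl
# Illustrative UPF sketch: a top-level always-on domain plus a switchable
# domain containing a hypothetical analog block instance u_adc.
create_power_domain PD_TOP
create_power_domain PD_ANALOG -elements {u_adc}

create_supply_port VDD
create_supply_net  VDD -domain PD_TOP

# Power switch that gates the analog block's supply under control of pwr_en.
create_power_switch sw_analog -domain PD_ANALOG \
    -input_supply_port  {in  VDD} \
    -output_supply_port {out VDD_ANALOG} \
    -control_port       {ctrl pwr_en} \
    -on_state           {on_state in {pwr_en}}
```

Digital tools read this file and implement the isolation, retention and switching it implies; as Elhak notes, the analog block sees none of it and must be designed and verified against the same intent by hand.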

Where to start
With the complexity from the manufacturing process, combined with the low power techniques and the analog/mixed-signal features, engineering teams need to consider both sides of the power story, said William Ruby, senior director of RTL power product engineering at Apache Design.

This means considering how the design will be impacted both from a power integrity and a power consumption perspective, he said. On the power integrity side, analog circuits, which deal with continuous signals, are very sensitive to noise—specifically power supply noise. Dynamic voltage drop and other kinds of on-chip noise can do more than just upset an analog circuit. They can prevent it from functioning at all, so it is important to verify the analog and digital together from a power perspective.

Mixed-signal verification and full-chip verification for power integrity also are becoming very important because of the emergence of localized power reduction techniques such as power gating, voltage domains, and even extensive clock gating. “All of these things can cause pretty large power supply transients, which can kill an analog circuit if it’s not carefully designed, analyzed and laid out,” Ruby said.

On the power consumption side, when there is analog and digital, or mixed signal and digital together, it’s just an SoC design. “It obviously has its own power specifications that it needs to meet, and what happens now is that you’re getting into what exactly these power specifications mean,” he explained. “A lot of times you have this much power consumption in this mode versus this much power consumption in that mode. Mobile phone SoCs will have a very different power profile during a voice call than a GPS search, or versus an Internet browser. Things like that are very important. Analog circuitry also works in different modes. There are a lot of digital controls coming into the analog blocks that will affect the amplifier or change the characteristics of the filter that turn things on and off on demand. From a power consumption perspective, the functionality of the analog needs to be analyzed together with the digital controlling portion to truly understand power consumption.”
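Ruby's point about mode-dependent power specifications can be made concrete with a toy calculation. Average power is just the duty-cycle-weighted sum of per-mode power; the mode names and milliwatt figures below are invented for illustration, not taken from any real SoC.

```python
# Toy mode-based power estimate. Average power is the duty-cycle-weighted
# sum of per-mode power. All figures are invented for illustration.
modes = {
    # mode: (power_mW, fraction_of_time)
    "voice_call": (350.0, 0.10),
    "gps_search": (500.0, 0.05),
    "browsing":   (700.0, 0.15),
    "standby":    (5.0,   0.70),
}

def average_power_mw(profile):
    # Sanity check: the duty cycles must account for all of the time.
    assert abs(sum(f for _, f in profile.values()) - 1.0) < 1e-9
    return sum(p * f for p, f in profile.values())

print(f"{average_power_mw(modes):.1f} mW")  # 168.5 mW
```

The same bookkeeping gets much harder when, as Ruby describes, digital controls are reconfiguring the analog blocks on demand, because the per-mode power numbers themselves then depend on the analog state.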

A key part of this is the ability to model the analog circuitry, just as it is done in the digital realm.

Kishore Karnane, product manager for mixed-signal verification and Specman at Cadence, said analog modeling has been done for a very long time using Verilog-A (the analog subset of Verilog-AMS). But even when behavioral models are available, they take a very long time to simulate because they still run on an analog solver.

To address this, EDA vendors across the board are taking those analog models and recasting them in a digital context. Cadence, for example, calls its approach digital-centric mixed-signal (DMS) verification, while Apache’s modeling is done by its Totem tool, which creates both thermal and power models.

Karnane said Cadence has been showing customers how to take any analog models at the transistor level and model them using real number models, which gives significant performance improvement because there is no analog solver required. Everything is running at digital speeds. “Customers can now do a full SoC verification or even regression runs all using digital simulators. So everything is modeled in a digital context.”

Cadence’s ‘wreal’ technology extends the Verilog-AMS LRM (language reference manual) with new constructs that allow for analog modeling in a digital context. The company also is working on the same capabilities for SystemVerilog, which should be completed by next year. The benefit of this approach is that once the analog models are brought into the digital context, all of the digital verification methodologies can be applied to them, and full coverage-driven/metric-driven verification can be done in a mixed-signal context.
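The core idea behind a real-number model is that an analog quantity becomes a discrete-time real value updated on digital events, so no differential-equation solver ever runs. A first-order low-pass filter, the kind of block a wreal model might describe in Verilog-AMS, can be sketched the same way in Python. The pole frequency and time step below are invented for illustration.

```python
import math

# Discrete-time real-number model of a first-order RC low-pass filter.
# The analog state is a single real value updated once per digital time
# step, so an event-driven digital simulator can evaluate it directly;
# no analog solver is required.
def lowpass_step(y_prev, x, fc_hz, dt_s):
    alpha = 1.0 - math.exp(-2.0 * math.pi * fc_hz * dt_s)
    return y_prev + alpha * (x - y_prev)

# Drive the model with a unit step and watch it settle toward 1.0.
y, fc, dt = 0.0, 1.0e6, 1.0e-8   # 1 MHz pole, 10 ns time step (invented)
for _ in range(1000):
    y = lowpass_step(y, 1.0, fc, dt)
print(round(y, 3))  # settles to 1.0
```

The trade-off is fidelity: the model captures the filter's behavior at the chosen time resolution, not the continuous waveform a SPICE-level simulation would produce, which is exactly why it runs at digital speeds.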

The low power connection
Adam Sherer, verification product management director at Cadence, noted that where this really helps with low power is that at 40nm and below virtually every SoC has some mixed-signal component, so there’s some analog on that SoC. “It’s so complex, it’s running so fast, that customers are all using some sort of power management just to keep the chips cool enough to function. Even chips in communication devices that are wired into some power backplane at a server farm, in a cloud configuration, at the scale of 40nm, need to consider power just to be cool enough to run.”

“When we’re talking about managing the power domains, one of the things our customers are looking for is, ‘Will the domains wake up and shut down properly? What’s the interaction among them? How do you manage feed-through circuits?’ Inevitably, we’re running into analog functions in that check. So it’s actually a functional test, which we really can’t do with a transistor or the old style behavioral models. You almost have to use these digital mixed-signal models for the system to run fast enough for the low power verification,” he explained.
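The wake-up and shutdown checks Sherer describes are essentially temporal assertions over power-control signals. A minimal, hypothetical version of one such check (isolation must be enabled before a domain's supply is switched off, and released only after the supply returns) can be sketched as an event-trace checker; the signal names are invented.

```python
# Hypothetical power-gating sequence check: isolation must be enabled
# before the domain supply switches off, and may only be released while
# the supply is on. Events are (signal, value) pairs in time order.
def check_shutdown_order(events):
    iso_on = False
    power_on = True
    for sig, val in events:
        if sig == "iso_en":
            if not val and not power_on:
                return False          # isolation released while powered down
            iso_on = val
        elif sig == "pwr_en":
            if not val and not iso_on:
                return False          # powered down without isolation
            power_on = val
    return True

good = [("iso_en", True), ("pwr_en", False), ("pwr_en", True), ("iso_en", False)]
bad  = [("pwr_en", False), ("iso_en", True)]
print(check_shutdown_order(good), check_shutdown_order(bad))  # True False
```

In a real flow these checks run as assertions inside the simulator, and, as Sherer notes, the traces cross into analog territory, which is why the analog has to be modeled fast enough to sit inside the same digital simulation.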

Another key part of this approach is to generate verification plans for low-power from the power format (CPF or UPF). Now that there is mixed-signal content, it can contribute coverage as part of that overall verification plan, Sherer added.
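Generating a verification plan from the power format amounts to enumerating, for each set of domains the CPF or UPF declares, the states (and eventually transitions) that coverage must hit. A toy sketch of that enumeration, with invented domain names:

```python
from itertools import product

# Toy verification-plan generator: for a set of power domains declared in
# a power format file, enumerate the cross of on/off states that
# coverage should eventually hit. Domain names are invented.
def coverage_bins(domains):
    states = ("on", "off")
    return [dict(zip(domains, combo))
            for combo in product(states, repeat=len(domains))]

bins = coverage_bins(["PD_CPU", "PD_ANALOG", "PD_RADIO"])
print(len(bins))  # 8 cross-coverage bins for 3 domains
```

Mixed-signal content contributes to the same plan: a real-number model of an analog block can report whether it was exercised in each of these domain states, which is what Sherer means by analog contributing coverage.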

Don’t forget the thermal
There is another layer on top of understanding how analog/mixed-signal circuits behave in a digital SoC: temperature. “One of the big deals we could learn for low-power mixed-signal design is to be able to do co-simulation, or simulation of circuit design, as a function of temperature,” said Gene Matter, senior applications manager at Docea Power. “If these things are built on a common substrate (either PCB or silicon die or package), you’re going to have a thermal gradient, and that is going to produce co-adjacent heating of other components.”

“It’s a really big deal to be able to do co-simulation where you vary the operating points of the system for voltage and frequency, and have solvers that do the circuit simulation or the power modeling simulation as a function of operating temperature, so you can pick the optimal operating point: the voltage and frequency of operation for the workload. You match the behavior of the system to the demands of the application, and the only way to do that is with the same techniques we use for power modeling,” he said.
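Matter's "pick the optimal operating point" step is, at its core, a search over voltage/frequency pairs using a power model in which leakage depends on temperature. A deliberately simplified sketch, with every coefficient invented (real flows use characterized libraries, not closed-form formulas):

```python
import math

# Simplified operating-point search. Dynamic power scales as C * V^2 * f,
# and leakage current grows roughly exponentially with temperature.
# All coefficients below are invented for illustration.
def total_power(v, f_hz, temp_c, c_eff=1e-9, i_leak0=0.01, k=0.03):
    dynamic = c_eff * v * v * f_hz
    leakage = v * i_leak0 * math.exp(k * (temp_c - 25.0))
    return dynamic + leakage

def best_point(points, f_required_hz, temp_c):
    # Among the (voltage, frequency) pairs that meet the performance
    # requirement, pick the one with the lowest total power at this temperature.
    feasible = [(v, f) for v, f in points if f >= f_required_hz]
    return min(feasible, key=lambda vf: total_power(*vf, temp_c))

points = [(0.8, 400e6), (0.9, 600e6), (1.0, 800e6), (1.1, 1000e6)]
print(best_point(points, 500e6, 85.0))  # selects the 0.9 V / 600 MHz point
```

The co-simulation Matter describes closes the loop: the chosen operating point changes the dissipated power, which changes the temperature, which changes the leakage term feeding the next choice.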

The desire to do this has always been there, but the tools and methodology have mostly been geared toward designing the circuit to meet performance characteristics.

At the end of the day, the problems of low-power, mixed-signal design and verification really stem from the fact that up until now people have mostly kept their digital and analog designs separate, put them together at the end of the process, and then tried to see whether they work, observed Martin Vlach, chief technologist for AMS at Mentor Graphics.

“Of course that is problematic now for several reasons. One is that the feature sizes are so small that you actually have to worry about that part. But mostly the designs are getting very large and the analog parts are controlled by digital. There are an incredible number of states that are difficult to deal with using the analog ways of testing things, which are still kind of ad hoc. We’re looking at applying true verification methodologies in analog and mixed-signal design. AMS people will probably adopt and adapt the UVM methodology of digital systems,” he added.

What’s also needed is more public discussion about various approaches in use and the success or failure of different approaches. So far, details are very sketchy. But whatever the approach, there will need to be a method for modeling the entire design that includes both digital and analog/mixed-signal functions such that simulation and verification are done in one system. Complexity and power concerns demand it.