While analog and digital verification efforts have long been essentially separate, their closer integration is forcing a rethinking of standards.
We live in an analog world, but analog content has been minimized whenever possible. At some point digital and analog must come together in every electronic device, and that boundary has long been an area where errors creep in.
The Wilson Research Group and Siemens EDA functional verification study has long shown that analog and mixed-signal flaws are among the leading causes of chip respins. The latest survey results show a precipitous drop in first-time silicon success rates, and not just for the largest and most complex designs. The 2022 study also revealed that 47% of ASICs with fewer than 1 million gates had respins due to analog issues.
All aspects of verification need to be re-examined to find out where improvements can be made.
Fig. 1: Types of flaws resulting in respins from 2016 – 2022. Source: Wilson Research Group/Siemens EDA. *Multiple respins possible
Analog design and verification have not evolved at the same pace as their digital counterparts. Analog design has remained a manual task performed by experts and handed over the wall for system-level integration.
“The analog and digital domains have been fairly independent until the last 10 years,” says Satish Balasubramanian, head of product marketing and senior director for custom IC at Siemens EDA. “That is when we started seeing a lot more IP reuse. A lot of analog IP is going into different SoCs, and the time-to-market needs have become much more aggressive.”
Investment in analog technology is much lower than in the digital domain. “Analog is a smaller market than digital,” says Andy Heinig, head of department for efficient electronics at Fraunhofer IIS’ Engineering of Adaptive Systems Division. “If the amount of effort and development money spent on analog was the same as what is spent on digital, things could be very different. If you put the same money in, maybe you will see faster development.”
Change is needed. “This past approach is unsustainable with today’s designs,” says Paul Graykowski, product marketing director for the System Verification Group of Cadence. “Previously distinct analog and digital functional blocks are now intertwined. We no longer can design them in a silo and throw them over the fence for someone else to worry about. Today’s analog blocks, such as digital interfacing analog circuits (e.g., ADCs with digital calibration loops or voltage regulators with digital trimming logic), demand earlier and more rigorous verification within the digital-centric verification flow.”
Out of necessity, designs are evolving. “The most important thing is that we are starting to integrate analog IP, not only into discrete ICs, but much more embedded within the SoC,” says Siemens’ Balasubramanian. “And analog is being done on the same advanced process nodes. Gone are the days when you just plug in a black box and say it’s going to work. We need to start driving and verifying the entire system. It’s not easy to herd the entire team to come up with the proper standard.”
Complexity
While device numbers for digital swamp those of analog, analog has different types of complexity. “In general, analog has more states,” says Fraunhofer’s Heinig. “It’s not only zero and one. That makes it very complex to standardize something, because you have an infinite number of numbers between zero and one, and how you model that is very complex, and often very application-specific.”
Digital productivity results from abstractions that are not possible in analog. “The lag in mixed-signal standard development arises from the inherent complexity of analog design, which involves continuous signals and intricate physical phenomena such as parasitic effects, noise, environmental interference, and process variation that complicate modeling and verification,” says Dave Cronauer, principal engineer at Synopsys. “Integrating analog with digital verification frameworks requires significant R&D to bridge fundamentally different domains.”
And while digital has to undergo several types of verification, it is very limited compared to analog. “With digital you have one or two types of simulation, and that satisfies everything,” says Heinig. “If you go to analog simulation, you have 10, 15, 20 different types of analog simulation — transient analysis, for example. All of them make sense, and all of them are necessary. But it shows it’s not that easy to find one solution that fits all.”
This has resulted in different approaches to verification. “Many view analog verification methodologies as being behind their digital counterpart,” says Cadence’s Graykowski. “The reality is the different methodologies evolved out of necessity due to fundamental differences in approaches and skillsets. Traditionally, analog verification relied heavily on visual waveform inspection through SPICE-level simulations. In contrast, digital verification embraced scalable, reusable, and automated methodologies built around hardware description languages (HDLs) and the Universal Verification Methodology (UVM).”
Changes ahead
That could be about to change. “Getting designers to change is always a difficult thing,” says Tom Fitzpatrick, strategic verification architect at Siemens EDA and chair of the Accellera UVM-MS and SystemVerilog-MSI committees. “The nice thing about it is that we can abstract some of the analog behavior. A verification engineer who understands the need to create a transaction that has amplitude and frequency, and that runs alongside an Ethernet packet transactor, will be able to do those kinds of things. There’s going to be some additional work needed to define the mixed-signal bridge for a UVM approach. There might be some additional design work. But once you define that, we can take advantage of the UVM ecosystem. And now you can start having third-party IP that does analog as well, using the same infrastructure that the UVM ecosystem has. I can easily see providers offering an analog VIP with a bridge embedded in it to provide the transaction types. And then the verification person just needs to say, ‘I have a transaction that I can use with this VIP that makes analog stuff happen.’”
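To make that concrete, the amplitude-and-frequency stimulus Fitzpatrick describes could be captured as an ordinary UVM sequence item. The following is a minimal sketch, not code from the UVM-MS standard; the class and field names are hypothetical, and scaled integers are randomized because direct randomization of real variables is only a recent SystemVerilog addition.

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Hypothetical analog-aware transaction: the UVM machinery is standard,
// only the amplitude/frequency payload fields are specific to this sketch.
class analog_stim_tr extends uvm_sequence_item;
  `uvm_object_utils(analog_stim_tr)

  // Randomize scaled integers, then derive reals in the accessors.
  rand int unsigned amp_mv;    // amplitude, millivolts
  rand int unsigned freq_khz;  // frequency, kilohertz

  constraint c_amp  { amp_mv   inside {[100:3300]}; }   // 0.1 V to 3.3 V
  constraint c_freq { freq_khz inside {[1:100_000]}; }  // 1 kHz to 100 MHz

  function new(string name = "analog_stim_tr");
    super.new(name);
  endfunction

  function real amplitude(); return amp_mv / 1000.0;   endfunction // volts
  function real frequency(); return freq_khz * 1000.0; endfunction // hertz
endclass
```

Such a transaction runs through stock UVM sequencers and drivers; only the component that converts it into analog activity, the bridge discussed below, is new.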
With the growing interest in multi-physics and other system-level forms of analysis, integrating other physical domains into the digital verification flow is going to become increasingly important. “These other domains have the same kinds of problems as analog,” says Heinig. “Currently, we don’t see any standards that allow us to do good power and thermal simulation of complex systems. Maybe it’s not that easy to address such complex questions, or to build standards and models. Maybe it is a domain where you need very specific models and simulation tools.”
Ironically, the first standard language in the semiconductor industry was an analog one: SPICE, created in the early 1970s. Attempts have been made to create more modern and abstract languages, such as SystemC-AMS, but results have been mixed.
“The SystemC-AMS standard is fine, but we have problems because the syntax and semantics are too complex for analog designers,” continues Heinig. “We often see them use it in the wrong way, or describe something incorrectly, and this results in simulations that take too long or give the wrong results. For an analog engineer, it’s very hard, very complex. It is useful if you use it in the right way. We also see people attempt to model too much, and the runtime is too long. You have to find the right level of abstraction, but there are no mechanisms, no automation to help you. Highly skilled engineers find the right level, but for young people it’s very hard to be efficient. It’s a steep learning curve.”
There are hurdles that must be overcome. “The initial effort involved in creating the mixed-signal bridge and the need for specialized knowledge to accurately model and verify complex analog behaviors can be significant barriers to adoption,” says Synopsys’ Cronauer. “While widespread adoption takes time, tools are making progress to support UVM-MS, offering higher-quality mixed-signal design verification.”
A lack of highly skilled analog engineers exacerbates the problem. “Adding mixed-signal capabilities to SystemVerilog and UVM has taken some time to come to fruition because analog verification inherently requires expertise in continuous-time modeling, precise behavioral abstraction, and accurate representation of analog characteristics through real number modeling (RNM),” says Graykowski. “This complexity has led to a smaller, more niche set of engineers in the know, reducing the overall contributor base, slowing standardization efforts, and often leading to fragmented, vendor-specific solutions.”
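Real number modeling, for reference, replaces electrical solving with event-driven real-valued signals that a digital simulator can handle natively. A minimal RNM sketch of an ideal 8-bit DAC, with an assumed reference voltage, looks like this:

```systemverilog
// Minimal real-number model (RNM) of an ideal 8-bit DAC. No analog
// solver is involved; the output is a real variable updated by events.
module dac8_rnm (
  input  logic [7:0] code,   // digital input code
  output real        vout    // modeled analog output, volts
);
  parameter real VREF = 1.2; // full-scale reference (assumed value)

  always_comb vout = VREF * code / 256.0;
endmodule
```

The simplicity is the point. The model simulates at digital speed, and the scarce expertise lies in deciding which analog effects can safely be abstracted away.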
Digital engineers have focused on their problems to the detriment of analog. “Part of it is the mindset of the SystemVerilog folks,” says Siemens’ Fitzpatrick. “They didn’t think much about the analog side. The engines were different. Eighteen years ago it wasn’t really necessary to combine them. There was an attempt with Verilog-A, and eventually Verilog-AMS, to bring that level of abstraction to the analog space, but the engines were different. There wasn’t really a need to have a single language to do that. But we’re getting to the point where the engines are becoming more seamlessly integrated. There is a value in having a single language to do that. The problem is that the two languages are very different in the way they look at the problem. Analog always has been based on a math-based solver, while SystemVerilog is event-based. Trying to merge the two has been somewhat difficult. We need a way, from a language perspective, of clearly defining the boundary between the two. That was something that the Verilog-A guys were aware of initially, but the SystemVerilog guys didn’t care about it. There just wasn’t the need from the SystemVerilog side at that time. But now we’re getting there.”
Formalization on the verification side is what made much of the advance in digital complexity possible. “What UVM did for the digital side was to make the verification process a lot more modular, repeatable, and scalable,” says Balasubramanian. “Now it has started getting into the analog domain. People have been talking about metric-driven verification on the analog side, and this is a step where we definitely can standardize. The second thing is that analog never had a proper standard. For example, every digital simulator has a DPI, whereas none of the analog simulators have a DPI or the same interface that can talk to anyone. It’s just the way that technologies were built. Analog is more complex, and it was in a different process technology, as well.”
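The DPI in question is SystemVerilog’s Direct Programming Interface, a standardized mechanism that lets any compliant digital simulator call foreign C code. A minimal sketch follows; the imported function name is hypothetical, and its body would be compiled and linked separately as C.

```systemverilog
// SystemVerilog side of a DPI-C call. The imported function is
// hypothetical; analog simulators have no equivalent standard hook.
import "DPI-C" function real solve_filter(input real vin, input real dt);

module dpi_demo;
  initial begin
    real vout;
    vout = solve_filter(0.5, 1.0e-9);  // cross into C during simulation
    $display("filtered value = %g", vout);
  end
endmodule
```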
Two new standards
Some of the advantages of digital verification are now being extended into the mixed-signal domain. Accellera recently released the Universal Verification Methodology for Mixed Signal (UVM-MS) standard, and a second standard, the SystemVerilog-Mixed Signal Interface (MSI), is nearing completion.
“UVM-MS is a use model more than anything else,” says Fitzpatrick. “There’s a module that you put between your UVM test bench, the virtual interface that you connect to, and the DUT. It includes a proxy class object, but basically it takes a UVM transaction, which would include instructions for the analog generator, and sends it to a bridge. In the bridge, that information is used to drive the analog over to the DUT. Similarly, on the reverse path, data is collected within the analog circuitry, and the bridge fills that into a transaction and sends it back out. From the UVM side, nothing changes. You’re still just talking to an interface through your driver.”
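In rough schematic form, such a bridge is a module that exposes a driving API toward the UVM side and turns transaction fields into a time-varying real value toward the DUT. The following is an illustrative stand-in only; the module name, the drive() function, and the fixed 1ns update step are assumptions, and the Accellera UVM-MS reference defines the normative proxy-class pattern.

```systemverilog
// Illustrative stand-in for a UVM-MS style bridge: transaction fields
// in, continuous-looking waveform out.
module sine_bridge (output real vout);
  timeunit 1ns; timeprecision 1ps;

  real amp  = 0.0;  // volts, set by the UVM driver via its proxy
  real freq = 0.0;  // hertz

  // The driver calls this to apply a transaction's fields.
  function void drive(real amplitude_v, real frequency_hz);
    amp  = amplitude_v;
    freq = frequency_hz;
  endfunction

  // Crude fixed-step waveform generation; $realtime here is in ns.
  always #1
    vout = amp * $sin(2.0 * 3.141592653589793 * freq * $realtime * 1e-9);
endmodule
```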
This standard is immediately adoptable. “The reference UVM-MS methodology provided by the Accellera working group already runs without modification on existing tools,” says Graykowski. “This accessibility dramatically shortens the usual adoption gap seen with new standards. As analog design evolves from standalone blocks to closely integrated analog-digital functionalities, UVM-MS represents a timely solution and a necessary evolution. By bridging the historical analog-digital divide, UVM-MS ensures verification methodologies can keep pace with the rapid innovation and increasing complexity of modern SoC designs.”
The second emerging standard looks at the modeling side. “We needed to standardize a mechanism that would allow the two domains to talk together,” says Fitzpatrick. “We realized that even with the use model as it is, trying to do bi-directional signals was difficult. Some of the tools do things such as driver/receiver segregation. We decided that for UVM-MS to be seamless, we wanted to be able to support bi-directional signals, and that led to the SystemVerilog-MSI effort.
“This has to be implemented in SystemVerilog to make it work, and that’s really why Verilog-AMS has been stagnant for so long. With Verilog-AMS they had things like connect modules, which were really a hack, and nobody really likes them. Some of the vendors have their own way of trying to do this, but it’s not really standard. We wanted to try to bring that into IEEE 1800 as much as possible. By defining an Accellera standard for MSI, we’re making a clear addendum to 1800. We are not changing anything in 1800, but we are extending it so that, as an Accellera standard, the vendors can implement it. Then in 2027, or whenever we do the next SystemVerilog version of IEEE 1800, we’ll bring it in at that point. We tried to do it in 2023 and got to a proposal point, but didn’t really get all the kinks worked out. So we formed the group in Accellera to work it all out.”
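For context, a connect module’s job is to convert values that cross the discrete/continuous boundary. A toy one-directional converter in plain SystemVerilog, with an assumed supply value, gives the flavor of what SystemVerilog-MSI aims to standardize natively, including the harder bi-directional case:

```systemverilog
// Toy digital-to-real boundary converter in the spirit of Verilog-AMS
// connect modules. One direction only; making such crossings
// bi-directional and seamless is what MSI targets.
module d2r_converter (
  input  logic d,   // discrete side
  output real  v    // real-valued side
);
  parameter real VDD = 1.0;  // assumed supply, volts

  always_comb begin
    case (d)
      1'b1:    v = VDD;
      1'b0:    v = 0.0;
      default: v = VDD / 2.0;  // map X/Z to mid-rail for illustration
    endcase
  end
endmodule
```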
More help required
Will this be enough to improve mixed-signal productivity and help improve the quality levels? “We need many different models,” says Heinig. “For example, you have different questions you want to answer. You need a model for power simulation. You may need a different model for a thermal simulation. For behavior simulation you need different models of abstraction, and to build these models automatically is something that would be very, very nice — automatic generation of these different models. We would love to have a switch where you can tell the tool how to balance performance and abstraction. When a simulation runs too slow, make the model a little bit more abstract so it can run faster. ‘For my first simulation, I want a rough result, but make it run fast. I know that the accuracy is worse than before, but it’s fine for me at the beginning.’ Today, you have to build your own models, and this costs so much time and makes it so difficult.”
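The switch Heinig wishes for can be imagined as a single model with a parameter that trades accuracy for speed. A contrived sketch, with assumed target voltage and time constant, shows the idea:

```systemverilog
// Contrived two-abstraction model of a voltage regulator. FAST = 1
// settles instantly; FAST = 0 adds a first-order lag via a
// forward-Euler update. All values are assumptions for illustration.
module vreg_model #(parameter bit  FAST    = 1,
                    parameter real VTARGET = 1.8,    // target, volts
                    parameter real TAU_NS  = 100.0)  // lag constant, ns
                   (input real vin, output real vout);
  timeunit 1ns; timeprecision 1ps;

  if (FAST) begin : g_fast
    // Abstract: output tracks the ideal regulated value immediately.
    always_comb vout = (vin > VTARGET) ? VTARGET : vin;
  end else begin : g_slow
    // Detailed: discrete-time first-order lag, 1 ns step.
    always #1 vout += (((vin > VTARGET) ? VTARGET : vin) - vout) / TAU_NS;
  end
endmodule
```

Today, as Heinig notes, engineers must write and maintain each such abstraction by hand; generating them automatically is the missing piece.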
As with many new standards, tools, and models, change can be difficult. “On the analog side of things, we never approached it as a top-down problem,” says Balasubramanian. “They work on their own analog block, and then at some point, a model is given to the digital guys where they embed that and run it. This is going to change to a top-down methodology. You start by knowing the requirements for the analog portion in the context of that SoC much earlier. It then evolves in a much more systematic way with several iterations.”
Conclusion
In the past, it was perhaps acceptable for the analog and digital portions of a design to be developed and verified independently, with integration done in a black-box manner. That is no longer true. The interactions between the two domains are becoming much tighter. For development to happen in a more coordinated manner, languages, models, interfaces, methodologies, and tools all have to be updated.
Those first steps are happening. How the industry reacts remains to be seen.
Related Reading
Trouble Ahead For IC Verification
First-time silicon success rates are falling, and while survey numbers point to problem areas, they may be missing the biggest issues.
Improving Verification Methodologies
The verification problem space is outpacing the speed of the tools, placing an increasing burden on verification methodologies and automation improvements.
Improving Verification Performance
Verification tools are getting faster and capacity is increasing, but they still can’t keep up with the problem space. Verification is crossing more silos, requiring expanded skill sets.