Filling In The Gaps For Mixed-Signal Verification

Experts at the table, part two: What motivates analog engineers; the need for a mixed-signal verification engineer; is analog standardization possible?


Semiconductor Engineering sat down to discuss mixed-signal verification with Haiko Morgenstern, Mixed-Signal Verification Group Staff Engineer at Infineon; Dr. Gernot Koch, CAD Manager at Micronas; Pierluigi Daglio, AMS Design Verification Flows Manager at STMicroelectronics; and Helene Thibieroz, AMS marketing manager at Synopsys. What follows are excerpts of that discussion. For part one, click here.

SE: What motivates the analog designer today?

Koch: The way this works now, at least in our place, is that you have an implementation that's already there, and you start modeling it. There is no interest in that. If you get them more into top-down design, so that they start with a model and refine that model until they get to an implementation, then by definition you have a model, because that's what they started with. Maybe that would even benefit their design work, because it lets them find out early on whether what they intend to implement is actually what is needed and whether it will work with the rest of the environment.

Morgenstern: It has to be clearly defined what the model has to reflect, and for that a lot of system knowledge is needed, because someone has to say that in Block A this effect has to be modeled, and in Block C that effect. Who will create this model? The digital guy or the analog guy has to write the code. And then they are doing behavioral model verification, comparing the model to the spec and checking whether the model is correct and behaves the way it should.
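(As a concrete illustration of what such a behavioral model can look like, consider the sketch below. It is not taken from any of the companies at the table: it uses real-number modeling in SystemVerilog, one common way to make an analog block usable in a digital simulator, and every name and parameter value in it is an assumption chosen for illustration. The block, here called amp_rnm, captures only one effect of a hypothetical single-pole amplifier, its bandwidth limit, which is exactly the kind of per-block modeling decision the speakers are describing.)

```systemverilog
// Hypothetical real-number model (RNM) of a single-pole amplifier.
// All names and parameter values are illustrative assumptions.
module amp_rnm #(
  parameter real GAIN  = 100.0,   // DC gain from the (assumed) spec
  parameter real FP_HZ = 1.0e6,   // pole frequency to be modeled (assumed)
  parameter real TS_S  = 1.0e-9   // model update period; should match the testbench clock
) (
  input  logic clk,               // sampling clock supplied by the testbench
  input  real  vin,               // input voltage as a real value
  output real  vout               // modeled output voltage
);
  // Forward-Euler step for a first-order low-pass response:
  // vout settles toward GAIN*vin with time constant 1/(2*pi*FP_HZ).
  // Valid while ALPHA << 1, i.e. the update period is much shorter
  // than the time constant being modeled.
  localparam real ALPHA = 2.0 * 3.14159265358979 * FP_HZ * TS_S;

  always_ff @(posedge clk)
    vout <= vout + ALPHA * (GAIN * vin - vout);
endmodule
```

Refining such a model toward the implementation then means adding effects (offset, clipping, saturation) behind the same interface, so the checks against the spec do not have to change.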

Daglio: For sure, modeling becomes more and more strategic, because you have to take coverage into account in the verification. If you want to put together the sort of digital-like, metric-driven verification that people do in digital today, extended to mixed-signal design with randomly generated stimulus, you need a setup where the simulation is very fast, because if you have to run a hundred simulations, each lasting four or five hours or a day, it's not possible. If you want to increase the coverage, because you want your design to be safer and verified in a good way, you need to speed up the simulation. To do that, you have to reduce the million-transistor, transistor-level part and use another representation for it. For sure, in the future modeling will become more strategic than it is now, and it will also have to be used correctly, deciding how precise a model I need for metric-driven, coverage-based verification. The tradeoff is to find a good level of accuracy without slowing down the simulation, so that you have fast simulation with the needed level of accuracy and can put together a verification system that applies digital-like verification techniques to mixed-signal verification.
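(The metric-driven setup being described can be sketched in a few dozen lines once a fast behavioral model exists. The fragment below is a hypothetical example only: it reuses the amp_rnm model sketched above, drives it with constrained-random input levels, and collects coarse functional coverage of the input range. The class name, value ranges, bin boundaries, and iteration count are all assumptions; a production environment would add self-checking against the spec and much richer coverage.)

```systemverilog
// Minimal sketch of constrained-random stimulus plus functional coverage
// applied to the hypothetical amp_rnm behavioral model sketched earlier.
`timescale 1ns/1ps

class amp_stimulus;
  rand int vin_mv;                                    // input level in millivolts
  constraint c_range { vin_mv inside {[-500:500]}; }  // assumed legal input range
  function real vin_volts();
    return vin_mv / 1000.0;
  endfunction
endclass

module tb_amp;
  logic clk = 1'b0;
  real  vin = 0.0, vout;
  int   vin_mv_q;                          // last randomized level, kept for coverage

  always #0.5 clk = ~clk;                  // 1 ns sampling period, matching TS_S of the model

  amp_rnm dut (.clk(clk), .vin(vin), .vout(vout));

  amp_stimulus stim = new();

  covergroup cg_vin;                       // coarse coverage of the input range
    coverpoint vin_mv_q {
      bins low  = {[-500:-101]};
      bins mid  = {[-100:100]};
      bins high = {[101:500]};
    }
  endgroup
  cg_vin cg = new();

  initial begin
    repeat (1000) begin
      void'(stim.randomize());             // new random input level
      vin_mv_q = stim.vin_mv;
      vin      = stim.vin_volts();
      repeat (200) @(posedge clk);         // let the model settle
      cg.sample();
      // A real environment would check vout against the spec here.
    end
    $display("vin coverage = %0.1f%%", cg.get_inst_coverage());
    $finish;
  end
endmodule
```

Because the stimulus loop hits a behavioral model rather than a SPICE netlist, runs like this can be repeated thousands of times, which is what makes the coverage argument in the quote practical.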

SE: What is unique to mixed-signal verification that we have to consider?

Koch: It is much more complex to get it to work.

Thibieroz: I think it's complexity by itself, because you're dealing with digital and then you're dealing with the analog side, which is time-driven, an entirely different system that is equation-based. In addition to that you have the complexity of the tools, but you also have the complexity that you're involving people from different organizations. Pierluigi made an excellent point: you have only a few people who have knowledge of both analog and digital, who have the right skills for that. So when you're dealing with mixed-signal design, analog engineers have to understand digital verification and digital simulators, and vice versa. It is a very complex puzzle to solve. Let's say you need to generate a behavioral model. You cannot come up with the requirements on your own: you need to understand the digital requirements, the amount of SPICE accuracy you want, how you want to model it, the amount of precision needed. This combined set of skills, being difficult to find, makes it even more complex for a company to have a mixed-signal flow.

SE: Is it incumbent on the engineering teams to train their people or do the universities need to do a better job?

Morgenstern: At university, a lot of languages are taught, maybe in Europe more VHDL than Verilog, although I think that's changing now. A lot of legacy code is in VHDL, and the tools do not support VHDL as well as Verilog, so we have a problem when IP is delivered from another company. There are people on the development team who are open-minded, who are learning different languages (SystemVerilog, for example), and then you have people who are set in their ways: that's my language, I have used it for 10 years. The management has to be involved to motivate those people to become more open-minded, not only about languages but also about methodologies.

SE: Aren't the concepts of analog fundamentally different from those of digital? Is that the issue?

Thibieroz: That’s what makes it so interesting.

Koch: You need to have a special role; you need a mixed-signal verification engineer. But I do think training that person is possible, because the knowledge is in house. It's just not in one head, so they need to have the time to get it. An analog background is probably best, because that is more difficult for a digital engineer to learn than vice versa. With time and interaction with the right people, you can set up that kind of internal training and get them to learn these things.

Daglio: In digital, everything is standardized. In a certain sense you have everything very well defined: the flow is always the same; you can change a tool, you can change a vendor, but the steps are very well defined. You use certain formats to write specifications, you use certain languages, you use the LEF/DEF formats to exchange data. Everything is standardized. In analog, it's still at the SPICE netlist level, and there is no standardization. Different simulators have different formats for the SPICE netlists. The model of a transistor in one simulator is different from the model of the same transistor in another simulator. Analog is missing standardization. In the future, the EDA vendors and semiconductor companies should find a way to improve the standardization of analog design.

Koch: Is that really important to the designer, though? The designer doesn't write SPICE netlists.

Daglio: If I have a netlist for one vendor's simulator and I give it to you, you are not able to simulate it with another vendor's simulator.

Thibieroz: Analog is never going to be as established as digital. Take analog/RF and the concept of zero and one: in digital, you can standardize around it; in analog, that concept is somewhere in between. I think that's where you get more complexity, because the job of an analog designer is extremely complex.

Daglio: For sure it will not be possible to standardize everything. Verilog-A, for example, is almost standardized. Basically, if I write a model in Verilog-A, the SPICE and Fast SPICE simulators of all the vendors are almost able to read it and work with it, unless you do something special; generally, 95% of it is independent of the simulator. The models of the devices, on the other hand, are strongly dependent on the simulator: if I describe a transistor in a model for the simulator from one company, it probably does not work in the simulator of another company, and you have to do something. That is a part that could be standardized more. There is another part related to what we call analog design intent. You can, for example, put properties on a transistor in the schematic, and these should go across the design flow: from the schematic and the simulation, they should automatically be taken into account by the layout and routing tools. These are analog properties that you can assign to a component and that are understood by all the tools in the design flow. My opinion is that this can be done.

Koch: A lot of things like that are done inside one tool, and another tool does it differently — there’s no commonality there — it’s not portable.

SE: Is it because it’s so specific to the design?

Koch: I don’t think so. That would of course require Synopsys and Cadence to get together and decide how they want to do this.

Daglio: For example, a lot of times one group develops a block, and then another group that needs the same block develops it again, and again. Maybe there should be a template that can be common for 95% of what the tools need, and the team just trims some properties.

Thibieroz: My experience is that analog engineers have a tendency to think that their approach is best. So establishing standards in the analog community seems to be way trickier than in the digital community, based on my observation.

Koch: 25 years ago, digital designers were saying the same thing. They didn’t trust anything automatic, they wanted to control their gates themselves. There is something like that in the analog community — it’s like an art — you can’t do automation there.


