Combining both in a mixed-signal design brings challenges in a different realm. Expertise is the key to success.
How to bridge analog and digital is getting renewed attention as the amount of analog content that needs to be processed balloons with consumer and industrial IoT applications.
Solving that problem isn’t going to be easy, though. To begin with, digital designers view designs in terms of voltages. Analog designers, in contrast, look at currents.
“Unless you can analyze an analog circuit and understand those electrons and the motion of the currents, it’s difficult to get it right,” said Sundari Mitra, CEO of NetSpeed Systems and an analog engineer by training. “More for analog than for digital, a CAD tool will help create bad designs if you don’t have your fundamentals clear.”
Even with tools that address both markets, this combination isn’t always clear. In mixed signal simulators, for example, digital timescales are long, but most analog timescales are short, except in cases where there is an extreme abstraction. Yet at that point there is a question of what is really being simulated, said Jeff Miller, product marketing manager for electronic design systems in Mentor Graphics’ Deep Submicron Division.
“You need these different levels of abstraction,” Miller said. “The mixed signal simulators can help with basics and making sure that a certain interface is correct. But there are definite limits on that. You’re not going to simulate your PLL to generate your clocks.”
This is no small task. “That ability to abstract, and to recognize the correct level of abstraction — that is the calling card in mixed signal design,” noted Fred Martin, vice president of radio technology in ARM’s Wireless Connectivity business unit. “The guy that’s good at that is master of his craft.”
For these reasons, mixed signal is viewed as a spectrum of capabilities, not a single tool where designer expertise plays a big role. “You start off at the high abstraction level, looking at maybe connectivity and functionality problems, and then you move down to timing and power, which are more and more complex,” explained Geoffrey Ying, director of AMS product marketing for Synopsys’ Design Group. “As you progress down, the precision level requirement is higher, so you need to run more in the transistor domain. Therefore, simulations run slowly. PLL is a typical headache for a lot of people. There are a number of techniques that engineers have used to find the right balance between run-time and the precision requirement.”
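One common technique for striking the run-time/precision balance Ying describes is to replace a transistor-level PLL with a purely behavioral clock source once the loop itself has been verified in isolation. The sketch below illustrates the idea in Python rather than an HDL; the function name and parameters are hypothetical, and a real flow would express this as a behavioral model in the simulator’s own language.

```python
import random

def pll_edges(ref_period_ns: float, mult: int, n_edges: int,
              jitter_ns: float = 0.0):
    """Yield rising-edge times of a PLL's multiplied output clock.

    The loop dynamics (VCO, charge pump, loop filter) are abstracted
    away; only the locked behavior -- a clock at fref * mult, with
    optional Gaussian jitter -- is modeled. This is orders of magnitude
    faster than solving the loop at the transistor level.
    """
    out_period = ref_period_ns / mult
    t = 0.0
    for _ in range(n_edges):
        t += out_period + random.gauss(0.0, jitter_ns)
        yield t

# A 10ns (100MHz) reference multiplied by 8 gives 1.25ns output edges.
edges = list(pll_edges(ref_period_ns=10.0, mult=8, n_edges=4))
```

The precision knob here is explicit: setting `jitter_ns` to zero gives an ideal clock for fast functional runs, while a measured jitter figure from a transistor-level characterization run restores some electrical realism when timing matters.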
Finding that balance takes years of experience. “When I first got into EDA, someone told me that tools make a good designer better, but a bad designer worse,” Ying said. “And it’s absolutely true. It very much depends on the expertise of the user, and unfortunately we don’t have a pushbutton solution to address this.”
Mladen Nizic, engineering director for mixed-signal solutions at Cadence, agreed: “When it comes to mixed-signal verification, people say, ‘I’ve got to verify this, I’ve got a limited amount of resources, and a fixed amount of time. If I run traditional low-level simulations, I will never finish.’ So the verification engineers have to develop the right strategy and plan what and how to do this to maximize usage of resources, and coverage. They have to have a good understanding of what has been covered, and what’s left. They have to rely on a whole range of tool capabilities, and be smart enough to know what to use, and when. That’s the trickiest point. Can I trust the model? Is my abstraction high enough? Am I missing something? How are the electrical effects? These all require skills, and it is encouraging now that teams are requiring these skills.”
That wasn’t always the case. “Often, with no disrespect to us middle-aged guys, younger analog engineers are much more versatile,” Nizic said. “They know how to code, and are not resistant to change like some other guys are. In analog, a lot of that is about change in regard to where the visibility lies for some of these tasks.”
One of the big issues involves methodology. What is the standard best practice?
“In the digital world, the advance of functional verification technology was propelled by formalized methodologies,” said George Zafiropoulos, vice president for solutions marketing for National Instruments’ AWR Group. “When there were standards in language, and standards in methods, then it wasn’t just the bleeding-edge customers that would adopt it. It became mainstream, and so today everybody uses fairly sophisticated functional verification methodologies. But that practice doesn’t exist in analog, so it’s really experts figuring it out, then doing it again. But the next guy, how does he know? So maybe there’s an opportunity for sorting out what the best practices would be, and then figuring out how to normalize that across the industry to bring that down to the typical engineer who may not be lucky enough to figure it out for themselves.”
This may be easier said than done, however. A very-high-frequency SerDes (28Gbps or 56Gbps) probably takes only 1% or 2% of a die’s area, but can be the cause of a silicon failure as much as 50% of the time, said NetSpeed’s Mitra. “If you look at mixed-signal analog components on a piece of silicon, they occupy 10% to 15%. But if you go to TSMC and ask them how many times they find digital bugs versus analog bugs—or worse still, the mixed signal where you are mixing the two of them—the majority of the silicon bugs are mixed signal, which is the interface between the two of them.”
When that happens, engineers spend more time in the lab than anywhere else.
“Everything needs to change,” she asserted. “Silicon turns are so expensive. If you ask any company how many test silicon runs they do for digital design versus analog design, the answer is so obvious. We are still learning analog design by experimenting on silicon and reading the results. We still do not have what it takes to actually get it done correctly, and be confident that you have it done correctly. We rely on experts. Why? Because it’s not possible to take their expertise and translate it into something that is a methodology or a tool, for example. So we rely on that. But at the end of it, it’s physics. Someday, my hope for the mixed signal community is that there are going to be so many silicon faults that someone is going to wake up saying we need to invest in actually fixing this problem—and fixing it from a design-centric/physics-centric point of view, rather than ease-of-use/get it done faster. We use EDA tools to do it faster, yes, but we need to be able to do it better.”
ARM’s Martin compared this to playing golf—you never master it but you keep playing. “Some analog guys have that mentality. The larger concept that the knowledge isn’t in the tools, it’s in the expert, is hard to change in analog. So you are seeing a trend in moving toward more standardization of analog in larger blocks. People have tried to make a living off of small analog blocks for a long time. That’s a tough way to go because they are never the same. If you go up to that next level, instead of making a converter, make the whole analog subsystem for a Bluetooth radio or a SerDes or something. Have the expert do that, and have people buy the whole block. That doesn’t completely solve the problem, but it provides the expert with enough designs to keep him busy. It keeps that knowledge concentrated. That may be a partial solution to the problem. The larger problem of turning analog into digital—there is a physics limitation there. In digital, you have a finite number of states. In analog you’ve got an infinite number of states, so it’s a lot harder to fully characterize an analog circuit.”
Holding things back is the inability to know, before something tapes out, whether it’s going to work, said Mentor’s Miller. “There’s an awful lot of test chips and things like that going on. The fundamental problem there is that we don’t know if it’s going to work or not before we tape it out, and that drives a lot of this black magic of analog where the experts can look at it and figure it out. Sometimes two experts look at the same chip and don’t agree if it’s going to work or not. You have to choose which expert you’re going to trust. But if we focused on that problem from an EDA perspective, and just got it to the point where we had the physics nailed in terms of the modeling and validation of the IP, then you could open yourself up to a lot more approaches. It would be very difficult to get people to accept counterintuitive approaches to design, or even just getting people to agree or disagree on whether a layout is going to work. If you could prove it was going to work without actually going to silicon and spending all that money, that could open up the methodology space a little bit.”
Zafiropoulos recalled this happened in the semiconductor industry with digital synthesis. At first, every customer in the world said there’s no way the tool could do a better job than they could, and that was true to some extent. But the tool could do it faster, and eventually, better. “The trick was the predictability with solid results, but also coming up with a topology they never would have come up with on their own. With more robust tools that are smarter than the designer, the designer can tune them and drive them. That just hasn’t been done as much in analog.”
Others agree. “We are still dreaming about this one,” Ying said. “We’ve seen almost the opposite trend. Digital synthesis takes high-level RTL and realizes the circuit. You can’t do that in analog, but we still have to verify the chip at a certain level, so there is adequate throughput to allow coverage, for example. There is a drive to do almost reverse synthesis—take a transistor-level block and abstract it to some behavioral modeling language.”
Looking back even five years, that was already common practice. Ying said he is encouraged to see that SystemVerilog real number modeling has in large part taken the place of Verilog-AMS, as it gives designers a way to perform some sort of coverage at the top level, where digital and analog come together.
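The abstraction at the heart of real number modeling is that an analog block becomes a function over real-valued samples, which an event-driven digital testbench can exercise at full speed. A toy sketch of the idea, in Python rather than SystemVerilog (the function name and parameters here are illustrative, not from any actual flow):

```python
def adc_behavioral(vin: float, vref: float = 1.0, bits: int = 8) -> int:
    """Ideal N-bit ADC: quantize a real-valued input to a digital code.

    A transistor-level converter would need SPICE; this real-valued
    abstraction runs in an event-driven simulation alongside the
    digital logic, which is the premise of real number modeling.
    """
    lsb = vref / (1 << bits)
    code = int(vin / lsb)
    # Clamp to the representable range, as a real converter saturates.
    return max(0, min((1 << bits) - 1, code))

# Top-level "mixed-signal" stimulus: sweep real-valued inputs through
# the abstracted analog block and collect the digital codes -- the
# kind of coverage run the transistor netlist is far too slow for.
samples = [adc_behavioral(v / 100) for v in range(0, 100, 25)]
```

What such a model deliberately omits (noise, nonlinearity, settling) is exactly the trade Nizic’s questions probe: Can I trust the model? Is my abstraction high enough?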
Separate from the technology, mixed-signal verification suffers from organizational problems, Ying said. “You have large groups of digital and analog within a top-five semi company, but they have a very small mixed signal verification team so their hands are really tied. They can’t touch the analog schematic. They can’t touch the digital. They are sitting in the middle. They are in the critical path before tapeout. I know they go through a lot of pain for each tapeout, but for such a large company with thousands of designers, they have such a small group chartered to do this verification at the end. It feels like an organizational problem.”
While counterintuitive from the outside, Mitra said the mindset remains, “‘It’s such a small piece of the die, you need that much headcount for that?’ Wait for silicon to come back and you’ll figure out where the headcount is going. Organizationally, I do believe there is a challenge. The projects I ran were slightly different because I come from a background of mixed-signal/analog, so when I ran my processor teams, I made sure that the verification component of the mixed signal piece was high because I valued it. But I had to fight tooth and nail with my management every time to get the resources. It wasn’t as if it was just given to me. I told them, ‘Trust me, when silicon comes back, there’s going to be either a memory issue or a clocking issue or a power or a noise issue,’ and those are the hardest ones to replicate on silicon because they are not deterministic failures. With a digital problem, you take it apart and it tells you there it is. But when you have an analog problem, you put it on your tester, you see a frequency hole—sporadically—and then you realize it is not digital, it’s a continuous system.”
Another challenge is that the number of people who can solve these kinds of problems is very small, Martin said. “You need tremendous breadth. You need enough experience to have seen a lot of it. That’s another reason why the teams are small. If you put 10 more guys on there it doesn’t help. You need those rare individuals who have both the scope and the experience to work those problems, and they are real hard to find.”
Optimism in a new generation
Understandably, EDA vendors are in an awkward position because they can come out with something innovative, and it won’t be adopted because it changes the methodology. As such, the methodology is stuck on both sides, Miller added. “We can come up with something crazy that might do the job in a very non-traditional way, and it won’t sell. A collaborative approach is needed between the design and EDA community to bring new tools into existence, and then get them adopted into actual design practice. There’s some hope that a new generation will come in that is a little more comfortable with things like capturing constraints, and other formal methodology kind of things that involve something other than what’s always been done.”
At the end of the day, this is an exciting time for analog/mixed-signal design, given the many opportunities for engineering teams to innovate in leading-edge markets, consumer and industrial IoT among them. Increasingly, the analog/mixed-signal portion of the design is the make-or-break point.