While analog and digital will continue to co-exist, the effort to combine those disciplines is losing steam.
EDA companies are stepping back after years of trying to force engineers to combine analog and digital disciplines. Rather than emphasizing mixed signal as a single expertise, they are building bridges and translation mechanisms between the two worlds.
The moves cap more than a decade of trying to find optimal ways to pack analog and digital together more elegantly. Problems began surfacing as early as 130nm, when many analog developers said there were no clear power and performance benefits to shrinking analog IP blocks. And at 90nm, chip designers began talking seriously about the impact of noise from digital components on the proper function of analog IP.
At the time, large analog companies said they were favoring a two-chip solution. But the introduction of the smart phone with its one-chip economies of scale pushed chipmakers in what many considered an unnatural direction, prompting EDA companies to create mixed-signal tools and to promote mixed-signal engineering as an efficient way to deal with these disciplines. Highly sophisticated techniques to overcome the limitations of analog at these smaller nodes by adding digital assist and calibration pushed the two technologies even closer together. Success, as might be expected, was spotty.
To make matters worse, the effort to shrink analog along with digital has become more difficult and less cost effective at each new process node. On top of that, the smart phone market is flattening, weakening the single biggest driver for combining everything on one chip.
That certainly doesn’t mean there will be any less emphasis on analog. In fact, as more sensors are needed for everything from smart cars to connected home appliances to embedded vision everywhere, analog’s importance will grow significantly. But with limited volumes and rising complexity, analog components will have to stand on their own. In many cases, that will mean using different process nodes rather than being developed as part of a single-node, mixed-signal solution.
“Historically, what has retarded the growth of analog is that we keep finding ways to do analog things with digital,” said Walden Rhines, chairman and CEO of Mentor Graphics. “If you look at cell phones and RF, the internal layer is analog but the baseband is digital, so analog disappears. You have A-to-D converters on digital chips. But going forward, analog applications will not be so easily squeezed into digital.”
Analog always has been something of a specialty business, defined by a large number of unique parts, many carrying higher margins than their digital counterparts, where such counterparts even exist. Rhines said the analog market will only grow, but analog itself is becoming more complex as well, which makes mixed-signal design even tougher. "Integrating analog into a digital chip is getting harder."
This marks a significant change in direction, and it's one that is gaining acceptance across all parts of the EDA industry. In his keynote speech at the Synopsys User Group, Aart de Geus, chairman and co-CEO of Synopsys, took the covers off a new bridge technology for custom layout that he referred to as "visually assisted automation."
“We can assist the human rather than trying to replace them,” de Geus said. “We think this is a technology breakthrough.”
This is, in effect, automation within domain expertise rather than trying to expand that expertise across different domains. Cadence has taken a similar tack. “About 90% of our customers do mixed signal chips, but they don’t have mixed signal engineers,” said Steve Lewis, group marketing director at Cadence. “They each send to the other group what they need to know. About 10% actually blend those domains. What we’ve done is to create a simple bridge for what kinds of information can be automated so they can exchange information more freely. A small portion of the population will continue to be mixed signal, but the majority are separate. We’re not asking analog or digital engineers to do anything out of the ordinary from what they already do.”
That may come as a relief to many analog and digital engineers, because for the most part they still have very little in common. In fact, their entire approach to design is different.
“Analog and mixed-signal design is black magic to the layman digital designer,” said Anupam Bakshi, CEO of Agnisys. “As long as the interface between analog and digital designers is crisp and crystal clear, one side doesn’t need to know too much about the other. We see this all the time. For digital registers that control the analog functionality, a formal way to describe the interaction avoids a plethora of problems.”
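One way to make that interface "crisp and crystal clear" is to describe the control registers formally rather than in a spreadsheet or email thread. The sketch below is purely illustrative (the register, field names, and bit layout are invented, not taken from any real design), but it shows the idea: once the digital-to-analog control interface is captured as data, both teams can generate documentation, RTL stubs, and checks from the same single source.

```python
# Hypothetical example: a formally described digital control register for an
# analog block. All names and the field layout are invented for illustration.
from dataclasses import dataclass


@dataclass(frozen=True)
class Field:
    name: str
    lsb: int    # least-significant bit position within the register
    width: int  # field width in bits


# Control register for an imaginary programmable-gain amplifier (PGA).
PGA_CTRL = [
    Field("enable", lsb=0, width=1),     # power up the analog block
    Field("gain_sel", lsb=1, width=3),   # one of 8 gain settings
    Field("bias_trim", lsb=4, width=4),  # calibration trim code
]


def pack(fields, values):
    """Pack named field values into a single register word."""
    word = 0
    for f in fields:
        v = values.get(f.name, 0)
        if v >> f.width:
            raise ValueError(f"{f.name} value {v} exceeds {f.width} bits")
        word |= v << f.lsb
    return word


def unpack(fields, word):
    """Decode a register word back into named field values."""
    return {f.name: (word >> f.lsb) & ((1 << f.width) - 1) for f in fields}
```

Because the layout lives in one place, a digital designer writing the word and an analog designer reading the trim code are guaranteed to agree on where each bit sits, which is exactly the class of miscommunication a formal interface description is meant to eliminate.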
One of the big changes is on the packaging and board side. As it becomes tougher to put everything on the same die, both from a manufacturability and an economic perspective, a number of options are being re-examined. Those include new packaging approaches such as 2.5D (silicon interposer), 2.1D (organic interposer), and various levels of fan-outs, where different components that are on a board are included in the package.
From a system-level design perspective, all of these approaches make it easier to isolate noise and other physical effects such as heat. The downside, at least for the moment, is that these packaging approaches are more expensive. That is unlikely to continue as proven manufacturability and economies of scale kick in, but at the moment it is still less expensive to create a PCB-based solution than one using advanced packaging.
Yet one of the biggest benefits of advanced packaging still hasn’t been widely adopted. The initial promise of 2.5/2.1D was that chipmakers could use IP developed at whatever node made sense. For analog, that might be as much as 350nm, and very rarely less than 40nm. Pushing analog down to 16/14nm or beyond is viewed by most chipmakers as potentially less expensive from a manufacturing perspective, but there are no clear benefits in terms of power or performance for the analog circuitry.
Mixed signal at the system level
From a system-level perspective, much of this still looks the same. The individual pieces are more clearly defined, but the system functionality is roughly the same. Analog still needs to fit within the system power budget, regardless of whether it is developed separately or in conjunction with digital. Signals still need to flow in multiple directions.
“What’s new is that everything has clearly defined interfaces,” said Kurt Shuler, vice president of marketing at Arteris. “So if you look at integrated baseband chips, they still mix analog/RF and digital on a single chip. But when everything gets packaged together, they keep those defined interfaces. Otherwise it’s like mixing oil and water together. You end up with mayonnaise, which is not a good thing. If there were no calendar and you could optimize it all for one chip, you would probably come up with a great mixed signal solution. But the reality is that the clock is ticking.”
While sensors and power are analog, how they are connected together is still digital. Sundari Mitra, CEO of NetSpeed Systems, noted that network on chip technology is purely digital. That in turn is hooked up to a controller, which connects to the analog physical layer, or PHY.
"It certainly would help if people know mixed signal for digitizing some of the front-end pieces," Mitra said. "So if you have a SerDes, and DSP cores are able to do the heavy lifting and processing, you don't need heavy analog circuits. That's where a strong mixed signal background would help. It also would help with integration."
Testing one, two, three
The fundamentals of test don't change with the increase in analog content, but test does become more difficult.
"The next wave that's coming is envelope tracking, which will affect how a design works for power optimization," said George Zafiropoulos, vice president of solutions marketing for National Instruments. "With IoT, there is much more of a requirement that everything is integrated. More of this used to be a microcontroller or a sensor as a part, or a radio as a part, but all of that ultimately will go on one die. So if you had three to five parts, that may go to one or two parts as volumes go up and power consumption needs to go down."
He said that creates new problems because the microcontroller until now has been tested in the digital domain, or tested as part of a mixed signal design without RF. “If you have RF built in, you now have to deal with ADCs, DACs and RF blocks, even in chips that are modest in gate complexity.”
NetSpeed’s Mitra pointed to similar challenges in the automotive market, where the focus is on integrating analog and digital into one SoC. That raises issues about noise and capacitance. “The other problem is that if you have an analog-induced failure, it’s very hard to reproduce.”
There are other problems, as well. Mixed signal engineering approaches problems more holistically. Separating the analog and digital often means that development proceeds on individual schedules, and because tooling on the analog side is more ad hoc, the integration is sometimes delayed until the last minute. That makes optimization for power, for example, much more difficult.
"If you have 10 blocks, how do you show each block met its target?" said Cadence's Lewis. "If one is over and one is under the target, can you shift that power budget?"
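The budget-closure question Lewis raises is ultimately simple arithmetic, as the following sketch shows (the block names and power numbers here are invented for illustration, not drawn from any real design): each block's slack against its allocation is computed, and the chip-level budget still closes as long as the surplus from under-budget blocks covers the overruns.

```python
# Illustrative sketch of per-block power budget closure.
# Block names and milliwatt figures are hypothetical.
def budget_report(budgets_mw, measured_mw):
    """Return per-block slack (positive = under budget) and whether
    the total chip-level budget still closes after reallocation."""
    slack = {b: budgets_mw[b] - measured_mw[b] for b in budgets_mw}
    total_slack = sum(slack.values())
    return slack, total_slack >= 0


budgets = {"adc": 12.0, "pll": 8.0, "digital_core": 40.0}
measured = {"adc": 13.5, "pll": 6.0, "digital_core": 38.0}

slack, closes = budget_report(budgets, measured)
# The ADC is 1.5 mW over its allocation, but the PLL and digital core
# are each 2.0 mW under, so the overall budget still closes.
```

The hard part in practice is not the arithmetic but getting trustworthy measured numbers from both the analog and digital teams on the same schedule, which is where the separate-domains approach strains.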
Lewis noted these worlds will cross in other ways in the future, as well, including areas such as safety and security. No one is certain whether analog or digital is more secure at this point, or what the impact will be of separating these two worlds. And if they function in separate domains, will that be significantly less efficient from an energy standpoint?
As these two disciplines increasingly are allowed to proceed independently, that may depend on the effectiveness of the bridge technology to provide that kind of data, and the motivation of the analog and digital design teams to update and trade that kind of information. But the impetus to unify these two worlds from the inside clearly has diminished—at least for now.