One-To-Many: Shifting Left, Adding Gears

In this year’s DVCon Keynote, Aart DeGeus talked about the ways in which many technologies collaborate to bring about success and continue our exponential growth.


Aart DeGeus, chairman and co-CEO of Synopsys, launched into high gear for his keynote talk at this year’s Design and Verification Conference (DVCon). The gathering attracted a record number of attendees, and an estimated 350 people crammed into the room to listen to him talk about the shift left that is happening in the EDA industry.

DeGeus started his talk by asking the audience a question: “What happens in 46 days?” (Subtract a few for the publication of this article.) The answer is that Moore’s Law turns 50, and DeGeus called it the “biggest exponential in mankind by many zeros.” He expects it to continue for another 10 years. Even though there may be economic and technical changes, that will translate into another 10X in density, higher performance and lower power. This will push up into software, which will provide a 100X impact in what we can do. “Technology is the enabler, but economics is driving. The money will be available if the impact is there.” DeGeus pointed to some of the fields going through revolutions, such as automotive, financial, energy, wearables and industrial.

He provided a few interesting ways to get your head around the numbers. 1977 saw the first car to have an electronic Engine Control Unit (ECU). Since then, automotive electronics content has risen to the point where the equivalent of a complete 12-inch wafer is embedded in a car. In 2003, DNA was cracked for the first time, and today there are plans to sequence 1 million people, place them in a database and fuel research based on those results. How big is this task? The human genome has about 3.3 billion “lines of code”; Synopsys has about 400,000,000 — a little more than a mouse.

DeGeus conceded that while the finFET is driving things forward, 28nm will be the resting point for many, even as a lot of companies move to smaller nodes. “Lithography has been the field that has surprised us most in terms of finding solutions to impossible challenges.”

Aart DeGeus

With the stage set, DeGeus went on to talk about monotonic convergence, meaning decreasing chances of failure as the design flow progresses. “To optimize any design flow you need predictability, and that was not the case 25 years ago.” At that time, tools were independent and the design was passed from one stage to the next. “Everything is interdependent and that creates new kinds of complexity – systemic complexity, which implies understanding all of the dimensions at the same time. But we need to continue shifting left and to improve it by another 10X. The reality may not be that the schedule moves, but it enables the creation of more complexity. It is amazing that, with chips containing several billion devices, most are right the first time.”

DeGeus went on to describe essentially the same story four times. The first was about the design flow; the other three were about IP, verification and software. The second story centered on IP reuse, which he said has been the biggest productivity increase of the past 50 years. “This allows designers to concentrate their efforts. What is difficult is not necessarily differentiated.” A lot of the productivity comes when someone else works on the details, performs the verification, makes sure the blocks are viable in silicon, and handles the many other tasks associated with creating a catalog item. The challenge with IP is the interfaces.

The third story is about the verification shift left — the largest consumer of EDA in terms of hours used. In the early days, all we had was the logic simulator, but today there are a host of tools available. DeGeus said the goal here is to get to the best possible coverage in the shortest possible time. “The state space makes it impossible to close on it, and the best we can do in shift left is to add more tooling. You make the tools faster, expand their scope, add debug and coverage to handle the data.” DeGeus added that analog/mixed-signal tends to complicate things a little. He noted that thinking about each of the verification engines as a separate tool creates problems, and that we should think of them as a continuum. “Unified compile is necessary between simulation and emulation so that issues created by different results are eliminated. Unified debug enables faster closure of problems.” Openness in the debugger enables special cases to be added by users, and today more than 100 apps are available. The last of the verification engines that DeGeus talked about was the virtual prototype. This enables early software development and allows integration to begin before hardware is ready. “This is going to continue the drive towards software and we are trying to embrace that by looking at the entire space from silicon to software.”

And finally, the fourth story is about software. DeGeus asked if there are lessons we have learned in hardware that may apply to the software world. “The first lesson is that exponential complexity will eventually get you. The sheer numbers require a degree of sophistication that is unsustainable. Second, a single bad incident can have a huge impact. Every week we hear about technology having left some window open that enabled people to steal information or money. Security is a major issue, and when not managed well it jeopardizes the entire company. And third, disciplined sign-off drives predictable execution.” DeGeus pointed to the penalties in hardware: “This has led to systematic design and verification, and while not 100% foolproof, it is remarkably good at achieving quality. Software – you guys have a problem.” He talked about the patch mentality and said this will not work anymore. “Solving this requires increasing usage of formal techniques.” Today there are three software engineers to every one hardware engineer.

DeGeus talked about fuzzing in the software world and said this is so 1990s — been there and done that. All the way through his talk, the slides had been building up cogs for each tool or technology he mentioned. When he mentioned the IoT, the screen filled with lots of tiny whirring cogs that he said looked a little like Angry Birds. “Each of those IoT bits is a window into the software world.” He said the IoT may be a little overstated today and that the market grows by $1B every time someone mentions it. However, each of those devices represents a tiny crack. As we move to more software we also have to consider security at all levels. “Security defects tend to be found later in the process.” Hardware has been less susceptible, but it has to provide the services for software. DeGeus said that the big difference now is that there are activists whose goal is to break in. The hardware world has not seen this type of onslaught, although he pointed to some recent events that show this has been going on. “It is at the intersection of hardware and software, which is where super sophistication resides, that we need to be careful.”

To wrap things up, DeGeus returned to the 10X we expect from silicon and the 100X in opportunities it will create. “Smart everything will happen. Not long ago facial recognition was tough and expensive. Today it is not only possible, but by analyzing the 28 muscles in the face, you can do feeling recognition as well.” DeGeus said that the augmented brain is not too far in the future. He joked that he may be due for a haircut and a software upgrade.

“A gear is a highly optimized device that maximizes power transfer while minimizing friction and when you are dealing with things that have 12 zeros, you have to minimize the friction. Collaboration will become a core competence.”


nosnhojn says:

hey Brian. just wondering if this is a direct quote you have from the keynote re: shift left…

“The state space makes it impossible to close on it, and the best we can do in shift left is to add more tooling. You make the tools faster, expand their scope, add debug and coverage to handle the data.”

admittedly, I’m missing the context. but to me this reads that we’ve reached some theoretical limit on verification productivity and that the only way to increase the limit is to wait for/apply new/faster tools. effectively, increasing productivity is entirely an eda problem to be solved while users wait for the solution. is that it? or maybe this is me being too cynical re: shift left as an eda marketing ploy?


Brian Bailey says:

Hi Neil – yes, this was a quote, and you are right that it assumes we keep doing things the way they were done in the past. I believe that we still lack the right way to look at the problem, because there are many areas in which we waste time performing verification that cannot lead to product failure. Is this an EDA ploy? I don’t think so. I think it is just a lack of understanding of the problem, and some day someone a lot more intelligent than I am will be able to reformulate the problem.

nosnhojn says:

thanks brian. I hope this person you speak of comes along soon! we’re an industry that suffers badly from tunnel vision. if we keep waiting for the tools, I think the theoretical limit on productivity becomes a reality. or maybe it’s already reality.

