Will History Repeat Itself?

Is high-level synthesis really a disruptive technology? The jury is out.

Hands up: how many people have read Clayton Christensen's books, such as The Innovator's Dilemma? They were talked about endlessly in the corridors of the EDA companies when they first came out. Everyone wanted to identify the next disruption and could find reasons why almost every new tool was going to be disruptive.

For people not familiar with his work, his main premise was that every once in a while a disruptive innovation comes along that has a huge impact on an industry. The incumbent companies are aware of it, but they do not go after it because their existing business model will not allow it. It probably does not provide enough profitability, and it would pull scarce resources away from the competitive battles they face in their core business.

Startups have a different set of business equations and can go after the disruptions. While the new market may not be highly profitable at first, these businesses structure themselves for it, for the new supply chain and for the new customers looking to consume the disruptive technology. By the time the incumbents are losing market share to the disruptive innovation, it is too late for them to respond effectively. A new leader is born.

The last EDA disruption
The EDA world has seen a few disruptive innovations, but none quite as dramatic as the migration from gate-level design to the register transfer level (RTL). With that disruption, the new startup company — Synopsys — went from being little more than a research project to the largest EDA company. It displaced the gate-level incumbents, including Daisy and Valid, and almost took Mentor Graphics out, as well.

Since that time, every EDA startup has wanted to become the next Synopsys. They wanted to find the next disruptive innovation that would make them the new No. 1 EDA company. Around the turn of the millennium, the industry thought that it had found it in high-level synthesis (HLS). Just as RTL synthesis had caused a complete retooling, so too would a migration to the Electronic System Level (ESL), and whoever had the successful synthesis tool would own the market, just as Synopsys had done two decades earlier. Cue the record-scratch as we realize that was not the case, and that this may have been a “false” disruption.

Defining high-level synthesis
Before we get too deep into the controversy, it might be helpful to define what is meant by high-level synthesis and to review some of the early developments. Shawn McCloud, vice president of marketing at Calypto Design Systems, puts it this way: “The first HLS tools were far different from today’s incarnation. Remember Synopsys Behavioral Compiler, C Level, and Celoxica? These tools used a cycle-accurate C model that was nothing more than a glorified RTL abstraction. There was nothing ‘high-level’ about these tools, and they provided virtually no benefit over RTL, so ultimately they died a slow, painful death. Unfortunately, these tools also tarnished HLS for years to come.”

Jack Erickson, product marketing director for the System & Verification Group at Cadence, agrees. “The initial hype came too early, so we experienced a long trough of disillusionment.” Adds Brett Cline, vice president of sales and marketing at Forte Design Systems: “Various flavors of behavioral Verilog, VHDL and even C and C++ were tried and most failed. A few C-based tools continue to limp along but have major constraints for designs of any complexity and generally require proprietary extensions to the language to handle necessary hardware details.”

The input to HLS tools is an untimed or partially timed description of functionality. The tool, with varying levels of assistance from the user, transforms that into an RTL description that has been optimized for power, performance or area. This is usually achieved by adding pipelining, unrolling or merging loops, defining memory architectures and applying other transformations. By the end of the process, the tool has decided which operations will be performed within each clock cycle and how operators can be re-used.
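
To make this concrete, below is a minimal sketch of the kind of untimed C++ a designer might hand to an HLS tool, along with illustrative directives for the loop transformations just described. The directive names here are hypothetical; each tool has its own spelling for them, whether as source pragmas, Tcl constraints or GUI settings.

    // Untimed description of an 8-tap FIR filter. Nothing in this code
    // says what happens in which clock cycle; the HLS tool decides that.
    const int TAPS = 8;

    int fir(const int coeff[TAPS], const int sample[TAPS]) {
        int acc = 0;
        // HLS_PIPELINE(II=1)    (hypothetical) accept a new input every cycle
        // HLS_UNROLL(factor=4)  (hypothetical) instantiate four multipliers
        for (int i = 0; i < TAPS; i++) {
            acc += coeff[i] * sample[i];  // the tool schedules these
        }                                 // multiply-accumulates across
        return acc;                       // cycles and shares operators
    }

Depending on the directives chosen, the same source can become a small, slow serial implementation or a large, fast parallel one, which is exactly the power/performance/area trade-off the tools are meant to explore.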

The language debate
Today, most tools use SystemC as their input language. “SystemC is a C++ class library that allows any C++ compiler to handle it as-is,” says Cline. “It adds capabilities to C++ necessary for detailed hardware descriptions, such as bit accuracy, timing/clock accuracy, concurrency and hierarchy. And, during the last 10 years it’s become a standard, IEEE-1666.”
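
A minimal sketch of what those capabilities look like in practice, assuming the standard SystemC library is installed: sc_uint gives bit accuracy, the clocked SC_CTHREAD process gives cycle accuracy and concurrency, and SC_MODULE provides hierarchy.

    #include <systemc.h>

    // Bit accuracy: sc_uint<12> wraps at 12 bits, exactly like hardware.
    // Concurrency and clock accuracy: SC_CTHREAD runs as a parallel
    // process that advances by one clock edge at each wait().
    SC_MODULE(Accumulator) {
        sc_in_clk            clk;
        sc_in<bool>          rst;
        sc_in<sc_uint<8> >   din;
        sc_out<sc_uint<12> > sum;

        void run() {
            sc_uint<12> acc = 0;   // reset state
            sum.write(acc);
            wait();
            while (true) {
                acc += din.read(); // bit-accurate 12-bit arithmetic
                sum.write(acc);
                wait();            // one loop iteration per clock cycle
            }
        }

        SC_CTOR(Accumulator) {
            SC_CTHREAD(run, clk.pos());
            reset_signal_is(rst, true);  // synchronous, active-high reset
        }
    };

Because this is plain IEEE-1666 C++, the same model can be compiled, simulated and debugged with ordinary software tools before it ever goes near a synthesis flow.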

Erickson believes this is an important point. “We now have a standardized input language so full methodologies can be put together.”

Phil Bishop, vice president of R&D for Cadence’s System & Verification Group, concurs. “One of the early deterrents to the greater use and adoption of HLS technology was the unavailability of a standard high-level hardware description language. Each of the early tool purveyors of HLS operated from a semi-custom C-based input format. These customized C language formats generally included pragmas and unique commands in the source code that acted as guides and constraints for the HLS process.”

But not everyone agrees with this choice of language. George Harper, vice president of marketing at Bluespec, says: “Some EDA vendors narrowly define HLS as synthesis from C/C++/SystemC. These languages are based on the sequential von Neumann computation model and are therefore not well suited for describing architectures with massive, fine-grain, heterogeneous parallelism and concurrency. In hardware design, algorithm and architecture are tightly intertwined.”

Synopsys has a product that synthesizes algorithmic representations from the Simulink / MATLAB model-based design environment. It also has a C/C++-based synthesis product but has not chosen to extend this to SystemC.

Slow adoption
So what happened? Why has adoption been so slow? Nikolaos Kavvadias, CEO of Ajax Compilers, one of the new entrants into this area, believes that “the entry bar has lowered significantly from tens of thousands USD to about 2k to 5k USD, and this really helps broader adoption.”

“I don’t believe that the prevailing view is of expecting HLS to be disruptive,” says Kavvadias. “It seems that we never really lacked HLS, but the flow was not there. There were a lot of things missing (interfaces, integration, frontends, competitive processes to ASIC) for HLS to be disruptive in the past.”

There seems to be no disagreement about this. “While HLS forms a central role in this design flow, it is also critical to support virtual prototyping, RTL synthesis, and physical design,” adds Bishop. And while Cline agrees in principle, he also believes “HLS is still very much a disruptive technology that the industry is only now embracing.”

The problem has changed
During the past decade, a number of aspects of the design flow have changed. Erickson describes it this way: “The commercial IP market and internal re-use methodologies have matured, so there is a smaller percentage of new functionality being developed in each chip. Today we are seeing high-level synthesis and verification beginning to transform the development productivity of new SoC functionality. But as a whole, the EDA industry is being transformed by a combination of system design, virtual platforms, assembly, commercial IP, and HLS.”

This is very different from the last disruption, when designs migrated from gates to descriptions at the RT level. Then it was the whole design that was affected; the entire design was described at the higher level of abstraction. This does not mean that companies migrated the whole design at once, but the benefits could eventually be extended to the complete chip. That provided growth within a company and project team: after the technology had proven itself on one block, it could be applied to additional blocks, gaining even more benefit until the entire chip was described in the new abstraction. It also provided a growth path for the company supplying the tool, because users would steadily require more licenses.

That is not the case with HLS, but that may not diminish its usefulness. “This abstraction difference provides a 4 to 10x productivity gain over RTL,” says McCloud.

Adds Erickson: “Being able to design and verify new algorithms and use HLS to implement them in multiple end products is a very real benefit.” Cline agrees, saying: “The ability to re-use functionality and easily modify it where needed is something they didn’t have with RTL. Long-time customers have told us for years that they initially purchased our HLS tool for productivity benefits, but the ultimate value is in the productivity of the design reuse.”

Cline also points out a problem with earlier solutions: “We had a big wake-up call in 2001 when one of our early customers told us that we had to meet hand-coded RTL and beat it to make it on the chip. This meant that we had to be able to get the same or better performance, better overall area, and do it in less time!”

Erickson adds: “We now have the capabilities to start at HLS and consistently meet or beat the QoR of flows that start with handwritten RTL for any class of design.”

The statement ‘any class of design’ marks an important change. “One constraint was that early HLS tools only handled datapath designs, leaving all of the control to be done in RTL,” says Cline. “Today, that is not true, with most modern tools handling nearly 100% of all RTL design requirements.”

Bishop expands on this problem. “By only being able to improve the productivity of the datapath portions of designs, overall productivity gains were subject to Amdahl’s law,” he says.
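
The arithmetic behind that observation is simple. If the datapath accounted for, say, 60% of the design effort (p = 0.6) and HLS made that portion 10x more productive (s = 10), both illustrative numbers rather than figures from the article's sources, Amdahl's law limits the overall gain:

    overall gain = 1 / ((1 - p) + p / s)
                 = 1 / (0.4 + 0.6 / 10)
                 ≈ 2.2x

The untouched control logic dominates the result, which is why tools that handle control as well as datapath change the economics so markedly.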

Market size
Cline looks back on those early estimates and says, “In the early days, it was estimated that most of the logic synthesis market would be a viable target for HLS. Ultimately, RTL logic synthesis would be commoditized or added to the HLS process. Approximately 50% to 60% of that business would convert, which is somewhere around $150 million to $200 million per year.”

Kavvadias sees a different market reality. “I would say that the estimated market has risen from a few million USD to maybe 30 million to 50 million USD.”

McCloud agrees. “When Gary Smith’s numbers come out later this month, HLS will break $30M for the first time with significant year-to-year growth.”

So can Cline’s number ever happen? “By moving other capabilities from the RTL flow, we expect that this market can be between $500 million and $1 billion per year in the next 10 years,” he says.

Verification support
These days, many designs are being constrained by the difficulties associated with verifying functionality at the system level. RTL simulators have run out of steam, and waiting for the implementation to be complete is too late in the process. Almost all of the vendors recognize the value and importance of tying synthesis into the verification picture.

“HLS is a key enabling technology for the early use of emulation and FPGAs,” says Harper. “And, more and more, that’s fundamental to addressing early, high-speed firmware development and verification.”

Erickson adds, “If a company implements HLS alongside a metric-driven verification methodology that allows them to do most of the verification work at the SystemC/TLM level, they see great benefits.”

In fact, the creation of a virtual prototype is itself becoming an essential element of the flow so that software development and integration can happen while the hardware is being designed and implemented.

Powerful optimizations
Another change since the introduction of the early HLS tools is the rising importance of power. RTL synthesis only had to take care of the performance/area tradeoff, but power has now joined performance and area as a primary design consideration.

“Modern tools add power optimization capabilities directly in the core of the HLS tool, giving it the ability to trade off area, power and performance during the synthesis process,” says Cline.

This is an area that may take time to mature. Power is still ripe for a lot of research because optimization is often dependent on the input stimulus, meaning that it requires dynamic analysis rather than just static estimation. This will create a further tie between the design and verification flows and require the definition of use-case scenarios to drive the development cycle.
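
The dependence on stimulus follows directly from the standard first-order model of dynamic power in CMOS:

    P_dyn ≈ α · C · V² · f

Capacitance C, supply voltage V and clock frequency f can be estimated statically from the implementation, but the switching activity α is a property of the workload. Two implementations with identical netlists can consume very different power under different use cases, which is why a meaningful power trade-off inside HLS needs representative activity data, not just a static estimate.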

The future
Most vendors believe that HLS is now ready for mainstream adoption, but that depends on ESL flows being fully defined and stable. The newest entrant to this field also sees possibilities outside the traditional ASIC and FPGA flows.

“To expand the pie, software-oriented engineers must ride the wagon of HLS,” Kavvadias says. “The majority of these engineers work with MATLAB, Python (or CUDA or OpenCL), so the corresponding language frontends have to be implemented.”

This takes us out of the traditional EDA market and into the realm of high-performance computing. Kavvadias lists bioinformatics, big data analysis, neuromorphic computing and other areas. But back in the ASIC world he says, “In 10 to 20 years from now, at or past the end of Moore’s Law, HLS will be the preferred choice for squeezing all of the performance potential out of 5nm or 8nm silicon.”

It appears that high-level synthesis is a technology that still has some evolution ahead of it, even though many companies are already getting great benefits from it. At the end of the day, it may prove to be the technology that enabled a disruption, yet it may never hold the dominant role that RTL synthesis once possessed.


