
Heterogeneous Hubbub

The combination of heterogeneous architectures and RISC-V is encouraging new tool support for SoC teams.


It’s no secret that designers today would prefer not to be restricted in their architectural choices. And who can blame them? This sentiment has boosted interest in, and usage of, both heterogeneous architectures and the RISC-V ISA.

To support this, companies across the design, test and verification ecosystem are ramping up their efforts. One such effort is the partnership between UltraSoC and Lauterbach on what they call a universal SoC development and debug environment, which includes support for the RISC-V open-source processor architecture.

What I find interesting is how RISC-V came to be, and how it is ramping.

Krste Asanovic, co-founder and chief architect at SiFive, as well as chairman of the board of directors for the RISC-V Foundation, was happy to explain. “We actually designed RISC-V starting in a Berkeley project which was all about improving energy efficiency. That was the background to developing it.”

With so much riding on power and energy efficiency today across the board, the RISC-V ISA seems to have come along at just the right time. “There are multiple facets to low power and, in particular, how RISC-V plays there. One of them is that we kept the ISA very simple, and when you are working in circuit techniques, for example trying near-subthreshold design points and other points like that, having a simple design helps a lot. Also, we need fewer gates to build a computer, and when you are operating in the extremes of low leakage, the fewer gates you have the better. There, the inherent simplicity helps a lot with the basic core design – just keeping the minimum stuff in the architecture to run code,” Asanovic said.

Another aspect of RISC-V in the context of low power design is specialization, he continued. “Really, the best energy efficiency you’re going to get for anything that has significant compute is going to come from adding new functions, custom instructions to tackle that application domain. We made the base RISC-V ISA lean and clean, and easy to extend to add those special functions on the side. If you think about the market previously, you basically had to pick either a general core that ran lots of software or some kind of configurable core that didn’t run all the software, and usually you’d have to pair these two together. And then what happens is you’re shifting data between the two of them, going between the general core and the accelerator; that also takes a lot of energy. In low power designs and current systems, most of the power and energy goes into shifting data around, so you want to minimize that traffic if you can.”
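To make the extensibility point concrete, the RISC-V specification reserves four opcode spaces (custom-0 through custom-3) so vendors can add instructions without colliding with the standard ISA. The sketch below, in Python, packs the fields of the standard R-type format into a 32-bit word; the “MAC” instruction itself and its funct3/funct7 values are hypothetical, chosen only for illustration.

```python
def encode_rtype(funct7: int, rs2: int, rs1: int, funct3: int,
                 rd: int, opcode: int) -> int:
    """Pack the six R-type fields into a 32-bit RISC-V instruction word."""
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) | \
           (funct3 << 12) | (rd << 7) | opcode

# Opcode space the RISC-V spec reserves for custom extensions.
CUSTOM_0 = 0b0001011

# Sanity check against a standard instruction: add x1, x2, x3
assert encode_rtype(0, 3, 2, 0b000, 1, 0b0110011) == 0x003100B3

# Hypothetical multiply-accumulate in the custom-0 space: mac x5, x6, x7
mac = encode_rtype(funct7=0b0000001, rs2=7, rs1=6, funct3=0b000,
                   rd=5, opcode=CUSTOM_0)
print(f"{mac:#010x}")  # low 7 bits carry the custom-0 opcode
```

Because the custom opcode spaces are guaranteed never to be claimed by future standard extensions, an accelerator function encoded this way can sit in the same pipeline as the general core, avoiding the data shuttling Asanovic describes.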

Of course, simple does not mean easy, so how does that translate when designing an ISA?

Asanovic said, “I like to tell people we worked really, really hard to keep it simple. It’s not easy. We iterated many times on the design. We started in 2010 and froze the base ISA in 2014. Along the way we did about a dozen implementations, including the software, the compilers, the OS, everything, to try and trim the fat and really get it as lean as we could, while still being efficient. I think people are surprised how well it performs given how small the instruction set is. The way we view it is that the effort you save on designing the base core, you can spend on your power optimization. You have a given design effort budget for any project, and you can either spend it implementing and verifying a more complex instruction set or you can spend it tuning your design to lower the power.”

Along with this, one must be cognizant of the inherent tradeoff between time and efficiency. With unlimited time, any design could be perfected, but market window pressures make that impossible.

In the same vein, when people are targeting very low power, playing with circuit techniques and dealing with variability as they operate near threshold, that is a more complex physical design challenge, and being able to focus on that rather than on the instruction set is where the advantage comes from, Asanovic added.


