Will Open-Source Processors Cause A Verification Shift?

Tools and methodologies exist, but who will actually do the verification is unclear.

While the promised flexibility of open source brings advantages and new possibilities for processors and SoCs, where does the industry stand on verification approaches and methodologies? Single-source ISAs of the past relied on general industry verification technologies and methodologies, but users and adopters of processors based on an open-source ISA will need to review the verification flows for both the processor and the SoC.

Simon Davidmann, CEO of Imperas Software, pointed out during a panel last week at the RISC-V Summit in Silicon Valley that designs based on the RISC-V instruction set architecture bring interesting verification challenges.

This sentiment was echoed by Richard Ho, principal engineer/director at Google, who explained that Google’s interest in RISC-V and in open-source hardware comes from the company’s support of open source, in general. “A lot of Google infrastructure and software has been built on open source, and it’s our intent to be a strong contributor in this community.”

At the same time, Ho stressed that with any open-source endeavor, and open-source hardware in particular, verification is critically important. In fact, he argued, the verification itself should be open source.

“That’s our contribution to the open-source community,” he said. “We actually have open sourced our design environment, but I am a little bit sad that there hasn’t been that much else going on in the open-source part of it. A lot of the work has been in closed environments and closed tools, and I would like to see more of it open. In order to have open-source hardware that you can really re-use anywhere, you want to be able to test it. And you want to be able to modify it as much as the open-source RTL that’s based on an open-source ISA will be modified, so that’s really important.”

Verification and security
Ho contends that verification is a superset of ISA compliance. “We need to have compliance, but I’m concerned that a lot of users of these cores that have been in open source may stop there and say, ‘This core is compliant with the RISC-V 64I instruction set, and maybe that’s good enough.’ No, that’s not good enough. You do have to care about your microarchitecture. You do have to care about your pipeline stages, your bypasses, and all of those details that may actually come out and bite you when you go and use this hardware. It’s not enough to just FPGA prototype it, or to have Linux booting on it. Bugs could show up way after that. In the cores that we’ve worked with, it’s safe to say there’s always been some issue. Most cores out there will have some issue you can find, maybe not on the core surfaces, but often in the debug section or one of the extension instruction sets there.”
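
Ho's distinction between compliance and full verification maps onto a widely used technique: stepping the design and a golden instruction-set reference model in lock step and comparing architectural state after every retired instruction, so that microarchitectural corner cases such as missed bypasses or pipeline flush bugs show up as a divergence rather than slipping past a compliance suite. The Python sketch below illustrates the shape of such a step-and-compare harness; the `dut` and `ref_model` objects and their `step()` interface are assumptions made for illustration, not any vendor's actual API.

```python
# Minimal step-and-compare sketch. The dut and ref_model objects and their
# step() interface are assumptions for illustration, not any vendor's API.

from dataclasses import dataclass

@dataclass
class ArchState:
    """Architectural state visible after one instruction retires."""
    pc: int
    regs: tuple      # x0..x31
    mstatus: int

def compare_retired(dut_state: ArchState, ref_state: ArchState):
    """Return human-readable mismatches for a single retired instruction."""
    mismatches = []
    if dut_state.pc != ref_state.pc:
        mismatches.append(f"pc: dut={dut_state.pc:#x} ref={ref_state.pc:#x}")
    for i, (d, r) in enumerate(zip(dut_state.regs, ref_state.regs)):
        if d != r:
            mismatches.append(f"x{i}: dut={d:#x} ref={r:#x}")
    if dut_state.mstatus != ref_state.mstatus:
        mismatches.append("mstatus differs")
    return mismatches

def run_lockstep(dut, ref_model, max_instructions=1_000_000):
    """Step DUT and reference model together; stop at the first divergence."""
    for n in range(max_instructions):
        dut_state = dut.step()        # assumed: retire one instruction, return ArchState
        ref_state = ref_model.step()  # assumed: golden ISS with the same interface
        errors = compare_retired(dut_state, ref_state)
        if errors:
            raise AssertionError(f"divergence at retired instruction {n}: {errors}")
```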

Security around open source is a huge concern, he said. “With RISC-V, security becomes big, and the reason why is that people like to have an open-source piece of hardware for security so you can review it. The danger is that the bad guys can review it, too. If you don’t do enough security verification at all levels, you’ve just told the bad guys where you might have some vulnerabilities. Although some work is going on in secure embedded cores, a lot more needs to be done. We need to have a lot more formal analysis. We need to actually identify what our vulnerabilities are before hackers get to them, because with the number of cores that are out there, only a limited number of people can do this analysis. We have to focus and get that done.”

Michael Thompson, director of verification engineering at OpenHW Group, agreed. “There are problems that need to be addressed when thinking about how to deal with a processor. How do you deal with an open-source processor? How do you deal with a processor that’s readily extensible? That extensibility is built right into the way RISC-V works. Those problems have been encountered by industry for quite a long time, and tried-and-true techniques are in place today to address them, so we just have to apply those techniques. Further, open source by its nature is collaborative, and there we need to stand on the shoulders of giants. The people in the open-source software community have worked out how large, dispersed teams can come together to get a job done. How do you get a whole bunch of different people, scattered around the world, working for different organizations, all working on the same problem? That, too, is a solved problem. It was solved by the open-source software community.”

But not all problems have been solved. A big concern is fragmentation, which makes all of these issues more complicated. “The RISC-V Foundation carefully watches what is going on here,” said Jerry Ardizzone, vice president of worldwide sales at Codasip. “You have the base integer instruction set, which is carefully locked down. You’ve got the standard extension space, which is monitored closely, and there’s a very good ratification process for that. And then there’s the customizable space. So it’s up to all of us to make sure it doesn’t get too fragmented and out of control, because if it does it won’t succeed. As a supplier, as a third-party member and as a user, it’s in the best interest of all of us to make sure it doesn’t get fragmented.”

Verification challenges
Looking at this problem from the adopter’s point of view, Emerson Hsiao, senior vice president at Andes Technology, offered an analogy. “Let’s say you’re going to buy a new car from a dealer. Does anyone ever open the hood and see what’s underneath when buying a new car? You don’t do that. Why? Because you assume the car manufacturer knows what they are doing, and if there is a problem, the car goes back to the manufacturer. But we’re talking about a different scenario here. I interpret the shift in the verification environment for open-source ISA-based IP as pushing the responsibility more onto the end users, which is a problem because we are expecting our users to know what to look for under the hood. Granted, there will be end users who know how to verify processor IP and know where to look. But the majority of end users probably don’t know where to look or what to check, so that is where the verification challenge lies for RISC-V.”

There are roughly 100 RISC-V IP cores on the RISC-V Foundation website today. That number will grow, and Hsiao argues there should be some type of checklist for the companies building those IPs, along with a history or documentation of what has been tested and checked. That way, when an IP is used in an open-source implementation, the adopter knows what kind of risk they are taking.

“That’s the problem we’re facing right now as an IP provider,” he said. “We’re a commercial RISC-V IP provider, so we’re actually taking a very different approach. We make sure that IP is tested and verified before we ship it. But there is one problem. RISC-V has this amazing feature of custom extensions, and this is a major challenge for verification. Anyone who has done custom instructions or extensions knows the difficulty. Even identifying requirements from design to verification is a major challenge. From the IP perspective, in addition to providing a design environment for customers who design their own extensions, we also are trying to modularize to make the verification task complete.”
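
Hsiao's point about custom extensions can be made concrete with a small example: every added instruction needs its own reference semantics, and the RTL result has to be cross-checked against that reference over directed corner cases and random operands. The sketch below uses an invented saturating multiply-accumulate instruction and a hypothetical `rtl_macsat` callable standing in for the DUT result; it is only meant to show the shape of the task, not Andes' actual flow.

```python
# Sketch of verifying an invented custom instruction (a saturating
# multiply-accumulate) against a reference model. rtl_macsat stands in for
# the DUT result and is hypothetical; only the shape of the flow is real.

import random

XLEN_MASK = (1 << 32) - 1
INT32_MAX = (1 << 31) - 1
INT32_MIN = -(1 << 31)
CORNER_VALUES = [0, 1, XLEN_MASK, 1 << 31, INT32_MAX]

def to_signed32(x):
    x &= XLEN_MASK
    return x - (1 << 32) if x >= (1 << 31) else x

def ref_macsat(acc, a, b):
    """Reference semantics: acc + a*b, saturated to the signed 32-bit range."""
    result = to_signed32(acc) + to_signed32(a) * to_signed32(b)
    result = max(INT32_MIN, min(INT32_MAX, result))
    return result & XLEN_MASK

def rand_operand(rng):
    """Bias toward corner values so the saturation paths get exercised."""
    return rng.choice(CORNER_VALUES) if rng.random() < 0.3 else rng.getrandbits(32)

def check_custom_instruction(rtl_macsat, trials=100_000, seed=1):
    """Compare the DUT result against the reference model on random operands."""
    rng = random.Random(seed)
    for _ in range(trials):
        acc, a, b = rand_operand(rng), rand_operand(rng), rand_operand(rng)
        expected = ref_macsat(acc, a, b)
        actual = rtl_macsat(acc, a, b)       # hypothetical hook into the DUT
        assert actual == expected, (
            f"macsat mismatch: acc={acc:#x} a={a:#x} b={b:#x} "
            f"dut={actual:#x} ref={expected:#x}")
```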

The real frontier now will be the integration of processors into the SoC infrastructure, in areas such as cache coherency and making sure interrupts are being managed correctly, according to Dave Kelf, chief marketing officer at Breker Verification Systems. “This is going to be the kind of thing that is also going to need a lot of verification help, and it’s an area where the whole Portable Stimulus Standard is focused. It’s going to be interesting to see how RISC-V develops and how platforms are based on it. But if you can imagine just the complexity of taking a new processor with an open instruction set, putting this onto a varied range of SoC platforms, and then the idea of adding instructions to it, clearly a lot of help is going to be required right through the verification process, particularly on the SoC platforms.”

How seismic is this shift?
Frank Schirrmeister, senior group director for product management and system development in the System & Verification Group at Cadence, isn’t convinced the magnitude of the shift is so large. “I agree there’s a shift, there’s no question about that. There’s a shift in verification going on, especially at the SoC level, and it’s very important. The question I have is how seismic it is, because processor verification itself has been done for quite some time, both for fixed ISAs and for configurable ISAs like ARC and Tensilica. The commercial vendors have even been publishing articles about how they verify these processors, and it looks very much like Portable Stimulus for IP. A lot of software-driven verification is very important.”

Some SoC developers say they use an average of 5 trillion to 6 trillion emulator cycles, and 2 to 3 petacycles of FPGA-based verification. What’s really changing is who does verification, Schirrmeister said.

“With the openness of the hardware, and with the end user doing the modification, there is a shift in responsibility on who verifies. For us in EDA, there’s a huge set of opportunities for new and updated verification tools, especially when it comes to the closer integration with the rest of the SoC, and those fall into seven domains. First is software bring-up. That’s where we work with companies like Imperas for processor models, for example. This is happening for the other ISAs, as well, and we already have tools for that. Second, there are issues around Portable Stimulus, since it’s very important to have libraries that support the specific processors. Third is power analysis around the processors, which changes in the context of RISC-V. Fourth, you need to debug hardware and software, so you need to get the traces out. You need to do JTAG. Fifth is integration, such as IP-XACT modeling. Sixth is performance analysis, which is where interconnect IP fits in. That’s obviously different in the RISC-V environment than it may be in an ARC, MIPS or Tensilica environment. Seventh is verification IP, and verifying all these protocols. So in terms of this being a seismic event, it’s a change, it’s an addition, no question. It’s an extension of a lot of techniques we already use in other domains. It’s like being in a gold rush and having the picks and shovels to sell, and those shovels work with Arm, MIPS, ARC and Tensilica as well as RISC-V.”

So what exactly is the impact of open source? “I’m not sure open source comes into it, really,” said Imperas’ Davidmann. “What we have with RISC-V is the ability for people to do new things and to change it. It’s not like you get an Arm that’s pre-verified, you plug it in, you do some integration testing, and you’re done. You’ve got however many hundreds of thousands of gates that you’ve designed, and I don’t know that open source comes into it. It’s the fact that you’re designing a new processor, or maybe you’re buying one, which is different. But you’ve got this processor, and the whole purpose of RISC-V is this freedom to innovate and change it. So you’ve got to make changes. That means you need a verification environment, which you didn’t have before. You bought an Arm, it worked, you just plugged it in. That’s the challenge with anything related to open source here. It’s really about the fact that it’s an extensible ISA.”

Companies also need enough tools to have choices for different applications. “We’re thinking about tools all the time,” said Codasip’s Ardizzone. “It is not just the hardware or the implementation. It is a lot about the C compiler, for example. We’ve invested in that. We hired a lot of people who focus on building what we hope to be the best C compiler in the world for RISC-V, but we can’t do it alone. The entire industry needs to have different options available for different customers. Take LLVM, for example. Some people want to use a GNU compiler. So it is whatever you like, whatever your strategy is internally. And we need to support them all.”

There also has to be a business case behind these investments. Google’s Ho noted that open and free are very different. “The ‘free’ is ‘freedom,’ and freedom means that you can do stuff differently with RISC-V, and you can extend the ISA. You can create your own processor with particular microarchitecture implementations. But that’s only successful if you can do the verification, which means that you need to have the collateral to do that. By industry norms, a large part of a processor design cycle goes into verification, and that’s where the real IP is. If that is done in a closed, proprietary system, that’s not really addressing the challenges we see ahead of us. That’s why it’s seismic. It’s seismic because we are actually challenging the concept that you have to pay for a particular thing that you have very little control over. It’s fine to pay for it, but if you can’t control it the way you want, that’s not really buying you the real benefit of RISC-V and the impact of the open ISA.”

At the same time, this also speaks to configurability. “One of our partners said they were able to save half a memory in a PCI-type RISC-V environment because they added an instruction, and they didn’t need half of the memory anymore. Those types of changes are classic co-design changes. You make your changes over here on the processor side, and then how do you actually optimize it, balance it, and then verify it?” Schirrmeister said.

Support for open-source architectures?
In terms of supporting open-source architecture design, do today’s verification techniques suffice?

OpenHW Group’s Thompson invoked Spider-Man in his reply. “‘With great power comes great responsibility.’ I say that flippantly, but it’s true. This is the problem that open source brings to verification. Now, all of a sudden, I have to do verification. I don’t buy the verification from Tensilica or from Arm or somebody else. I actually have to do it myself. And many organizations haven’t done it themselves previously. They lack the skills and experience to do that, which creates opportunity for EDA vendors and other people.”

The users who do the verification are changing. “The biggest shift in verification with open-source-based designs is actually not in the SoC verification requirements. It’s in the structure of who does it,” Schirrmeister said. “If I look at where verification is done, and at those figures about petacycles of FPGA cycles being run, that’s somebody who’s selling the IP. As an individual end user, do I have to do the same level or depth of verification, where I have to foresee all kinds of cases? Perhaps, but probably not. So the structure is an important one. It will be interesting to see, five years from now, how many of the internally verified designs are more on the academic side, and how many will be verified by the companies that give you a RISC-V template to generate a processor and take responsibility for the verification.”

How big this shift will become isn’t entirely clear. “We are dealing with an open ISA, and the ISA itself is just another processor ISA,” said Ho. “But there are a lot of open implementations of this ISA. Once you go into the realm of open implementations, where the microarchitecture is fully exposed to everybody, security verification is something that we as an industry have not done very well. We have focused on functional verification, which means that devices do the right thing, but we have not done very much analysis on whether you can get access to data that you shouldn’t have access to. This requires formal verification, but it’s an area that we haven’t spent much time on. That is exactly the requirement that’s coming from the open-source part of this, not so much the ISA. But once you have an open implementation of the RTL, it becomes absolutely critical, because anyone can analyze it for weaknesses and data corruption.”
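
One way to read Ho's point about access checking is as the difference between positive and negative tests: functional verification confirms that privileged software can read a machine-mode register, while security verification must also confirm that unprivileged software cannot. The toy model below expresses that negative check in Python against a simplified, hypothetical CSR model; a production flow would state the same property formally over the RTL rather than over a software model.

```python
# Toy negative test: a user-mode read of a machine-mode CSR must trap instead
# of returning data. The model interface is hypothetical and heavily
# simplified; only the CSR addressing convention follows the RISC-V spec.

MACHINE_MODE, USER_MODE = 3, 0

class IllegalInstruction(Exception):
    pass

class TinyCsrModel:
    def __init__(self):
        self.priv = MACHINE_MODE
        self.csrs = {0x340: 0xDEADBEEF}   # mscratch, a machine-mode CSR

    def csr_read(self, addr):
        # Bits [9:8] of a CSR address encode the lowest privilege level
        # allowed to access it (RISC-V privileged spec convention).
        required_priv = (addr >> 8) & 0x3
        if self.priv < required_priv:
            raise IllegalInstruction(f"CSR {addr:#x} read at privilege {self.priv}")
        return self.csrs[addr]

def test_user_mode_cannot_read_mscratch():
    model = TinyCsrModel()
    model.priv = USER_MODE
    try:
        leaked = model.csr_read(0x340)
    except IllegalInstruction:
        return                      # expected: the access is denied
    raise AssertionError(f"user mode read mscratch and got {leaked:#x}")

test_user_mode_cannot_read_mscratch()
```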

Conclusion
At the end of the day, it is not uncommon for SoC designers to grill their IP providers on their verification process, sometimes requesting coverage reports and other documentation as part of the package, pointed out Nicolae Tusinschi, design verification expert at OneSpin Solutions. “When it comes to a processor core implementing the complex RISC-V ISA, users are going to expect and demand even more. Commercial core suppliers will have to respond to these requirements, and even designers placing open-source cores in repositories will have to do a better job of verification to encourage adoption.”

Whatever the source of their RISC-V cores, many SoC teams will want to re-verify the designs themselves, Tusinschi believes, and third-party RISC-V solutions from EDA vendors will play a major role because both suppliers and users can run the verification. “The level of rigor required means that formal verification must be used; only formal tools can provide proof of correctness and ensure that a design has not been compromised by poor coding practices, security holes, or deliberate insertion of hardware Trojans.”
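
One class of issue that only exhaustive analysis reliably catches is an undocumented instruction: the check has to prove that nothing outside the documented encoding space is accepted, not just that the documented instructions behave correctly. The sketch below shows the intent of that check on a toy decoder; real formal tools prove it over the RTL for the full encoding space, and nothing here reflects OneSpin's actual methodology.

```python
# Toy sweep for undocumented instructions: flag any opcode the decoder accepts
# that is not in the documented list. Real formal tools prove this over the
# RTL for the full encoding space; this only illustrates the intent.

DOCUMENTED_OPCODES = {
    0b0110011: "OP",       # register-register ALU
    0b0010011: "OP-IMM",   # register-immediate ALU
    0b0000011: "LOAD",
    0b0100011: "STORE",
    0b1100011: "BRANCH",
}

def find_undocumented(decode):
    """Return every 7-bit opcode that decodes successfully but is undocumented."""
    return [op for op in range(1 << 7)
            if decode(op) is not None and op not in DOCUMENTED_OPCODES]

def toy_decode(opcode):
    """Stand-in decoder with one deliberately hidden encoding, for illustration."""
    table = dict(DOCUMENTED_OPCODES)
    table[0b1111011] = "VENDOR-DEBUG"   # the undocumented instruction
    return table.get(opcode)

assert find_undocumented(toy_decode) == [0b1111011]
```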

The need for such a solution is already clear. OneSpin has formally analyzed multiple RISC-V core implementations and found numerous issues. These include potentially exploitable code, which is a security risk, and an undocumented instruction, which is a serious trust vulnerability. The company has found more than a dozen design bugs in a single popular core. Fortunately, the tools and technology already exist to verify RISC-V cores to the level expected by users and required of suppliers.


