From Cryptic Error Messages To Contradictory Commands

While EDA tools have advanced to enable an impressive level of design complexity, users say usability hasn’t always kept up with functionality.

By Ann Steffora Mutschler
For the past 30 years, semiconductor designers have increasingly relied on automated CAD tools to complete their projects. Over time, these tools have indeed improved from a functionality perspective, but sometimes usability has not kept up with users’ needs.

Some tools are easier to use than others, depending on the tool and the type of use, according to Mike Berry, senior director of engineering at Open-Silicon.

“In general, if you’re talking about the smaller, simpler sorts of designs, it’s fairly easy to go into a tool even if you are reasonably new to it and figure out how to do some of the basic functions. If you’re doing simulation, how do you actually start a simulation running? How do you bring a waveform up in a viewer to look at it? Those sorts of things are easy. If you are doing place and route, you find the button to push to say, ‘Go do a placement,’ or ‘Go insert clocks for me,’ or ‘Go do detailed routing.’ It’s when you start getting into the more complex designs that things quickly become a lot more challenging. As the tools have evolved over the years, one of the things we’ve seen happen is that more and more knobs and switches get added to the tools, but old ones often aren’t taken away, even though they might be redundant, might be obsolete, might not even work anymore, or might contradict other things that are in the tools. We run into that sort of thing sometimes,” he explained.

What makes a tool easy (or not) to use often comes down to the user interface. “If you’re working with a single tool and using it year after year after year, you get a feel for how it works and how to navigate. But moving between tools is very challenging,” Berry said. Because Open-Silicon does a lot of design services work, it often is asked or required to use different tools from one program to the next, a difficult transition because the user interfaces are so different and, in some cases, simply not very good.

“One of the things that we’ve talked about here for years is that in many cases the people developing the tools, and developing those user interfaces, are software people. That makes sense, because that’s who would be writing the underlying code, but we can’t help but wonder whether those software people ever really talk to the people who use the tools—the hardware designers—and understand, ‘It’s great to have this knob, but I don’t understand what it does or why I care that you did it.’ It may be a really cool programming exercise to figure out how to do that, but it’s just not necessarily useful in the real world. That’s been a complaint of ours for years and years: the tools, and even some of the functionality they provide, are not geared toward the users’ needs,” he said.

Another issue that complicates EDA tool usage is the error messages that come out of the tools. “In many cases the word I think you’ll hear from most people is ‘cryptic.’ You get this error message: ‘Stack trace fault at hex 3279,’ and then the tool sits there with a blinking cursor waiting for you to do the next thing, and it’s like, ‘What do I do with that?’” Berry pointed out. “That one may be an extreme example, but you get something like ‘couldn’t find net to query,’ or something really, really cryptic. You have no idea what line number in the script it was at, or anything about it, and you just stare at it and think, ‘I don’t know what to do.’ We start doing binary searches through the code. Let’s delete the bottom half of the code, run the top half, and see if we still get that message. We start trying to zero in on where the problem is, and eventually you can usually find out what’s going on, but the amount of debug time is really painful sometimes.”
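The “binary search” Berry describes can be sketched in a few lines of Tcl. This is a hypothetical illustration only; the tool invocation (run_tool), the temporary file name, and the error pattern are placeholders rather than any vendor’s actual interface:

# Hypothetical Tcl sketch of the bisection Berry describes: keep only the
# first half of a script, rerun the tool, and check whether the cryptic
# message still shows up, halving the search range on each pass.
# "run_tool" and "bisect_tmp.tcl" are placeholders, not a real CLI.
proc bisect_script {script_file error_pattern} {
    set fh [open $script_file r]
    set lines [split [read $fh] "\n"]
    close $fh
    set lo 1
    set hi [llength $lines]
    while {$lo < $hi} {
        set mid [expr {($lo + $hi) / 2}]
        # Write out only the first $mid lines of the original script.
        set out [open "bisect_tmp.tcl" w]
        puts $out [join [lrange $lines 0 [expr {$mid - 1}]] "\n"]
        close $out
        # Rerun the tool; catch keeps its output even if it exits with an error.
        catch {exec run_tool -script bisect_tmp.tcl} output
        if {[string match "*${error_pattern}*" $output]} {
            set hi $mid                  ;# error already appears in the first half
        } else {
            set lo [expr {$mid + 1}]     ;# error is triggered later in the script
        }
    }
    puts "Error first appears around line $lo of $script_file"
}

Each iteration cuts the suspect region in half, which is exactly the manual process Berry’s team goes through, minus the repeated editing by hand.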

One particular set of tools that Open-Silicon is using now is just terrible in terms of some basic features that should exist but don’t, Berry said. As a result, his team has resorted to writing its own Tcl routines that call very low-level database functions to do things that simply should be built in. “The database obviously has all of the information you need, and you can in fact query it, but instead of right-clicking on a net and asking for some information about it, you have to write 20 lines of code that essentially uses the API that exists to get into the database.”
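A sketch of what such a query ends up looking like appears below. The command and attribute names here are purely illustrative, not Open-Silicon’s code or any particular tool’s documented API, but they show why a task that could be a single right-click turns into a script:

# Hypothetical Tcl sketch: pull a net's basic facts out of the design
# database through a scripting API instead of a right-click in the GUI.
# get_db and the attribute names (.load_pins, .driver_pins, .inst) are
# illustrative only, not any specific vendor's documented commands.
proc report_net {net_name} {
    set net [get_db nets $net_name]
    if {$net eq ""} {
        puts "Net $net_name not found in the database"
        return
    }
    set loads [get_db $net .load_pins]
    puts "Net $net_name"
    puts "  driver : [get_db $net .driver_pins.name]"
    puts "  fanout : [llength $loads] load pins"
    foreach pin $loads {
        puts "    [get_db $pin .name]  (instance: [get_db $pin .inst.name])"
    }
}

# Typical call from the tool's Tcl prompt:
report_net "u_core/clk_div_out"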

Tool providers chime in
Jeff Wilson, Calibre product marketing manager at Mentor Graphics, noted that users always want the tools to be easier to use. “What we have is a spectrum. For example, in a fab there are people who really understand Calibre, so other people have someone they can go to and ask. In fabless companies we probably need something that’s a little more turnkey. They do have expertise they can go to with questions, but the fabless guys are all over the spectrum as well. There are leading-edge guys who basically take the deck and run stuff. If it’s a test engineer, we’re learning which particular applications we want to lead with and which ones we would probably not lead with, just because of the learning curve involved. We’ve had very good success working with customers to prove out the data points, but now we’re fine-tuning that to be able to say, for this particular customer, this is the message and the packaging we need to go with.”

Connected to this is the work EDA vendors do on an ongoing basis to broaden the reach of the tools.

Robert Ruiz, senior product line manager for test automation at Synopsys, noted that last year the company began making its tools easier to set up for a broader set of users, rather than just the traditional user of volume diagnostic tools. Additionally, Synopsys test tools now support the LEF/DEF standards, as well as a SEMI standard that describes tester output, allowing users to gather data from many different types of testers.

Similarly, Geir Eide, DFT product marketing manager at Mentor Graphics, said usability is becoming extremely important as lines blur between domains. “We have tools that are made for yield or product engineers but are based on test technology. We have to make sure it doesn’t look like another test tool, because it’s not really for test people. This is something we have tried to take into account, and it’s going to continue to be important as we see this technology-versus-domain crossover. People always want easier-to-use tools, but the meaning of ‘easy to use’ is changing a little. A lot of the tools we have are built on DFT technology, but the users are either product engineers or failure analysis engineers who know absolutely nothing about DFT or the underlying technology, so we have to try to make that part of it invisible. You shouldn’t have to force everybody to take courses in DFT just to utilize some of the data it provides.”

At the end of the day, Open-Silicon’s Berry gave credit to the EDA tool companies for the advances in the algorithms the tools are using and the way they approach certain functions. “There’s been a lot of algorithmic development and lots of optimization within the tools to help with runtimes and to make it possible to manage huge sets of data in ways that just weren’t possible not very many years ago. Certainly there’s a lot of development work going on there. The advent of some of the new design verification methodologies, starting back with Specman, Vera, VMM, OVM and now UVM, has been a huge improvement in verification productivity and thoroughness. That’s a very, very powerful recent addition to the tool flows.”


