For effective DRC, designers need to optimize not only the debugging itself, but also setup and documentation.
Debugging design violations found by design rule checking (DRC) has always taken a significant share of the time needed to get a design to tapeout. And debug time only increases as the number and complexity of DRC checks expand with each new process node. Any step you can take to make your DRC debug process more efficient directly improves your productivity.
One technique for minimizing debug time is to use signoff DRC tools within the design environment, so designers can run DRC jobs and debug DRC errors during design and reduce the number of DRC iterations required. However, if these signoff DRC jobs are not set up to generate the data that makes debug effective, much of that benefit is lost. Why? Here are a few reasons…
Visualizing DRC results in the design environment in the midst of hundreds of design layers can be challenging—designers spend a lot of time turning design layers off and on to see specific error results, which is not only frustrating, but hampers their productivity. In addition, designers are often looking at hundreds, or even thousands, of DRC violations, listed in no particular priority or order. Without some kind of results categorization, designers must manually identify, categorize, and prioritize results, a time-intensive task that reduces the amount of time and resources available for the actual debugging work.
Another issue in today’s design flows is the use of multiple tools from different electronic design automation (EDA) companies for specific parts of the design flow. Computer-aided design (CAD) engineers often spend a significant amount of time making all of these tools interoperable. This interoperability is particularly significant for signoff physical/circuit verification flows, which (along with parasitic extraction) are typically the last steps before a design tapeout.
The last stage of DRC debug is to document the debug status and share it with the stakeholders. Generating this documentation is another tedious and time-consuming job, one that typically involves designers manually taking hundreds (if not thousands) of screenshots and assembling them into a PDF file for the stakeholders. All of these actions consume a significant amount of the designer’s valuable time.
So what can designers do? EDA vendors are constantly working on tool innovations focused on performance, accuracy, interoperability, automation, and improved debug capabilities to help the design community optimize their design quality and tape out their designs on schedule.
For maximum efficiency and productivity, designers need to optimize all three stages of the DRC process: setting up the DRC run to aid debug, performing DRC debugging, and documenting the DRC debug status. Let’s examine what that might look like for each step.
Rule categorization and filtering
Rather than focus on fixing DRC errors sequentially, designers can improve their efficiency and productivity if they prioritize their DRC debug process by organizing results to perform targeted DRC debugging in a systematic way.
There are some pre-existing rule categories. Rule check groups (such as CELL or CHIP) that contain the list of checks required for each design level are defined by the foundry in the rule files. Likewise, rule checks are associated by default with layers in the rule files. Designers do not have to modify the DRC rule deck from the foundry to see these categorizations, which makes it easy to incorporate this categorization as part of the standard DRC flow.
However, by pre-defining additional user-defined rule categorizations, designers can set up their DRC runs to generate fully categorized results, which can then be used to perform systematic and prioritized DRC debug. Letting the tool automatically classify and sort DRC results allows designers to begin focused DRC debug immediately.
Some EDA tools also enable designers to create customized categories of checks based on their specific design needs and/or related to specific DRC rules. By using both foundry rule check groups and custom groupings, designers can quickly view DRC results in different ways during debugging.
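To make the categorization idea concrete, here is a minimal sketch in Python. The violation records, rule names, categories, and priority values are all hypothetical, invented for illustration; a real flow would read them from the foundry rule deck and the signoff tool’s results database.

```python
from collections import defaultdict

# Hypothetical DRC results as (rule_name, category) pairs, in the arbitrary
# order a results file might list them. Names are illustrative only.
violations = [
    ("M1.S.1", "metal_spacing"),
    ("M1.W.2", "metal_width"),
    ("V1.EN.1", "via_enclosure"),
    ("M1.S.1", "metal_spacing"),
]

# Debug priority assigned by the design team: lower number = fix first.
priority = {"metal_spacing": 0, "via_enclosure": 1, "metal_width": 2}

def categorize(results):
    """Group violations by category and order categories by debug priority."""
    groups = defaultdict(list)
    for rule, category in results:
        groups[category].append(rule)
    # Unknown categories sort last so nothing is silently dropped.
    return sorted(groups.items(), key=lambda kv: priority.get(kv[0], 99))

for category, rules in categorize(violations):
    print(f"{category}: {len(rules)} violation(s)")
```

With results pre-grouped like this, the designer starts with the highest-priority category instead of walking an unsorted list of thousands of markers.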
Focused results debugging and visualization
With pre-categorized DRC error results available, designers can now perform DRC debug in their preferred order of priority, and systematically resolve the DRC results for maximum benefit and efficiency (Figure 1).
Figure 1. Unsorted results require time-intensive manual categorization and prioritizing, while pre-categorized results enable designers to begin debugging quickly and efficiently.
Designers typically start the debug process by using their design environment to visualize a result marker in the context of the design. However, the result marker is often buried under hundreds of design layers, making it virtually impossible for the designer to understand the DRC result. To clearly see the result marker, designers must turn on only the design layer(s) relevant to that DRC rule, and turn off all other design layers. Because these design layers change for every DRC rule, designers waste a lot of time flicking switches, when they could be focusing on design-critical activities. Employing a solution that can dynamically and automatically change the design layers displayed eliminates the tedious work of changing the layers display and allows designers to focus on the actual DRC debug (Figure 2).
Figure 2. The default display of all layers makes it difficult to identify the layers applicable to the DRC result marker. Automatically limiting the display to the relevant layers speeds up debugging.
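The layer-display automation described above can be sketched in a few lines. The rule-to-layer mapping and layer names below are assumptions for illustration; in practice the relevant layers for each check come from the rule file itself.

```python
# Illustrative mapping from a DRC rule to the design layers it involves;
# a real mapping would be derived from the foundry rule file.
RULE_LAYERS = {
    "M1.S.1": {"M1"},
    "V1.EN.1": {"M1", "V1", "M2"},
}

ALL_LAYERS = {"NW", "DIFF", "POLY", "M1", "V1", "M2"}

def layer_visibility(rule):
    """Return a {layer: visible} map showing only layers relevant to `rule`."""
    relevant = RULE_LAYERS.get(rule, ALL_LAYERS)  # unknown rule: show all
    return {layer: layer in relevant for layer in ALL_LAYERS}

# Selecting a via-enclosure result turns on M1/V1/M2 and hides the rest,
# instead of the designer toggling layer switches by hand.
vis = layer_visibility("V1.EN.1")
```

A design environment would apply this map to its layer panel each time the selected result marker changes, so the display always tracks the rule being debugged.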
DRC results documentation
Typically, results reporting happens every day during tapeout, allowing stakeholders at other locations to review the status and continue with the DRC debug and fixing process. These status reports can include design, tapeout, physical failure, waiver, and other data that must be shared internally and/or externally, without sending any proprietary design data. Designers spend a lot of time writing documentation and manually capturing snapshots to generate this status report, which takes their focus away from design-critical activities.
Adopting a solution that automates the production of DRC status information eliminates the need for these time-intensive manual tasks while ensuring accurate and thorough reporting, which can be crucial to improving debugging and reporting flows (Figure 3).
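As a rough sketch of what such automated reporting might produce, the snippet below renders a per-check status summary from counts only, with no layout geometry, so it can be shared without exposing proprietary design data. The status records and field names are hypothetical.

```python
# Hypothetical per-check status as it might be tracked during debug.
# Counts only -- no coordinates or geometry -- so the report is shareable.
status = [
    {"rule": "M1.S.1",  "total": 14, "waived": 2, "open": 12},
    {"rule": "V1.EN.1", "total": 3,  "waived": 3, "open": 0},
]

def render_report(rows):
    """Render a plain-text DRC status table plus an open-violation total."""
    lines = ["rule       total  waived  open",
             "-" * 30]
    for r in rows:
        lines.append(
            f"{r['rule']:<10} {r['total']:>5}  {r['waived']:>6}  {r['open']:>4}"
        )
    open_total = sum(r["open"] for r in rows)
    lines.append(f"\nOpen violations remaining: {open_total}")
    return "\n".join(lines)

print(render_report(status))
```

Regenerating a report like this after every DRC run replaces the manual screenshot-and-PDF routine and keeps every stakeholder looking at the same numbers.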
Figure 3. Sample of an automatically generated DRC result waivers report.
Summary
In modern-day design flows, replacing manual debug methodologies with automated, repeatable solutions integrated into the design tools eliminates multiple time-intensive manual tasks, freeing up more time for actual debugging and error fixing critical to meeting tapeout and time-to-market schedules. Automating manual setup and debugging tasks not only makes the verification flow more efficient, but can also improve debugging accuracy and reliability. Managing increased design complexity while meeting ever-tighter market deadlines requires companies to look for and adopt innovative strategies that help them achieve their business goals.