
Batch Filters: A Better, Faster Way To Filter Large DRC Results Databases

How to optimize time and resources for debugging DRC errors.


Reviewing massive DRC results databases (RDBs) can be one of the most time-consuming stages in traditional debug flows, primarily because of the loading, filtering, and display times associated with these large datasets. Choosing the most effective approach to filtering results data is therefore key to optimizing both debug time and resource usage. Smaller databases that load quickly in a GUI can take advantage of built-in interactive filters, but very large databases loaded the same way can consume excessive memory and slow down schedules. The Calibre RVE batch filtering process lets designers apply filter expressions to an RDB and write only the matched results to smaller output files, using less memory.
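To make the memory argument concrete, the sketch below streams a results file record by record and writes out only the checks that match a filter pattern, so the full database is never held in memory at once. It is a minimal illustration of the batch-filtering idea, not the Calibre RVE implementation; the simple "check-name header followed by indented result lines" layout is an assumption made for the example.

```python
# Conceptual sketch of streaming batch filtering (hypothetical text layout,
# not the actual Calibre RDB format): copy only the result blocks whose
# check name matches the given pattern.
import re

def batch_filter(rdb_path: str, out_path: str, pattern: str) -> int:
    """Write matching check blocks from rdb_path to out_path; return count kept."""
    check_re = re.compile(pattern)
    kept = 0
    writing = False
    with open(rdb_path) as src, open(out_path, "w") as dst:
        for line in src:                  # one line at a time, so memory stays constant
            if not line.startswith(" "):  # assumed convention: unindented lines are check names
                writing = bool(check_re.search(line))
                if writing:
                    kept += 1
            if writing:
                dst.write(line)
    return kept
```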

Debug DRC results databases faster with external batch filters
When working with giga-scale DRC RDBs, design teams can take advantage of external batch filtering operations to save significant time and resources and create more efficient debugging flows. Teams can create custom batch filters in any text editor, then apply them to a DRC RDB without loading the RDB into memory. These custom batch filters can easily be saved and reused on other DRC RDBs, as well as shared across teams (see the sketch below). By generating smaller, targeted DRC results databases faster, design teams can focus their time and resources on debugging critical DRC errors, improving results while reducing time to tapeout.
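As a rough illustration of the reuse idea, the snippet below loads patterns from a plain-text filter file (one pattern per line, created in any editor) and applies them to several results databases using the `batch_filter` sketch above. The file names, filter-file layout, and `load_filters` helper are hypothetical placeholders for the example, not Calibre RVE syntax.

```python
# Hypothetical reuse of a shared filter file across several results databases,
# building on the batch_filter() sketch above. All file names are placeholders.
from pathlib import Path

def load_filters(filter_path: str) -> str:
    """Combine a one-pattern-per-line filter file into a single regex alternation."""
    patterns = [ln.strip() for ln in Path(filter_path).read_text().splitlines() if ln.strip()]
    return "|".join(patterns)

pattern = load_filters("critical_checks.filter")      # e.g. patterns for M1 spacing, VIA2 enclosure
for rdb in ["blockA.rdb", "blockB.rdb", "top.rdb"]:
    kept = batch_filter(rdb, rdb.replace(".rdb", ".filtered.rdb"), pattern)
    print(f"{rdb}: kept {kept} matching checks")
```

Because the same filter file drives every run, a team can maintain one agreed-upon set of critical checks and apply it consistently across blocks and chip revisions.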



