The Power Of Big Data: Or How To Make Perfect 30-Minute Brownies In Only 30 Minutes

Ensuring all the design data needed for debug and optimization is captured and available.


You’re scrolling online, and the picture stops you in your tracks, grabs you, captivates you. Glistening chocolate pieces are, determinedly yet slowly, oozing down a moist brownie with a crisped-to-perfection, powdered topping. It sits there, confident, flaking lazily onto a bone-white china plate. It looks delicious—mouthwatering—and, apparently, you can make it with just a 30-minute investment!

It sounds easy: a few ingredients, a couple of bowls, and a well-greased baking sheet. But you know, and we all know, it’s a lie. It’s always been a lie! This is no 30-minute investment. There is the prepping, flour sieving, and going to the store to get the semi-sweet chocolate pieces that you thought you had but, for the life of you, can’t find (although you will find them tomorrow). The 30-minute dream is now a half-day “adventure” of driving, searching, disappointment, and frustration that ultimately leads to making do with “whatever you can muster”—and a sad excuse for a brownie.

It may seem like a leap – a much less chocolatey and dreamy-goodness-centric one – but the design-implementation debug and optimization process works exactly the same way!

What seems like a simple debug exercise into why a block isn’t implementable with your desired power, performance, and area (PPA) targets very quickly turns into a hair-pulling ordeal. You’re trying to work around the unfortunate fact that you didn’t capture all the data that you need, and it’s a 24-hour turnaround (or more) to get it. After five hours of repeatedly trying to design between the lines, it turns out that those semi-sweet chocolate pieces are definitely not there. You have to bite the bullet and go and get the data that you need. And then maybe do it again. And maybe again.

This is how design debug and optimization has always worked. And you’ve always generally ended up with less delicious, less crispy, less delightful brownies made from whatever you could muster. But it doesn’t have to be this way. What you need is a companion who will prepare everything you need for a true 30-minute bake, gathering and analyzing the big data ingredients along the way.

Your tireless preparer

This help is all about the five Ps: “Proper Preparation Prevents Poor Performance.” It’s an always-on, always-ready design-data companion that pre-prepares all the data that you might need, just in case you do. It buddies up industry-standard databases with Synopsys’ unique single data model and curates a broad and consistent view of all your design data such that you never have to go searching again. It’s an autonomous agent, constantly capturing all the design activities that you’re doing in case one day you might need it. And oh, how frequently you do.

The key is that you can worry more about the process of cooking up a design and less about what goes into it. Your companion effortlessly brings focus and efficiency to the process of debugging and optimization. It curates a live, configurable, and filterable ensemble of big data that can be shaped and morphed into whatever works best for you. And if you want to share your data? Maybe show someone where you’re struggling, or maybe show them a neat way to visualize something? It’s just a few clicks and a link-share away.

It’s as if someone knows what you want to cook and prepares all the ingredients in those measured, tiny little ramekins, such that you never have to think about it and can get on with the cooking, I mean designing.

Your teacher

Why is this happening? This is the perennial question when debugging a design. Even if you know what is happening, understanding why is not just a small next step but often a much bigger leap in the overall process. Was it something you did? Something you didn’t do? Something innate to the design, or something endemic to your methodology that will surface in many designs on this technology?

Understanding the why comes from piecing together many parts of the what (state your design is in), synthesizing that with fundamental aspects of the design itself and the target technology process, mixing in some adjunct constraints such as testability and verifiability, and hoping something causal can be extracted. The cause could be logic style, clocks, floorplan-related congestion, or maybe all (or none) of these.

Substituting 90% cocoa chocolate for semi-sweet chunks means you have to up the sugar to offset the bitterness, and designs work in much the same way. What you may assume to be an equivalent piece of logic or macro configuration may need some cajoling to make it work, and knowing what that cajoling is can sometimes be hard.

Your trusty companion is your teacher, using big data and analytics to know how design works; there to lead you in the debug process and help you get to the true root causes of issues much, much faster. Your companion does all this heavy lifting, making data fully cross-probeable, all in the comfort of a web browser and without having to open dozens of design databases individually in the process. Across projects, across designs, across blocks, across any stage of the flow, the data can be brought to bear to show whatever you need to give you that leg up in knowing what to do next. Is this design simply not convergeable at all? Does it just need better architectural choices during synthesis? Does the crosstalk you baked in with this new floorplan have to be relieved before this design will close? All these questions can be answered with the power of big data and the deep data insight that your companion brings.

Your sous chef

You’re the creative force in the design process, but who wouldn’t want another safe set of hands that can not only prepare and analyze with the best of them, but who also brings a character and design flair of their own? This character is shaped from the knowledge built up not just from this design, but potentially from every design your team or entire company has ever done together. Enter stage left, big-data-enabled machine learning and your companion’s predictive and prescriptive analytics abilities.

Imagine that you had a design companion who could take debug to the next level by suggesting multiple recipes—tool consumable recipes (Tcl scripts)—that could help uncover untapped PPA within your design. Then imagine that these recipes would improve over time based on an intimate knowledge of your team’s design style and specific structural or process-related challenges. And then imagine you could leverage all the analytics data you ever created and further augment the system with your own recipes and ideas—a fully programmable way to harness and create a truly data-driven design process. Well, imagine no more.
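To give a flavor of what a “tool-consumable recipe” looks like in practice, here is a purely illustrative Tcl sketch. The commands used (`set_clock_uncertainty`, `set_max_delay`, `report_timing`) are standard Synopsys-shell/SDC commands, but the scenario, object names, and numeric values are hypothetical examples, not actual DesignDash output:

```tcl
# Hypothetical recipe: recover timing on a congested block by
# tightening pre-route margins, then re-checking the worst paths.
# All object names and numeric values below are illustrative only.

# Add extra clock uncertainty so synthesis leaves slack
# to absorb routing detours later in the flow.
set_clock_uncertainty 0.05 [get_clocks core_clk]

# Constrain a known-troublesome path more tightly than the default
# single-cycle budget (example pins on a hypothetical design).
set_max_delay 1.2 -from [get_pins u_dma/req_r/Q] -to [get_pins u_arb/gnt_r/D]

# Re-examine the worst violating paths to confirm the recipe helped.
report_timing -max_paths 10 -slack_lesser_than 0
```

A system like the one described would derive and refine such parameter choices from historical design data, rather than relying on the hand-picked constants shown here.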

With your companion by your side, it’s as if two best buddies (plus all the teammates you want to bring along with you) go on a worldwide cooking odyssey to sample all the best that came before and then create new takes on the best, every step of the way.

Serving up a delicious productivity boost

Synopsys DesignDash is your ideal companion for the challenging task of baking the perfect design in the advertised (or better!) time. Its unique compounding of big-data systems, design-centric data science, and machine learning holistically addresses your design process. It’s also tightly integrated with the Synopsys Digital Design Family, enabling far deeper levels of insight than data-reporting extraction alone could ever offer, so that together, you can write the ultimate recipe book.

With productivity being challenged along so many vectors, DesignDash’s ability to bring efficiency and effectiveness to your team in such a comprehensive way not only helps to deliver new advances in your design space, but also helps to make everyone a smarter cookie, allowing you to amass lots of brownie points along the way.


