Addressing Library Characterization And Verification Challenges Using ML


At advanced process nodes, Liberty or library (.lib) requirements are more demanding due to design complexity, the increased number of corners required for timing signoff, and the need for statistical variation modeling. This results in an increase in the size, complexity, and number of .lib characterizations. Validating and verifying these large, complex .lib files is a challenging task...

Inverse Design of Inflatable Soft Membranes Through Machine Learning


Abstract "Across fields of science, researchers have increasingly focused on designing soft devices that can shape-morph to achieve functionality. However, identifying a rest shape that leads to a target 3D shape upon actuation is a non-trivial task that involves inverse design capabilities. In this study, a simple and efficient platform is presented to design pre-programmed 3D shapes starting...

Growth Spurred By Negatives


The success and health of the semiconductor industry are driven by the insatiable appetite for increasingly complex devices that impact every aspect of our lives. The number of design starts for the chips used in those devices drives the EDA industry. But at no point in history have there been as many market segments driving innovation as there are today. Moreover, there is no indication this...

Greener Design Verification


Chip designs are optimized for lower cost, better performance, or lower power. The same cannot be said about verification, where today very little effort is spent on reducing execution cost, run time, or power consumption. Admittedly, one is a per unit cost while the other is a development cost, but could the industry be doing more to make development greener? It can take days for regression...

Next Generation Reservoir Computing


Abstract: "Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems using observed time-series data. Importantly, it requires very small training data sets, uses linear optimization, and thus requires minimal computing resources. However, the algorithm uses randomly sampled matrices to define the underlying recurrent neural n...
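The abstract's key points — random, untrained reservoir matrices plus a linear-only training step — can be illustrated with a minimal echo state network sketch. This is an illustrative example of classic reservoir computing, not the paper's "next generation" variant; the sine-wave task, reservoir size, and ridge parameter are all assumptions chosen for a small demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (an assumption for this sketch): one-step-ahead prediction
# of a sine wave from its current sample.
T = 500
u = np.sin(0.2 * np.arange(T + 1))
X_in, y = u[:-1], u[1:]

N = 100                                       # reservoir size
W_in = rng.uniform(-0.5, 0.5, (N, 1))         # random input weights (never trained)
W = rng.uniform(-0.5, 0.5, (N, N))            # random recurrent weights (never trained)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

# Drive the fixed random reservoir with the input signal.
states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in[:, 0] * X_in[t])
    states[t] = x

# Only the linear readout is trained, via ridge regression --
# the "linear optimization" the abstract refers to.
ridge = 1e-8
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ y)

pred = states @ W_out
washout = 50                                  # discard the initial transient
rmse = np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2))
print(f"RMSE = {rmse:.2e}")
```

Because only `W_out` is fit, training reduces to a single linear solve — which is why the method needs little data and minimal compute compared with training a full recurrent network.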

Is Programmable Overhead Worth The Cost?


Programmability has fueled the growth of most semiconductor products, but how much does it actually cost? And is that cost worth it? The answer is more complicated than a simple efficiency formula. It can vary by application, by maturity of technology in a particular market, and in the context of much larger systems. What's considered important for one design may be very different for anothe...

How Inferencing Differs From Training in Machine Learning Applications


Machine learning (ML)-based approaches to system development employ a fundamentally different style of programming than that historically used in computer science. This approach uses example data to train a model, enabling the machine to learn how to perform a task. ML training is highly iterative, with each new piece of training data generating trillions of operations. The iterative nature of the tr...

Manufacturing Shifts To AI Of Things


AI is being infused into the Internet of Things, setting the stage for significant improvements in manufacturing productivity, improved uptime, and reduced costs — regardless of market segment. The traditional approach to improving manufacturing equipment reliability and efficiency is regular scheduled maintenance. While that is an improvement over just fixing or replacing equipment when i...

Amdahl Limits On AI


Software and hardware both place limits on how fast an application can run, but finding and eliminating the limitations is becoming more important in this age of multicore heterogeneous processing. The problem is certainly not new. Gene Amdahl (1922-2015) recognized the issue and published a paper about it in 1967. It provided the theoretical speedup for a defined task that could be expected...
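The theoretical speedup from Amdahl's 1967 paper is easy to state: if a fraction p of a task can be parallelized across n processors, the overall speedup is 1 / ((1 - p) + p/n). A short sketch makes the ceiling concrete (the 95% figure below is just an illustrative assumption):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup of a task where fraction p is parallelized across n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelized, speedup saturates quickly:
print(amdahl_speedup(0.95, 8))       # roughly 5.9x on 8 processors
print(amdahl_speedup(0.95, 10**6))   # approaches the 20x ceiling, 1 / (1 - 0.95)
```

The serial fraction (1 - p) dominates as n grows, which is why adding cores to a heterogeneous system yields diminishing returns unless the serial portion itself is reduced.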

The Future Of Smart Cameras Is 64-Bit Processing


The future of smart camera technology brings with it profound transformations in the way we interact with each other and the world around us. From smart cities that are safer and more efficient to rainforests that are monitored for illegal logging, the need for advanced vision technology is growing. Diverse and complex use cases leveraging artificial intelligence (AI) and machine lea...
