Standard Benchmarks For AI Innovation


There is no standard measurement for machine learning performance today, which means there is no single answer for how to build an ML processor that works across all use cases while balancing compute and memory constraints. For years, every group has picked a definition and a test to suit its own needs. This lack of a common understanding of performance hinders customers' buying decis... » read more

Tapping Into Purpose-Built Neural Network Models For Even Bigger Efficiency Gains


Neural networks are a set of algorithms, modeled loosely on the human brain, that ‘learn’ by incorporating new data. Many benefits can be derived from developing purpose-built, “computationally efficient” neural network models. However, to ensure a model is effective, several key requirements need to be considered. One critical conside... » read more
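
The excerpt does not name a particular technique, but depthwise separable convolution is one common route to computational efficiency. The sketch below is a hypothetical back-of-the-envelope comparison (all layer dimensions are made up) of the multiply-accumulate counts for a standard convolution versus its depthwise separable equivalent.

```python
# Back-of-the-envelope multiply-accumulate (MAC) comparison for one layer.
# Standard conv:            H*W * Cin * Cout * K*K
# Depthwise separable conv: H*W * Cin * K*K   (depthwise)
#                         + H*W * Cin * Cout  (1x1 pointwise)

def standard_conv_macs(h, w, c_in, c_out, k):
    return h * w * c_in * c_out * k * k

def depthwise_separable_macs(h, w, c_in, c_out, k):
    depthwise = h * w * c_in * k * k
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise

# Hypothetical layer: 56x56 feature map, 128 -> 128 channels, 3x3 kernel.
std = standard_conv_macs(56, 56, 128, 128, 3)
sep = depthwise_separable_macs(56, 56, 128, 128, 3)
print(f"standard: {std:,} MACs, separable: {sep:,} MACs, "
      f"reduction: {std / sep:.1f}x")
```

For this made-up 3x3 layer, the separable form needs roughly 8x fewer operations, which illustrates the kind of gain purpose-built efficient models are after.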

Infrastructure Impacts Data Analytics


Semiconductor data analytics relies on timely, error-free data from the manufacturing process, but the IT infrastructure investment and engineering effort needed to deliver that data are enormous, expensive, and still growing. The volume of data has ballooned at every point of data generation as equipment makers add more sensors to their tools, and as monitors are embedded into the chip... » read more

Forward And Backward Compatibility In IC Designs


Future-proofing designs is becoming more difficult due to the accelerating pace of innovation in architectures, end markets, and technologies such as AI and machine learning. Traditional approaches to maintaining market share and analyzing what should go into the next rev of a product are falling by the wayside. They are being replaced by best guesses about market trends and a need to bala... » read more

Brute-Force Analysis Not Keeping Up With IC Complexity


Much of the current design and verification flow was built on brute-force analysis, a simple and direct approach. But that approach rarely scales, and as designs grow larger and the number of interdependencies increases, ensuring that a design always operates within spec becomes a monumental task. Unless design teams want to keep adding ever more margin, they have to locate th... » read more
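
As a hypothetical illustration of why brute force stops scaling, the sketch below (variable names and settings invented) counts how many analysis runs exhaustive corner enumeration would require as independent design variables accumulate.

```python
from itertools import product

# Exhaustive (brute force) analysis must visit every combination of
# conditions, so the run count grows exponentially with the number of
# interdependent variables. All values below are invented for illustration.
process = ["ss", "tt", "ff"]          # process corners
voltage = [0.72, 0.80, 0.88]          # supply levels (V)
temperature = [-40, 25, 125]          # junction temps (C)
modes = ["sleep", "idle", "turbo"]    # operating modes

corners = list(product(process, voltage, temperature, modes))
print(f"{len(corners)} combinations for just 4 variables x 3 settings each")

# Ten such variables at 3 settings each would need 3**10 full runs.
print(f"10 variables x 3 settings: {3**10:,} runs")
```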

What’s Next In AI, Chips And Masks


Aki Fujimura, chief executive of D2S, sat down with Semiconductor Engineering to talk about AI and Moore’s Law, lithography, and photomask technologies. What follows are excerpts of that conversation.

SE: In the eBeam Initiative’s recent Luminary Survey, the participants had some interesting observations about the outlook for the photomask market. What were those observations?

Fujimur... » read more

Difficult Memory Choices In AI Systems


The number of memory choices and architectures is exploding, driven by the rapid evolution of AI and machine learning chips designed for a wide range of very different end markets and systems. Models for some of these systems range in size from 10 billion to 100 billion parameters, and they can vary greatly from one chip or application to the next. Neural network training and infer... » read more
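
To see why parameter counts at this scale force hard memory choices, here is a back-of-the-envelope sketch (my own arithmetic, not from the article) of the weight-storage footprint alone; activations, optimizer state, and intermediate buffers add more on top.

```python
# Rough weight-storage footprint: parameters x bytes per parameter.
# (Training roughly triples this with gradients and optimizer state.)

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

def weight_footprint_gb(n_params, dtype):
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

for n in (10e9, 100e9):                      # the 10B-100B range cited
    for dtype in ("fp32", "fp16", "int8"):
        print(f"{n / 1e9:.0f}B params @ {dtype}: "
              f"{weight_footprint_gb(n, dtype):,.0f} GB")
```

Even at int8, a 100B-parameter model needs on the order of 100 GB just for weights, which is far beyond on-chip SRAM and explains why off-chip memory bandwidth and capacity dominate these architecture decisions.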

Deploying Accurate Always-On Face Unlock


Accurate face verification has long been considered a challenge due to the number of variables, ranging from lighting to pose and facial expression. This white paper looks at a new approach — combining classic and modern machine learning (deep learning) techniques — that achieves 98.36% accuracy, running efficiently on Arm ML-optimized platforms, and addressing key security issues such a... » read more
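
The white paper's actual pipeline is not reproduced here, but most face verification systems share a final decision step: compare an embedding of the probe face against the enrolled one and accept if the similarity clears a tuned threshold. The sketch below is a generic illustration of that step, with random vectors standing in for a real embedding network and an arbitrary threshold of 0.6.

```python
import numpy as np

# Generic face-verification decision step: an embedding network (not shown;
# random stand-in vectors here) maps each face crop to a feature vector, and
# two faces "match" when their cosine similarity clears a tuned threshold.

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(enrolled_embedding, probe_embedding, threshold=0.6):
    """Return True if the probe face matches the enrolled face."""
    return cosine_similarity(enrolled_embedding, probe_embedding) >= threshold

rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                        # stored at enrollment
same_person = enrolled + rng.normal(scale=0.3, size=128)   # noisy re-capture
different_person = rng.normal(size=128)                # unrelated face

print(verify(enrolled, same_person))        # True: embeddings nearly align
print(verify(enrolled, different_person))   # False: near-orthogonal vectors
```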

Combining Machine Learning With Advanced Outlier Detection To Improve Quality And Lower Cost


In semiconductor manufacturing, keeping the defect rate of integrated circuits low is crucial. To minimize outgoing device defectivity, thousands of electrical tests are run, measuring tens of thousands of parameters, and die that fall outside the specified limits are treated as failures. However, conventional test techniques often fall short of guaranteeing acceptable quality levels. Given t... » read more
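
The excerpt does not specify the article's method, but a simple example of advanced outlier detection in test is a robust univariate screen in the spirit of dynamic part average testing: flag die whose measurements pass the spec limits yet sit far from the lot's population. The sketch below uses synthetic leakage data and invented limits.

```python
import numpy as np

# Robust univariate outlier screen (a sketch, not the article's method):
# estimate the lot's center and spread with the median and MAD, which are
# insensitive to the outliers themselves, then flag die beyond k sigmas.

def dynamic_outlier_limits(measurements, k=6.0):
    """Median +/- k robust sigmas, with sigma estimated from the MAD."""
    med = np.median(measurements)
    mad = np.median(np.abs(measurements - med))
    robust_sigma = 1.4826 * mad   # MAD -> sigma for a normal distribution
    return med - k * robust_sigma, med + k * robust_sigma

rng = np.random.default_rng(1)
iddq = rng.normal(5.0, 0.2, size=2000)   # synthetic leakage readings (uA)
iddq[:3] = [6.9, 6.8, 7.1]               # in-spec but anomalous die

lo, hi = dynamic_outlier_limits(iddq)
outliers = np.flatnonzero((iddq < lo) | (iddq > hi))
print(f"limits: [{lo:.2f}, {hi:.2f}] uA, flagged die: {outliers}")
```

Die like these would pass a fixed spec limit of, say, 8 uA, yet their distance from the population marks them as reliability risks, which is exactly the gap such screens are meant to close.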

Model Variation And Its Impact On Cell Characterization


EDA (Electronic Design Automation) cell characterization tools have been used extensively to generate timing, power, and noise models at a rapidly growing number of process corners. Today, model variation has become a critical component of cell characterization. Variation in process, voltage, and temperature can shift circuit timing and lead to timing violations, resulting i... » read more
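
As a rough illustration of what variation-aware characterization captures, the sketch below runs a Monte Carlo over one process parameter with a toy delay model (both the model and all numbers are invented) and reports the mean and sigma that a variation-aware timing model would record alongside the nominal value.

```python
import numpy as np

# Hypothetical Monte Carlo sketch of variation-aware characterization:
# perturb a cell's threshold voltage, re-evaluate delay with a toy model,
# and summarize the resulting distribution.

def cell_delay_ps(vth_mv, vdd_v=0.8):
    """Toy delay model: delay rises as threshold voltage approaches VDD."""
    return 40.0 / (vdd_v - vth_mv / 1000.0)

rng = np.random.default_rng(42)
vth_samples = rng.normal(loc=300.0, scale=15.0, size=10_000)  # mV, made up
delays = cell_delay_ps(vth_samples)

mean, sigma = delays.mean(), delays.std()
print(f"nominal: {cell_delay_ps(300.0):.1f} ps, "
      f"mean: {mean:.1f} ps, sigma: {sigma:.2f} ps, "
      f"mean+3sigma: {mean + 3 * sigma:.1f} ps")
```

Signing off timing at mean+3sigma rather than piling fixed margin onto the nominal number is the basic trade such variation models enable.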
