A Hierarchical And Tractable Mixed-Signal Verification Methodology For First-Generation Analog AI Processors


Artificial intelligence (AI) is now the key driving force behind advances in information technology, big data, and the Internet of Things (IoT). The technology is developing at a rapid pace, particularly in the field of deep learning, where researchers are continually creating new variants that expand the capabilities of machine learning. But building systems th...

Test Challenges Mount As Demands For Reliability Increase


An emphasis on improving semiconductor quality is beginning to spread well beyond data centers and automotive applications, where ICs play a role in mission- and safety-critical systems. But this focus on improved reliability is ratcheting up pressure throughout the test community, from lab to fab and into the field, in products where transistor density continues to grow, and wh...

AI: Engineering Tool Or Threat To Jobs?


Semiconductor Engineering sat down to talk about using AI for designing and testing complex chips with Michael Jackson, corporate vice president for R&D at Cadence; Joel Sumner, vice president of semiconductor and electronics engineering at National Instruments; Grace Yu, product and engineering manager at Meta; David Pan, professor in the Department of Electrical and Computer Engineering a...

HW-SW Co-Design Solution For Building Side-Channel-Protected ML Hardware


A technical paper titled "Hardware-Software Co-design for Side-Channel Protected Neural Network Inference" was published (preprint) by researchers at North Carolina State University and Intel. Abstract: "Physical side-channel attacks are a major threat, enabling attackers to steal confidential data from devices. There has been a recent surge in such attacks on edge machine learning (ML) hardware to extract the...
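
The paper's scheme isn't reproduced in the excerpt; as background, the sketch below shows first-order Boolean masking, the classic software countermeasure this line of work builds on: a secret is split into random shares so no single intermediate value correlates with it. The byte value and variable names are illustrative assumptions, not the authors' design.

```python
# Toy first-order Boolean masking (illustrative sketch, not the paper's
# scheme): split a secret into two random shares so no single intermediate
# value observed by a side channel leaks the secret itself.
import secrets

secret = 0x3C                       # illustrative secret byte
mask = secrets.randbits(8)          # fresh random mask per use
share0, share1 = secret ^ mask, mask

# The device computes only on the shares; XOR-linear operations can be
# applied share-wise. Recombining the shares restores the original value.
assert share0 ^ share1 == secret
print(hex(share0), hex(share1))
```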

Review of Tools & Techniques for DL Edge Inference


A new technical paper titled "Efficient Acceleration of Deep Learning Inference on Resource-Constrained Edge Devices: A Review" was published in "Proceedings of the IEEE" by researchers at the University of Missouri and Texas Tech University. Abstract: Successful integration of deep neural networks (DNNs) or deep learning (DL) has resulted in breakthroughs in many areas. However, deploying thes...

New Method Improves Machine Learning Models’ Reliability, With Fewer Computing Resources (MIT, U. of Florida, IBM Watson)


A new technical paper titled "Post-hoc Uncertainty Learning using a Dirichlet Meta-Model" was published (preprint) by researchers at MIT, the University of Florida, and the MIT-IBM Watson AI Lab (IBM Research). The work demonstrates how a machine learning model can quantify the certainty of its predictions while using fewer computing resources. "Uncertainty quantification is essential for both developers and users o...
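
The paper's meta-model is more elaborate than this, but the evidential idea behind it can be sketched: attach a small head to a frozen classifier that outputs Dirichlet concentration parameters, so that a low total concentration flags predictions made on thin evidence. The feature dimension, class count, and single linear layer below are illustrative assumptions, not the authors' architecture.

```python
# Sketch of Dirichlet-based post-hoc uncertainty (illustrative, not the
# paper's exact design): a small head maps frozen-backbone features to
# Dirichlet concentration parameters alpha; the total concentration acts
# as an evidence score, so uncertainty is high when evidence is low.
import torch
import torch.nn as nn

class DirichletHead(nn.Module):
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # softplus + 1 keeps every concentration alpha_k > 1
        return nn.functional.softplus(self.fc(feats)) + 1.0

head = DirichletHead(feat_dim=128, num_classes=10)  # sizes are made up
feats = torch.randn(4, 128)          # stand-in for frozen-backbone features
alpha = head(feats)
probs = alpha / alpha.sum(-1, keepdim=True)   # expected class probabilities
uncertainty = alpha.size(-1) / alpha.sum(-1)  # K / S: high when evidence is low
print(probs.shape, uncertainty)
```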

The Next Disruption


Machine learning (ML) is an inherently disruptive technology because the algorithm architectures are evolving so fast and are very compute-intensive, requiring innovative silicon for acceptable performance. This blog looks at where we've been and where ML is going: into another market ready for disruption. ML started in the data center. In the early days of the ML explosion, a mere 8 o...

Simulating Reality: The Importance Of Synthetic Data In AI/ML Systems For Radar Applications


Artificial intelligence and machine learning (AI/ML) are driving the development of next-generation radar perception. However, these AI/ML-based perception models require enough data to learn the patterns and relationships needed to make accurate predictions on new, unseen data and scenarios. In the field of radar applications, the data used to train these models is often collected from real-world meas...
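
To give a flavor of what synthetic training data can look like in this domain, the sketch below simulates the beat signal an FMCW radar would receive from a single point target and recovers the target's range with an FFT. Every parameter is an illustrative assumption with no connection to the article's toolchain.

```python
# Toy synthetic radar data: simulate the beat signal of an FMCW radar for
# one point target, add noise, and recover the range from the FFT peak.
# All parameters are illustrative.
import numpy as np

c = 3e8                      # speed of light (m/s)
B, T = 150e6, 1e-3           # chirp bandwidth (Hz) and duration (s)
fs, R = 2e6, 75.0            # sample rate (Hz), true target range (m)

t = np.arange(0, T, 1 / fs)
f_beat = 2 * R * B / (c * T)             # beat frequency for a target at R
signal = np.cos(2 * np.pi * f_beat * t)  # ideal point-target return
signal += 0.5 * np.random.randn(t.size)  # measurement noise

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
r_est = freqs[spec.argmax()] * c * T / (2 * B)
print(f"estimated range: {r_est:.1f} m")  # ~75 m
```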

Using Machine Learning To Automate Debug Of Simulation Regression Results


Regression failure debug is usually a manual process in which verification engineers debug hundreds, if not thousands, of failing tests. Machine learning (ML) technologies have enabled an automated debug process that not only accelerates debug but also eliminates errors introduced by manual effort. This white paper discusses how verification engineers can more efficiently analyze, bin, triage...
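
The white paper's flow isn't excerpted here, but the core binning idea can be illustrated in a few lines: cluster failing-test logs by error-message similarity so each bin points at one likely root cause. The log directory, cluster count, and choice of TF-IDF plus k-means below are all illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of ML-based failure binning: cluster failing-test logs by
# error-message similarity so one engineer debugs one bin, not N tests.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Assumed layout: one plain-text log per failing test in regress_logs/.
logs = {p.name: p.read_text() for p in Path("regress_logs").glob("*.log")}
names = list(logs)

# TF-IDF down-weights boilerplate text shared by every log.
X = TfidfVectorizer(stop_words="english").fit_transform(logs.values())

# Bin failures into signature clusters (k chosen purely for illustration).
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

for bin_id in sorted(set(labels)):
    members = [n for n, lab in zip(names, labels) if lab == bin_id]
    print(f"bin {bin_id}: {len(members)} failures, e.g. {members[:3]}")
```

The payoff is organizational: instead of opening thousands of logs, an engineer inspects one representative failure per bin.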

Achieve 10X Faster CDC Debug Leveraging Machine Learning


Over the years, system-on-chip (SoC) design sizes have crossed the billion-gate mark, and designs have grown more complex to deliver the desired functionality. The number of asynchronous clock and reset domains is growing rapidly within these complex SoCs, leading to millions of clock domain crossing (CDC) violations at the SoC level. Each of these violations ...
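
As a toy illustration of why automated grouping matters at this scale, large numbers of CDC violations commonly trace back to a handful of unsynchronized source signals, so grouping a violation report by root signal collapses millions of line items into a short fix list. The CSV schema below is a made-up stand-in; real CDC tools export their own report formats.

```python
# Sketch of root-cause grouping for CDC violations: many violations often
# share one unsynchronized source signal, so fixing a few roots can clear
# the bulk of the report. The column names here are hypothetical.
import pandas as pd

# Assumed columns: source_signal, src_clk, dst_clk, severity
df = pd.read_csv("cdc_violations.csv")

groups = (df.groupby(["source_signal", "src_clk", "dst_clk"])
            .size()
            .sort_values(ascending=False)
            .rename("violations"))

print(groups.head(10))  # the top few roots usually dominate the count
```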
