Convolutional Neural Network With INT4 Optimization


Xilinx provides an INT8 AI inference accelerator on Xilinx hardware platforms — the Deep Learning Processor Unit (XDPU). However, in some resource-limited, high-performance, low-latency scenarios (such as power-sensitive edge devices and low-latency ADAS applications), low-bit quantization of neural networks is required to achieve lower power consumption and higher performance than provi... » read more
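To make the idea of low-bit quantization concrete, here is a minimal sketch of symmetric per-tensor INT4 quantization in NumPy. This is an illustrative example only, not the actual XDPU quantization scheme; the function names and the choice of a single per-tensor scale are assumptions for the sketch.

```python
import numpy as np

def quantize_int4(x, scale=None):
    """Symmetric INT4 quantization: map float values to integers in [-8, 7]."""
    if scale is None:
        # Fit the largest magnitude into the positive INT4 range.
        scale = np.max(np.abs(x)) / 7.0
    q = np.clip(np.round(x / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_int4(q, scale):
    """Recover approximate float values from INT4 codes."""
    return q.astype(np.float32) * scale

# Example: quantize a small weight vector and measure the round-trip error.
weights = np.array([0.12, -0.5, 0.33, 0.9, -0.07], dtype=np.float32)
q, s = quantize_int4(weights)
recovered = dequantize_int4(q, s)
```

With only 16 representable levels, the round-trip error is bounded by half a quantization step (scale / 2) for values inside the clipping range — the accuracy/efficiency trade-off the article alludes to.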

Manufacturing Bits: Dec. 1


New phase-change materials

The National Institute of Standards and Technology (NIST) has developed an open source machine learning algorithm for use in discovering and developing new materials. NIST’s technology, called CAMEO, has already been used by researchers to discover a new phase-change memory material. CAMEO, which stands for Closed-Loop Autonomous System for Materials Exploration... » read more

Forward And Backward Compatibility In IC Designs


Future-proofing of designs is becoming more difficult due to the accelerating pace of innovation in architectures, end markets, and technologies such as AI and machine learning. Traditional approaches for maintaining market share and analyzing what should be in the next rev of a product are falling by the wayside. They are being replaced by best guesses about market trends and a need to bala... » read more

Week In Review: Design, Low Power


AI Mythic debuted its Analog Matrix Processor for edge AI applications such as smart home, AR/VR, drones, video surveillance, smart city, and industrial. The M1108 AMP combines 108 tiles made up of an array of flash cells and ADCs, a 32-bit RISC-V nano-processor, a SIMD vector engine, SRAM, and a high-throughput Network-on-Chip router. It uses 40nm technology and the company says typical power... » read more

Security Gaps In Open Source Hardware And AI


Semiconductor Engineering sat down to discuss security risks across multiple market segments with Helena Handschuh, security technologies fellow at Rambus; Mike Borza, principal security technologist for the Solutions Group at Synopsys; Steve Carlson, director of aerospace and defense solutions at Cadence; Alric Althoff, senior hardware security engineer at Tortuga Logic; and Joe Kiniry, princi... » read more

What Interested You In 2020


In business you are always told to follow the money, but for us it is more important to follow the readership. If we are not writing what you want to read, then we are missing the mark. I like to review the ones that have garnered the most attention, in part to see if that will influence what I write about for 2021, but also to find out where the industry is looking for the most help. As Sem... » read more

Using AI And Bugs To Find Other Bugs


Debug is starting to be rethought and retooled as chips become more complex and more tightly integrated into packages or other systems, particularly in safety- and mission-critical applications where life expectancy is significantly longer. Today, the predominant bug-finding approaches use the ubiquitous constrained random/coverage driven verification technology, or formal verification techn... » read more

Brute-Force Analysis Not Keeping Up With IC Complexity


Much of the current design and verification flow was built on brute force analysis, a simple and direct approach. But that approach rarely scales, and as designs become larger and the number of interdependencies increases, ensuring the design always operates within spec is becoming a monumental task. Unless design teams want to keep adding increasing amounts of margin, they have to locate th... » read more

Computational Software


Electronics technology is evolving rapidly, becoming pervasive in our lives. There are more smartphones in use than there are people on earth, driver assistance is now common in automobiles, commercial airplanes have increasingly sophisticated infotainment, and wearable health monitors exist for a plethora of applications. Every device is generating and communicating massive amounts of data, inclu... » read more

The Growing Market For Specialized Artificial Intelligence IP In SoCs


Over the past decade, designers have developed silicon technologies that run advanced deep learning mathematics fast enough to explore and implement artificial intelligence (AI) applications such as object identification, voice and facial recognition, and more. Machine vision, which is now often more accurate than a human, is one of the key functions driving new system-on-chip (S... » read more
