Low Power Still Leads, But Energy Emerges As Future Focus


In 2021 and beyond, chips used in smartphones, digital appliances, and nearly all major applications will need to go on a diet. As the amount of data being generated continues to swell, more processors are being added everywhere to sift through that data to determine what's useful, what isn't, and how to distribute it. All of that uses power, and not all of it is being done as efficiently as... » read more

Convolutional Neural Network With INT4 Optimization


Xilinx provides an INT8 AI inference accelerator on Xilinx hardware platforms — the Deep Learning Processor Unit (XDPU). However, in some resource-constrained, high-performance, low-latency scenarios (such as power-sensitive edge devices and low-latency ADAS applications), low-bit quantization of neural networks is required to achieve lower power consumption and higher performance than provi... » read more
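
As a rough illustration of what low-bit quantization means in practice, the NumPy sketch below maps floating-point weights onto the signed INT4 range [-8, 7] with a single per-tensor scale. This is a generic symmetric scheme chosen for demonstration only; it is not the XDPU's actual quantization method, and the bit width and tensor shape are assumptions.

```python
# Generic symmetric low-bit quantization sketch (illustrative only;
# not the XDPU's actual INT4 scheme). Bit width and shapes are assumptions.
import numpy as np

def quantize_symmetric(weights: np.ndarray, num_bits: int = 4):
    """Quantize a float tensor to signed integers of the given bit width."""
    qmax = 2 ** (num_bits - 1) - 1           # 7 for INT4
    qmin = -(2 ** (num_bits - 1))            # -8 for INT4
    scale = np.abs(weights).max() / qmax     # per-tensor scale factor
    q = np.clip(np.round(weights / scale), qmin, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map integer codes back to approximate float values."""
    return q.astype(np.float32) * scale

# Example: quantize random conv-style weights and check reconstruction error.
w = np.random.randn(64, 3, 3, 3).astype(np.float32)
q4, s = quantize_symmetric(w, num_bits=4)
print("mean abs error at INT4:", np.abs(w - dequantize(q4, s)).mean())
```

Halving the bit width from INT8 to INT4 halves weight storage and memory traffic, which is where the power and performance benefit comes from, at the cost of larger quantization error that the training or calibration flow must compensate for.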

Brute-Force Analysis Not Keeping Up With IC Complexity


Much of the current design and verification flow was built on brute force analysis, a simple and direct approach. But that approach rarely scales, and as designs become larger and the number of interdependencies increases, ensuring the design always operates within spec is becoming a monumental task. Unless design teams want to keep adding increasing amounts of margin, they have to locate th... » read more

What’s Next In AI, Chips And Masks


Aki Fujimura, chief executive of D2S, sat down with Semiconductor Engineering to talk about AI and Moore’s Law, lithography, and photomask technologies. What follows are excerpts of that conversation. SE: In the eBeam Initiative’s recent Luminary Survey, the participants had some interesting observations about the outlook for the photomask market. What were those observations? Fujimur... » read more

Artificial Intelligence For Sustainable And Energy Efficient Buildings


According to the goals of Europe’s green deal missions, the continent strives to become carbon neutral by 2050. Since buildings are a major contributor to overall energy consumption, improving their energy efficiency can be key to a more sustainable and greener Europe. On the way toward zero-emission buildings, several challenges have to be met: In modern energy systems, several ... » read more

The Benefits Of Using Embedded Sensing Fabrics In AI Devices


AI chips, regardless of the application, are not regular ASICs. They tend to be very large, which means they are reaching the reticle limit in terms of die size. They are also usually dominated by an array of regular structures, which helps to mitigate yield issues by building in tolerance to defect density through the sheer number of processor blocks. The reason behind... » read more
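
To see why an array of identical blocks builds in defect tolerance, the toy model below combines a Poisson per-block yield with a binomial count of good blocks. The defect density, block area, and block counts are assumed values chosen only to illustrate the effect of tolerating a few bad blocks; they are not tied to any real process or product.

```python
# Toy redundancy/yield model: Poisson yield per block, die is sellable if
# at least `needed` of its identical blocks are defect-free. All numbers
# below are illustrative assumptions.
from math import comb, exp

def block_yield(defect_density_per_cm2: float, block_area_cm2: float) -> float:
    """Poisson yield of a single block: Y = exp(-D * A)."""
    return exp(-defect_density_per_cm2 * block_area_cm2)

def die_yield(total_blocks: int, needed: int, y_block: float) -> float:
    """Probability that at least `needed` of `total_blocks` blocks are good."""
    return sum(
        comb(total_blocks, k) * y_block**k * (1 - y_block) ** (total_blocks - k)
        for k in range(needed, total_blocks + 1)
    )

y = block_yield(defect_density_per_cm2=0.1, block_area_cm2=0.05)  # assumed values
print(f"single-block yield: {y:.3f}")
print(f"die yield needing all 400 blocks:  {die_yield(400, 400, y):.3f}")
print(f"die yield tolerating 8 bad blocks: {die_yield(400, 392, y):.3f}")
```

With these assumed numbers, requiring every one of 400 blocks to be perfect yields only a small fraction of good die, while tolerating a handful of bad blocks recovers nearly all of them, which is the tolerance the excerpt describes.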

AI Design In Korea


Like many in the semiconductor design business, Arteris IP is actively working with Korean chip companies. This shouldn’t be a surprise. If a company is building an SoC of any reasonable size, it needs network-on-chip (NoC) interconnect for optimal QoS (bandwidth and latency regulation and system-level arbitration) and low routing congestion, even in application-centric designs such as ... » read more

Compiling And Optimizing Neural Nets


Edge inference engines often run a slimmed-down real-time engine that interprets a neural-network model, invoking kernels as it goes. But higher performance can be achieved by pre-compiling the model and running it directly, with no interpretation — as long as the use case permits it. At compile time, optimizations are possible that wouldn’t be available if interpreting. By quantizing au... » read more
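
The interpret-versus-compile tradeoff can be pictured with a toy two-operator graph: an interpreter dispatches one kernel per node at runtime, while a compiled path fuses the operators into a single callable ahead of time. The sketch below is a deliberately simplified illustration; the graph format and function names are assumptions, not any particular edge runtime's API.

```python
# Toy illustration of interpreting a model graph vs. pre-compiling it.
# Graph format and names are assumptions for demonstration only.
import numpy as np

GRAPH = [("matmul", {"w": np.random.randn(8, 4)}), ("relu", {})]

def run_interpreted(x, graph):
    """Walk the graph at runtime, dispatching one kernel per node."""
    for op, params in graph:
        if op == "matmul":
            x = x @ params["w"]
        elif op == "relu":
            x = np.maximum(x, 0.0)
    return x

def compile_graph(graph):
    """'Compile' the graph once into a single fused callable.
    A real compiler would also fold constants and select quantized kernels."""
    w = graph[0][1]["w"]
    return lambda x: np.maximum(x @ w, 0.0)   # fused matmul + ReLU

x = np.random.randn(2, 8)
fused = compile_graph(GRAPH)
assert np.allclose(run_interpreted(x, GRAPH), fused(x))
```

Once the whole graph is known before the first inference runs, compile-time optimizations such as operator fusion, constant folding, and weight quantization become straightforward, which is the advantage the excerpt points to.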

How ML Enables Cadence Digital Tools To Deliver Better PPA


Artificial intelligence (AI) and machine learning (ML) are emerging as powerful new ways to do old things more efficiently, which is the benchmark that any new and potentially disruptive technology must meet. In chip design, results are measured in many different ways, but common metrics are power (consumed), performance (provided), and area (required), collectively referred to as PPA. These me... » read more

From Data Center To End Device: AI/ML Inferencing With GDDR6


Created to support 3D gaming on consoles and PCs, GDDR packs performance that makes it an ideal solution for AI/ML inferencing. As inferencing migrates from the heart of the data center to the network edge, and ultimately to a broad range of AI-powered IoT devices, GDDR memory’s combination of high bandwidth, low latency, power efficiency and suitability for high-volume applications will be i... » read more
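
For a sense of scale, a back-of-the-envelope calculation gives the per-device bandwidth. The sketch assumes a 16 Gb/s-per-pin GDDR6 device with a 32-bit interface; actual figures vary by vendor and speed grade.

```python
# Back-of-the-envelope GDDR6 bandwidth math (assumed figures; actual data
# rates and interface widths vary by vendor and speed grade).
data_rate_gbps_per_pin = 16      # assumed per-pin data rate in Gb/s
interface_width_bits = 32        # typical per-device interface width
bandwidth_GBps = data_rate_gbps_per_pin * interface_width_bits / 8
print(f"per-device bandwidth: {bandwidth_GBps:.0f} GB/s")  # -> 64 GB/s
```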
