
New Uses For AI In Chips


Artificial intelligence is being deployed across a number of new applications, from improving performance and reducing power in a wide range of end devices to spotting irregularities in data movement for security reasons. While most people are familiar with using machine learning and deep learning to distinguish between cats and dogs, emerging applications show how this capability can be use... » read more

AI At The Edge: Optimizing AI Algorithms Without Sacrificing Accuracy


The ultimate measure of success for AI will be how much it increases productivity in our daily lives. However, the industry has huge challenges in evaluating progress. The vast array of AI applications is in constant churn: finding the right algorithm, optimizing the algorithm, and finding the right tools. In addition, complex hardware engineering is rapidly being updated with many different s... » read more

MIT: Stackable AI Chip With Lego-style Design


New technical paper titled "Reconfigurable heterogeneous integration using stackable chips with embedded artificial intelligence" from researchers at MIT, along with Harvard University, Tsinghua University, Zhejiang University, and others. Partial Abstract: "Here we report stackable hetero-integrated chips that use optoelectronic device arrays for chip-to-chip communication and neuromorphic... » read more

Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices


Abstract:  "Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory, in particular Phase Change Memory (PCM), for software-equivalent accurate i... » read more

Accelerating Inference of Convolutional Neural Networks Using In-memory Computing


Abstract: "In-memory computing (IMC) is a non-von Neumann paradigm that has recently established itself as a promising approach for energy-efficient, high-throughput hardware for deep learning applications. One prominent application of IMC is that of performing matrix-vector multiplication in O(1) time complexity by mapping the synaptic weights of a neural-network layer to the devices of a... » read more
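The constant-time matrix-vector multiplication idea behind IMC can be sketched in a few lines of Python: weights become device conductances, the input becomes applied voltages, and every output current sums in parallel via Ohm's and Kirchhoff's laws. This is a minimal, idealized simulation; the function name, noise model, and values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def crossbar_mvm(weights, x, noise_std=0.0, rng=None):
    """Simulate an IMC crossbar matrix-vector multiply.

    weights -> programmed device conductances, x -> input voltages.
    Each output is the dot product of one weight row with x; in hardware
    all rows are computed simultaneously (hence O(1) time complexity).
    Optional Gaussian noise models analog device non-idealities.
    """
    rng = rng or np.random.default_rng(0)
    # Conductances deviate from the target weights when noise_std > 0.
    conductances = weights + noise_std * rng.standard_normal(weights.shape)
    return conductances @ x  # one parallel "analog" step

W = np.array([[0.2, -0.5], [1.0, 0.3]])
x = np.array([1.0, 2.0])
print(crossbar_mvm(W, x))  # ideal (noise-free) result: [-0.8, 1.6]
```

With `noise_std > 0` the output degrades gracefully rather than failing outright, which is why accuracy-retention techniques are a central topic in analog AI accelerator research.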

Startup Funding: December 2020


AI hardware startups were hot in our December startup-funding focus, with two companies landing rounds exceeding $100M and plenty of others seeing investment. Two Chinese EDA companies received funding in a bid to boost the country's semiconductor ecosystem. One company providing control systems for fabs raised an $8M Series A, and both autonomous driving and electric vehicles pulled in lots ... » read more

5 Predictions For AI Innovation In 2021


By Arun Venkatachar and Stelios Diamantidis

Artificial intelligence (AI) has emerged as one of the most important watchwords in all of technology. The once-utopian vision of developing machines that can think and behave like humans is becoming more of a reality as engineering innovations enable the performance required to process and interpret previously unimaginable amounts of data efficien... » read more

The Emergence Of Hardware As A Key Enabler For The Age Of Artificial Intelligence


Over the past few decades, software has been the engine of innovation for countless applications. From PCs to mobile phones, well-defined hardware platforms and instruction set architectures (ISA) have enabled many important advancements across vertical markets. The emergence of abundant-data computing is changing the software-hardware balance in a dramatic way. Diverse AI applications in fa... » read more

Power/Performance Bits: July 21


AI hardware

Researchers at Purdue University, University of California San Diego, Argonne National Laboratory, University of Louisville, Brookhaven National Laboratory, and University of Iowa developed hardware that can learn skills, offloading some of the energy needed by AI software. "Software is taking on most of the challenges in AI. If you could incorporate intelligence into the circui... » read more
