NeuroSim Simulator for Compute-in-Memory Hardware Accelerator: Validation and Benchmark


Abstract: "Compute-in-memory (CIM) is an attractive solution to process the extensive workloads of multiply-and-accumulate (MAC) operations in deep neural network (DNN) hardware accelerators. A simulator with options of various mainstream and emerging memory technologies, architectures, and networks can be a great convenience for fast early-stage design space exploration of CIM hardw...

Will Monolithic 3D DRAM Happen?


As DRAM scaling slows, the industry will need to look for other ways to keep pushing for more and cheaper bits of memory. The most common way of escaping the limits of planar scaling is to add the third dimension to the architecture. There are two ways to accomplish that. One is in a package, which is already happening. The second is to scale the die into the Z axis, which has been a to...

FeFETs Bring Promise And Challenges


Ferroelectric FETs (FeFETs) and memory (FeRAM) are generating high levels of interest in the research community. Based on a physical mechanism that hasn’t yet been commercially exploited, they join the other interesting new physics ideas that are in various stages of commercialization. “FeRAM is very promising, but it's like all promising memory technologies — it takes a while to get b...

5G as a wireless power grid


Abstract: "5G has been designed for blazing fast and low-latency communications. To do so, mm-wave frequencies were adopted and allowed unprecedentedly high radiated power densities by the FCC. Unknowingly, the architects of 5G have thereby created a wireless power grid capable of powering devices at ranges far exceeding the capabilities of any existing technologies. However, this potential c...

Week In Review: Auto, Security, Pervasive Computing


COVID-19/Medical
Mentor's parent company Siemens is making its Additive Manufacturing (AM) Network, along with its 3D printers, available to the global medical community. MEMS is at the forefront of SARS-CoV-2 testing, writes Alissa M. Fitzgerald, founder of AMFitzgerald, in a blog on SEMI.org. Fitzgerald points out a MEMS silicon PCR chip, developed by Northrup et al. at Lawrence Livermore...

Scaling Up Compute-In-Memory Accelerators


Researchers are zeroing in on new architectures to boost performance by limiting the movement of data in a device, but this is proving to be much harder than it appears. The argument for memory-based computation is familiar by now. Many important computational workloads involve repetitive operations on large datasets. Moving data from memory to the processing unit and back — the so-called ...

Challenges In Printed And Disposable Chips


Printing inexpensive chips using technology developed for newspapers and magazines is gaining traction across a wide range of applications, from photovoltaic cells to sensors on a flexible substrate. But it's also adding a slew of new challenges that are unique to this approach. The world of flexible hybrid electronics (FHE) — printing integrated circuits on or attaching thin IC chips to a...

Stacking Memory On Logic, Take Two


True 3D-ICs, where a memory die is stacked on top of a logic die using through-silicon vias, appear to be gaining momentum. There are a couple of reasons why this is happening, and a handful of issues that need to be weighed before seriously considering this option. None of this is easy. On a scale of 1 to 10, this ranks somewhere around 9.99, in part because the EDA tools needed to rem...

System Bits: Sept. 17


Quantum computing R&D in Germany
IBM is teaming with the Fraunhofer Society for research and development of quantum computing technology, backed by the German government, which is providing €650 million (about $715.4 million) in funding over two years for the program. IBM has agreed to install a Q System One system at one of its facilities in Germany for the program. The system has 20...

System Bits: Aug. 5


Algorithm could advance quantum computing
Scientists at the Los Alamos National Laboratory report the development of a quantum computing algorithm that promises to provide a better understanding of the quantum-to-classical transition, enabling model systems for biological proteins and other advanced applications. “The quantum-to-classical transition occurs when you add more and more parti...
