Enabling Training of Neural Networks on Noisy Hardware


Abstract:  "Deep neural networks (DNNs) are typically trained using the conventional stochastic gradient descent (SGD) algorithm. However, SGD performs poorly when applied to train networks on non-ideal analog hardware composed of resistive device arrays with non-symmetric conductance modulation characteristics. Recently we proposed a new algorithm, the Tiki-Taka algorithm, that overcomes t... » read more
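For context, the conventional SGD update the abstract contrasts with analog-hardware training is w ← w − η∇L. A minimal sketch on a noiseless least-squares toy problem (illustrative only; this is plain SGD, not the Tiki-Taka algorithm, and all names and shapes are our assumptions):

```python
import numpy as np

# Conventional minibatch SGD: minimize ||X w - y||^2 over w.
# On ideal digital hardware each update applies exactly; the abstract's
# point is that non-symmetric analog devices distort these updates.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ w_true                                     # noiseless targets

w = np.zeros(4)
lr = 0.05
for _ in range(1000):
    idx = rng.integers(0, len(y), size=8)          # sample a minibatch
    Xb, yb = X[idx], y[idx]
    grad = 2 * Xb.T @ (Xb @ w - yb) / len(yb)      # gradient of the MSE
    w -= lr * grad                                 # SGD step: w <- w - eta * grad

print(np.allclose(w, w_true, atol=1e-2))
```

Because the targets are noiseless, the iterate converges to the true weights; on noisy analog arrays this clean convergence is exactly what breaks down.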

Accelerating Inference of Convolutional Neural Networks Using In-memory Computing


Abstract: "In-memory computing (IMC) is a non-von Neumann paradigm that has recently established itself as a promising approach for energy-efficient, high throughput hardware for deep learning applications. One prominent application of IMC is that of performing matrix-vector multiplication in O(1) time complexity by mapping the synaptic weights of a neural-network layer to the devices of a... » read more
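The O(1) claim refers to time, not work: the crossbar computes every row's dot product simultaneously, whereas a sequential digital implementation performs O(n·m) multiply-accumulates. A software stand-in for the mapped operation (matrix and vector names are our illustrative assumptions, not from the paper):

```python
import numpy as np

# A neural-network layer's weights mapped onto an n x m crossbar,
# modeled here as a dense matrix. In IMC, input voltages drive the
# columns and the row currents sum by Kirchhoff's law, so all n
# outputs appear at once (O(1) time); in software the same reduction
# costs O(n*m) multiply-accumulate operations.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # device conductances stand in for weights
x = rng.standard_normal(3)        # input voltages on the columns

y = W @ x                         # one current per row wire
print(y.shape)
```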

Von Neumann Is Struggling


In an era dominated by machine learning, the von Neumann architecture is struggling to stay relevant. The world has changed from control-centric to data-centric, pushing processor architectures to evolve. Venture money is flooding into domain-specific architectures (DSA), but traditional processors are also evolving. For many markets, they continue to provide an effective s... » read more

Week In Review: Design, Low Power


M&A
Microchip Technology acquired LegUp Computing, a provider of a high-level synthesis compiler that automatically generates high-performance FPGA hardware from software. The LegUp HLS tool will be used alongside Microchip’s VectorBlox Accelerator Software Design kit and VectorBlox Neural Networking IP generator to provide a complete front-end solution stack for C/C++ algorithm develope... » read more

Are Better Machine Training Approaches Ahead?


We live in a time of unparalleled use of machine learning (ML), but it relies on one approach to training the models that are implemented in artificial neural networks (ANNs) — so named because they’re not neuromorphic. But other training approaches, some of which are more biomimetic than others, are being developed. The big question remains whether any of them will become commercially viab... » read more

System Bits: Aug. 27


A ring of 18 carbon atoms
Scientists at IBM Research – Zurich and Oxford University write about allotropes of carbon – the many versions of atomic carbon formations, such as diamonds and graphite. “Carbon, one of the most abundant elements in the universe, can exist in different forms - called allotropes - giving it completely different properties from color to shape to hardness. For... » read more

Blog Review: May 29


Cadence's Meera Collier traces the evolution of computing through the series of bottlenecks the industry has needed to overcome and what's being done to address the latest one. Mentor's Rebecca Lord checks out the use of differential signals to mitigate the effects of electromagnetic interference, noise, and crosstalk in PCBs. Synopsys' Taylor Armerding considers whether Ireland's slow en... » read more

Using Analog For AI


If the only tool you have is a hammer, everything looks like a nail. But development of artificial intelligence (AI) applications and the compute platforms for them may be overlooking an alternative technology—analog. The semiconductor industry has a firm understanding of digital electronics and has been very successful making it scale. It is predictable, has good yield, and while every de... » read more

Week in Review: IoT, Security, Auto


Internet of Things
Is Google developing a Pixel Watch wearable? Perhaps, if recent job listings are any indication. The company recently was looking to hire someone as vice president of hardware engineering, wearables. Last month, Fossil Group sold smartwatch technology intellectual property to Google for $40 million, while Google hired certain members of Fossil’s wearables R&D team. ... » read more

System Bits: July 16


Test tube AI neural network
In a significant step towards demonstrating the capacity to program artificial intelligence into synthetic biomolecular circuits, Caltech researchers have developed an artificial neural network made out of DNA that can solve a classic machine learning problem: correctly identifying handwritten numbers. The work was done in the laboratory of Lulu Qian, assistant p... » read more
