
Power/Performance Bits: March 16


Adaptable neural nets
Neural networks go through two phases: training, when weights are set based on a dataset, and inference, when new information is assessed based on those weights. But researchers at MIT, the Institute of Science and Technology Austria, and the Vienna University of Technology propose a new type of neural network that can learn during inference and adjust its underlying equations to...
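
To make the two phases concrete, here is a minimal sketch of the conventional train-then-freeze workflow that the proposed network departs from. The toy model, data, and hyperparameters are illustrative assumptions, not the researchers' architecture.

# Minimal sketch of the conventional two-phase workflow: weights are
# fit during training, then frozen for inference. The toy data and
# linear model are illustrative only, not the article's network.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = 2x + 1 from noisy samples.
X = rng.uniform(-1, 1, size=(100, 1))
y = 2 * X + 1 + 0.05 * rng.standard_normal((100, 1))

# Phase 1: training -- weights are adjusted against the dataset.
w, b = np.zeros((1, 1)), np.zeros(1)
lr = 0.1
for _ in range(500):
    pred = X @ w + b
    err = pred - y
    w -= lr * 2 * (X.T @ err) / len(X)   # gradient of mean squared error
    b -= lr * 2 * err.mean(axis=0)

# Phase 2: inference -- weights stay fixed while new inputs are assessed.
x_new = np.array([[0.5]])
print("prediction:", (x_new @ w + b).item())  # approx. 2.0

The point of contrast: in this standard setup nothing about w or b changes once training ends, whereas the network described above keeps adapting during inference.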

Power/Performance Bits: May 19


Neuromorphic magnetic nanowires
Researchers from the University of Texas at Austin, the University of Texas at Dallas, and Sandia National Laboratories propose a neuromorphic computing method using magnetic components. The team says this approach can cut the energy cost of training neural networks. "Right now, the methods for training your neural networks are very energy-intensive," said Jean Ann...

Power/Performance Bits: March 8


Configurable analog chip
Researchers at Georgia Tech built a new configurable computing device, the Field-Programmable Analog Array (FPAA) SoC, which uses analog technology supported by digital components. It can be up to a hundred times smaller and use a thousand times less electrical power than comparable digital floating-gate configurable devices (a rough illustration of these ratios follows below). Professionals familiar with F...
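
As a back-of-the-envelope illustration of the quoted ratios, the sketch below scales a hypothetical digital baseline by the article's figures. The baseline numbers are invented placeholders, not measurements from the research.

# Illustrating the teaser's claimed ratios: up to 100x smaller area and
# 1000x less power than a comparable digital floating-gate device.
# The digital baseline figures below are hypothetical placeholders.
digital_area_mm2 = 100.0   # assumed baseline, not from the article
digital_power_mw = 500.0   # assumed baseline, not from the article

fpaa_area_mm2 = digital_area_mm2 / 100     # "a hundred times smaller"
fpaa_power_mw = digital_power_mw / 1000    # "a thousand times less power"

print(f"FPAA area:  {fpaa_area_mm2} mm^2")   # 1.0 mm^2
print(f"FPAA power: {fpaa_power_mw} mW")     # 0.5 mW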