Power/Performance Bits: May 19


Neuromorphic magnetic nanowires


Researchers from the University of Texas at Austin, the University of Texas at Dallas, and Sandia National Laboratories propose a neuromorphic computing method using magnetic components. The team says this approach can cut the energy cost of training neural networks. "Right now, the methods for training your neural networks are very energy-intensive," said Jean Ann... » read more

Blog Review: Dec. 11


Arm's Urmish Thakker investigates ways to make recurrent neural networks run on resource-constrained devices with limited cache and compute by reducing the number of RNN computations, without needing to retrain the original RNN model. Mentor's Brent Klingforth digs into the challenges of designing rigid-flex PCBs and how advanced capabilities in modern tools, like awareness of sta... » read more

The Next Big Chip Companies


Rambus’ Mike Noonen looks at why putting everything on a single die no longer works, what comes after Moore’s Law, and what the new business model looks like for chipmakers.

https://youtu.be/X6Kca8Vm-wA » read more