A technical paper titled “Retention-aware zero-shifting technique for Tiki-Taka algorithm-based analog deep learning accelerator” was published by researchers at Pohang University of Science and Technology, Korea University, and Kyungpook National University.
“We present the fabrication of 4 K-scale electrochemical random-access memory (ECRAM) cross-point arrays for analog neural network training accelerators and the electrical characteristics of an 8 × 8 ECRAM array with 100% yield, showing excellent switching characteristics and low cycle-to-cycle and device-to-device variations. Leveraging the advances of the ECRAM array, we showcase its efficacy in neural network training using the Tiki-Taka version 2 algorithm (TTv2), tailored for non-ideal analog memory devices,” states the paper.
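For readers unfamiliar with Tiki-Taka-style training, the sketch below illustrates the core idea in plain NumPy: gradient updates land on a fast auxiliary array A, while a slow array C holding the consolidated weights is adjusted only when a digitally filtered copy of A accumulates past a threshold (the low-pass transfer that distinguishes TTv2 from the original Tiki-Taka). This is a minimal illustration of the algorithm's structure, not the paper's implementation; the toy regression task, the array names A, C, and H, and all hyperparameter values are illustrative assumptions, and no ECRAM device physics or retention-aware zero-shifting is modeled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression target: Y = X @ w_true, trained with a
# Tiki-Taka-style two-matrix scheme instead of plain SGD.
n_in, n_out, n_samples = 8, 1, 256
X = rng.standard_normal((n_samples, n_in))
w_true = rng.standard_normal((n_in, n_out))
Y = X @ w_true

# A: "fast" array that receives every gradient update (the analog device).
# C: "slow" array holding the consolidated weights.
# H: digital buffer that low-pass filters transfers from A to C (TTv2 idea).
A = np.zeros((n_in, n_out))
C = np.zeros((n_in, n_out))
H = np.zeros((n_in, n_out))

lr = 0.05            # learning rate for updates onto A (illustrative)
transfer_lr = 0.1    # fraction of A accumulated into H per transfer event
threshold = 0.5      # H must exceed this before C is touched
dw = 0.1             # granularity of each update applied to C
transfer_every = 10  # steps between transfer events

for step in range(4000):
    i = rng.integers(n_samples)
    x, y = X[i : i + 1], Y[i : i + 1]
    err = x @ (A + C) - y            # forward pass uses the combined weight
    A -= lr * (x.T @ err)            # rank-one outer-product update onto A only
    A *= 0.999                       # crude stand-in for device decay toward zero
    if step % transfer_every == 0:
        H += transfer_lr * A         # accumulate A into the digital buffer
        mask = np.abs(H) > threshold
        C[mask] += dw * np.sign(H[mask])  # only persistent signal reaches C
        H[mask] = 0.0                # reset the buffer where a transfer fired

print("final MSE:", float(np.mean((X @ (A + C) - Y) ** 2)))
```

The appeal of this structure for non-ideal analog memory is that random update noise and asymmetric switching on A tend to average out in the filtered buffer before they ever reach C, so the consolidated weights track only the persistent gradient signal.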
Find the technical paper at the DOI below. Published June 2024.
Kyungmi Noh et al., “Retention-aware zero-shifting technique for Tiki-Taka algorithm-based analog deep learning accelerator,” Sci. Adv. 10, eadl3350 (2024). DOI: 10.1126/sciadv.adl3350
Related Reading
Running More Efficient AI/ML Code With Neuromorphic Engines
Once a buzzword, neuromorphic engineering is gaining traction in the semiconductor industry.