A new technical paper titled “A Comprehensive Technique Based on Machine Learning for Device and Circuit Modeling of Gate-All-Around Nanosheet Transistors” was published by researchers at National Yang Ming Chiao Tung University.
Abstract (excerpt)
“Machine learning (ML) is poised to play an important part in advancing the predicting capability in semiconductor device compact modeling domain. One major advantage of ML-based compact modeling is its ability to capture complex relationships and patterns in large datasets. Therefore, in this paper a novel design scheme based on dynamically adaptive neural network (DANN) is proposed to develop fast and accurate compact model (CM). This framework constitutes a powerful yet computationally efficient methodology and exhibits emergent dynamic behaviors. This paper demonstrates that the compact model based on ML can be designed to replicate the performance of conventional compact model for nanodevices. For this work, gate-all-around (GAA) nanosheet (NS) device characteristics are comprehensively analyzed for process variability sources using the proposed model.”
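To illustrate the general idea of an ML-based compact model, the sketch below trains a small neural network to reproduce device I-V characteristics. This is a minimal generic example, not the paper's dynamically adaptive neural network (DANN): the analytic reference_ids() function is a hypothetical stand-in for TCAD or conventional compact-model data, and the network size, bias ranges, and hyperparameters are assumptions for demonstration only.

# Minimal sketch of an ML-based compact model: a neural-network surrogate
# trained to reproduce I-V data. Not the paper's DANN; the analytic
# "reference" model below is a hypothetical stand-in for TCAD/compact-model data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def reference_ids(vgs, vds, vth=0.35, k=2e-3, lam=0.05):
    # Smooth long-channel-style drain current (A) used as stand-in training data.
    vov = 0.025 * np.logaddexp(0.0, (vgs - vth) / 0.025)  # softplus overdrive
    vdsat = np.minimum(vds, vov)
    return k * (vov * vdsat - 0.5 * vdsat**2) * (1.0 + lam * vds)

# Sample bias points and evaluate the reference model.
vgs = rng.uniform(0.0, 0.8, 5000)
vds = rng.uniform(0.0, 0.8, 5000)
X = np.column_stack([vgs, vds])
y = np.log10(reference_ids(vgs, vds) + 1e-12)  # fit log(Id) to cover the dynamic range

# Train a small multilayer perceptron as the surrogate compact model.
scaler = StandardScaler().fit(X)
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                   max_iter=3000, random_state=0).fit(scaler.transform(X), y)

# Query the surrogate at new bias points and compare with the reference.
Xq = np.array([[0.6, 0.4], [0.8, 0.8]])
ids_pred = 10.0 ** mlp.predict(scaler.transform(Xq))
ids_ref = reference_ids(Xq[:, 0], Xq[:, 1])
print(ids_pred, ids_ref)

In practice, the inputs would also include geometry and process-variation parameters (e.g., nanosheet width and thickness), so the trained network can be evaluated quickly across variability corners instead of rerunning full device simulations.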
Find the technical paper here. Published October 2023.
R. Butola, Y. Li and S. R. Kola, “A Comprehensive Technique Based on Machine Learning for Device and Circuit Modeling of Gate-All-Around Nanosheet Transistors,” in IEEE Open Journal of Nanotechnology, vol. 4, pp. 181-194, 2023, doi: 10.1109/OJNANO.2023.3328425.
Related Reading
What Designers Need To Know About GAA
Gate-all-around is set to replace finFET, but it brings its own set of challenges and unknowns.