Decoupling Strategy to Separate Training and Inference with Three-Dimensional Neuromorphic Hardware Composed of Neurons and Hybrid Synapses
Jung Woo Lee, See‐On Park, Seong‐Yun Yun, Yeeun Kim, Hyun Myung, Shinhyun Choi, Yang‐Kyu Choi
ACS Nano (IF 16)
Abstract

Monolithic 3D integration of neuron and synapse devices is considered a promising solution for energy-efficient and compact neuromorphic hardware. However, achieving optimal performance in both training and inference remains challenging, as these processes require different synapse characteristics: reliable endurance for training and long retention for inference. Here, we introduce a decoupling strategy to separate training and inference using monolithically integrated neuromorphic hardware with layer-by-layer fabrication. This 3D neuromorphic hardware includes neurons consisting of a single transistor (1T-neuron) in the first layer, long-term operational synapses composed of a single thin-film transistor with a SONOS structure (1TFT-synapses) in the second layer for inference, and durable synapses composed of a memristor (1M-synapses) in the third layer for training. A 1TFT-synapse, utilizing a charge-trap layer, exhibits long retention properties favorable for inference tasks. In contrast, a 1M-synapse, leveraging anion movement at the interface, demonstrates robust endurance for repetitive weight updates during training. With the proposed hybrid synapse architecture, frequent training can be performed using the 1M-synapses with robust endurance, while intermittent inference can be managed using the 1TFT-synapses with long-term retention. This decoupling of synaptic functions is advantageous for achieving a reliable spiking neural network (SNN) in neuromorphic computing.

Keywords
Neuromorphic engineering, Synapse, Decoupling (probability), Computer science, Artificial neural network, Spiking neural network, Memristor, Inference, Computer architecture, Artificial intelligence
Type
article
IF / Citations
16 / 4
Publication year
2025