Improved weight initialization for deep and narrow feedforward neural network
Hyun-Woo Lee, Yunho Kim, Seung Yeop Yang, Hayoung Choi
arXiv (Cornell University)
Abstract

Appropriate weight initialization settings, along with the ReLU activation function, have become cornerstones of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The "dying ReLU" problem, where ReLU neurons become inactive and yield zero output, presents a significant challenge in the training of deep neural networks with the ReLU activation function. Theoretical research and various methods have been introduced to address the problem. However, even with these methods, training remains challenging for extremely deep and narrow feedforward networks with the ReLU activation function. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and demonstrate how these properties enable the effective propagation of signal vectors. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the novel initialization method.
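The dying-ReLU failure mode the abstract describes can be observed directly: in a deep, narrow ReLU network with a standard initialization, signal tends to collapse to zero as depth grows. The sketch below is illustrative only and uses ordinary He (Kaiming) initialization as the baseline; the hyperparameters (`depth`, `width`, `n_samples`) are arbitrary choices, not values from the paper, and the paper's proposed initialization is not reproduced here.

```python
import numpy as np

def dead_relu_fraction(depth=100, width=4, n_samples=256, seed=0):
    """Forward random inputs through a deep, narrow ReLU MLP with
    He-normal initialization and return the fraction of output units
    that are zero for every input sample (i.e. "dead" at the output)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, width))
    for _ in range(depth):
        # He (Kaiming) normal initialization: std = sqrt(2 / fan_in)
        W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
        x = np.maximum(0.0, x @ W)  # ReLU activation
    # A unit is "dead" here if it outputs zero for all samples.
    return float(np.mean(np.all(x == 0.0, axis=0)))

# Shallow vs. very deep narrow network: deeper stacks lose more units.
print(dead_relu_fraction(depth=2), dead_relu_fraction(depth=100))
```

Once a sample's activations hit the all-zero vector at some layer, they stay zero for every subsequent layer, which is why narrow widths (here 4 units) make the collapse so likely at large depth.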

Keywords
Initialization · Activation function · Computer science · Artificial neural network · Artificial intelligence · Feedforward neural network · Deep learning · Feed forward · Function (biology) · Machine learning
Type
preprint
IF / Citations
- / 0
Publication year
2023
