Correlation Recurrent Units: A Novel Neural Architecture for Improving the Predictive Performance of Time-Series Data
Sunghyun Sim, Dohee Kim, Hyerim Bae
IF 20.8 · IEEE Transactions on Pattern Analysis and Machine Intelligence
Abstract

Time-series forecasting (TSF) is a traditional problem in the field of artificial intelligence, and models such as recurrent neural networks, long short-term memory, and gated recurrent units have contributed to improving its predictive accuracy. Furthermore, model structures that incorporate time-series decomposition methods, such as seasonal-trend decomposition using LOESS (STL), have been proposed. However, this approach trains an independent model for each component and therefore cannot learn the relationships between the time-series components. In this study, we propose a new neural architecture called a correlation recurrent unit (CRU) that can perform time-series decomposition within a neural cell and learn the correlations (autocorrelation and cross-correlation) between the decomposition components. The proposed neural architecture was evaluated through comparative experiments with previous studies using four univariate and four multivariate time-series datasets. The results showed that both long- and short-term predictive performance improved by more than 10%. The experimental results indicate that the proposed CRU is an excellent method for TSF problems compared to other neural architectures.

Keywords
Univariate, Artificial neural network, Autocorrelation, Recurrent neural network, Computer science, CRU, Time series, Artificial intelligence, Series (stratigraphy), Multivariate statistics
Type
article
IF / Citations
20.8 / 14
Publication year
2023
