PerAct-L: Linear Self-Attention Mechanism For Computational Optimization of Robotic Transformer
Yejin Lee, Hoyoung Song, Seong-Hwan Kim, Taekjin Jang, Young-Lim Choi, Hyun-Seok Kim
Journal of Korea Multimedia Society
Abstract

In this study, we propose PerAct-L, an enhanced version of the PerAct robot manipulation model based on the Perceiver IO architecture, designed specifically to reduce its computational complexity. PerAct-L incorporates a linear low-rank self-attention mechanism that projects the Key and Value matrices into a fixed low-dimensional space. This reduces the time complexity of the attention operation from O(N²) to O(N·k), where k is the fixed projection dimension, lowering both computational cost and memory usage. Experiments in the RLBench environment show that PerAct-L improves the success rate by 13.6% over the original PerAct model on the complex "Put Item in Drawer" task. In a previously unseen setting with an expanded rotation range, it achieves a 12.8% performance gain. Further analysis of the latent vector activation distribution after cross-attention reveals that PerAct-L distributes information more evenly, suggesting that the computational optimization also contributes to representational diversity and robustness in manipulation. These findings validate that low-rank attention mechanisms can improve not only real-time performance and adaptability but also expressivity, demonstrating the structural scalability and practical potential of PerAct-L.
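
The low-rank attention described in the abstract follows the Linformer-style idea of compressing the sequence axis of the Key and Value matrices before the attention product, so the score matrix is N×k rather than N×N. Below is a minimal PyTorch sketch of that mechanism under stated assumptions; the class name, the learned compression matrices, and all dimensions are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LowRankSelfAttention(nn.Module):
    """Linformer-style self-attention sketch: Keys and Values are compressed
    along the sequence axis from length N to a fixed rank k, so the attention
    map is (N x k) instead of (N x N), i.e. O(N*k) time and memory."""

    def __init__(self, dim: int, num_heads: int, seq_len: int, rank: int):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5

        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

        # Learned matrices that project the sequence axis N -> k (assumed form).
        self.k_compress = nn.Parameter(torch.randn(seq_len, rank) / seq_len ** 0.5)
        self.v_compress = nn.Parameter(torch.randn(seq_len, rank) / seq_len ** 0.5)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, D = x.shape
        H, Hd = self.num_heads, self.head_dim

        q = self.q_proj(x).view(B, N, H, Hd).transpose(1, 2)  # (B, H, N, Hd)
        k = self.k_proj(x).view(B, N, H, Hd).transpose(1, 2)  # (B, H, N, Hd)
        v = self.v_proj(x).view(B, N, H, Hd).transpose(1, 2)  # (B, H, N, Hd)

        # Compress K and V along the sequence axis: (B, H, N, Hd) -> (B, H, k, Hd).
        k = torch.einsum("bhnd,nk->bhkd", k, self.k_compress)
        v = torch.einsum("bhnd,nk->bhkd", v, self.v_compress)

        # Score matrix is (N x k); the full (N x N) map is never materialized.
        attn = torch.softmax((q @ k.transpose(-2, -1)) * self.scale, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, D)
        return self.out_proj(out)


# Usage: a sequence of N = 1024 tokens attends through a rank-64 bottleneck.
x = torch.randn(2, 1024, 256)
attn = LowRankSelfAttention(dim=256, num_heads=8, seq_len=1024, rank=64)
print(attn(x).shape)  # torch.Size([2, 1024, 256])
```

One trade-off of this formulation is that the compression matrices are tied to a fixed sequence length, which fits the fixed-size voxel/latent token layouts of Perceiver IO-style models but requires padding or re-projection for variable-length inputs.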

Keywords
Robustness, Adaptability, Scalability, Computational complexity theory, Robot, Computational model, Mechanism
Type
article
IF / Citations
- / 0
Publication Year
2025
