Multi-Modal Locomotion Mode Recognition in the Real World for Robotic Hip Complex Exoskeletons
Hyesoo Shin, Sangdo Kim, Sunwoo Kim, Jongwon Lee, Jinkyu Kim, KangGeon Kim
IF 5.3 · IEEE Robotics and Automation Letters
Abstract

Lower limb exoskeletons assist users by supporting joint movements. Since joint motion patterns vary depending on how the user moves, accurately recognizing the type of movement (locomotion mode) is crucial for controlling the exoskeleton and ensuring user safety. Inspired by how humans use multiple types of sensory information to control movement, we developed a multi-modal locomotion mode recognition (LMR) system that uses both mechanical and visual sensor data to identify locomotion modes. Our approach employs two fusion methods: intermediate fusion, which combines the data at the feature level, and late fusion, which integrates the sensor data by averaging the recognition results from each sensor. By fusing these two modalities, prediction accuracy improved by an average of 11.7% on the test data. Through comparisons with uni-modal LMR systems that rely on a single type of sensor data, we found that the improved performance of the multi-modal LMR system stems from the visual information's ability to generalize across different users' gait patterns and the mechanical sensor data's consistency within the same classes.
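The two fusion strategies named in the abstract can be illustrated with a minimal sketch. The class count, feature dimensions, and linear classification head below are illustrative assumptions, not the paper's actual model or sensor setup.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical per-modality logits for four locomotion modes
# (e.g. level walk, ramp ascent, stair ascent, stair descent).
mech_logits = np.array([2.0, 0.5, 0.1, -1.0])    # mechanical sensors
vision_logits = np.array([1.2, 1.5, 0.0, -0.5])  # visual features

# Late fusion: average the per-modality class probabilities,
# then pick the most likely mode.
late_probs = (softmax(mech_logits) + softmax(vision_logits)) / 2
late_mode = int(np.argmax(late_probs))

# Intermediate fusion: concatenate modality feature vectors and
# pass them through a shared classifier (here a stand-in,
# untrained linear head with random weights).
mech_feat = np.array([0.3, -0.2])
vision_feat = np.array([0.8, 0.1])
fused_feat = np.concatenate([mech_feat, vision_feat])
rng = np.random.default_rng(0)
W = rng.normal(size=(4, fused_feat.size))  # placeholder weights
inter_probs = softmax(W @ fused_feat)
```

In a trained system, the linear head and per-modality encoders would be learned jointly; the sketch only shows where each fusion step sits in the pipeline.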

Keywords
Exoskeleton · Modal · Computer science · Mode (computer interface) · Artificial intelligence · Computer vision · Human–computer interaction · Simulation · Materials science
Type
article
IF / Citations
5.3 / 0
Publication year
2025