The main objective of this study is to develop an algorithmic pipeline that recognizes human locomotion activities using motion sensor data from smartphones. The pipeline aims to minimize classification errors caused by individual differences and by variations in sensor measurement locations. In particular, the dataset provided for the 2024 SHL recognition challenge comprises three types of sensor modalities, with certain motion sensor modalities randomly missing. To address this challenge, our team, 'HELP', presents an algorithmic pipeline that combines a convolutional neural network architecture with hand-crafted feature engineering to accommodate diverse features from the motion sensor modalities. We also specify the preprocessing schemes used to create the augmented input for training the proposed pipeline. We then conduct experiments comparing its performance with that of existing machine learning classifiers, verifying its relative superiority.