10 May 2024 | Nouf Abdullah Almujally, Danyal Khan, Naif Al Mudawi, Mohammed Alonazi, Abdulwahab Alazeb, Asaad Algarni, Ahmad Jalal, Hui Liu
This article presents a biosensor-driven IoT wearable system for accurate body motion tracking and localization. The system uses smartphone sensors to recognize both physical and location-based human activities, such as walking, running, and jumping, as well as indoor and outdoor contexts. Raw sensor data are processed with a Butterworth filter for the inertial sensors and a median filter for the GPS data, followed by Hamming windowing for segmentation. Features are extracted from the inertial and GPS streams, selected using variance-threshold feature selection, and augmented using permutation-based data augmentation. The augmented features are optimized with the Yeo–Johnson power transformation before being classified by a multi-layer perceptron (MLP). The system is evaluated using K-fold cross-validation on the Extrasensory and Sussex-Huawei Locomotion (SHL) datasets. It achieves 96% and 94% accuracy for physical activities and 94% and 91% for location-based activities on the Extrasensory and SHL datasets, respectively, outperforming previous state-of-the-art methods. The system's contributions include separate denoising filters for inertial and GPS sensors, a robust methodology for concurrent feature extraction, dedicated processing streams for localization and locomotion, a novel data augmentation technique, and an advanced feature optimization algorithm. The research addresses key challenges in human activity recognition, including sensor heterogeneity, noise in raw data, variations in sampling frequencies, data drift, and privacy concerns.
The system is designed to provide accurate and reliable motion and location recognition, with applications in healthcare, sports, security, and real-time location tracking. The proposed system demonstrates high performance in activity recognition, with a focus on improving accuracy and robustness through advanced data processing and machine learning techniques.
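To make the preprocessing stage concrete, here is a minimal Python sketch of the denoising and segmentation steps named in the abstract: a Butterworth low-pass filter for inertial channels, a median filter for GPS traces, and Hamming-windowed segmentation. The sampling rate, cutoff frequency, window length, and overlap are illustrative assumptions, not values reported by the authors.

```python
# Sketch of the preprocessing stage: Butterworth low-pass filtering for
# inertial data, median filtering for GPS, and Hamming-window segmentation.
# FS, CUTOFF, WIN, and STEP are assumed values, not the paper's settings.
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

FS = 50        # assumed inertial sampling rate (Hz)
CUTOFF = 5.0   # assumed low-pass cutoff (Hz)
WIN = 128      # assumed window length (samples)
STEP = 64      # assumed hop size (50% overlap)

def denoise_inertial(signal, fs=FS, cutoff=CUTOFF, order=4):
    """Zero-phase Butterworth low-pass filtering of one inertial axis."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, signal)

def denoise_gps(track, kernel=5):
    """Median filtering of a 1-D GPS coordinate series (e.g., latitude)."""
    return medfilt(track, kernel_size=kernel)

def segment(signal, win=WIN, step=STEP):
    """Split a 1-D signal into overlapping Hamming-weighted frames."""
    window = np.hamming(win)
    frames = [signal[i:i + win] * window
              for i in range(0, len(signal) - win + 1, step)]
    return np.asarray(frames)

# Example on synthetic data
acc_x = np.random.randn(1000)
frames = segment(denoise_inertial(acc_x))
print(frames.shape)  # (n_frames, WIN)
```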
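The feature-processing stages map naturally onto scikit-learn components. The sketch below shows variance-threshold selection and Yeo–Johnson power transformation, together with one common reading of permutation augmentation (splitting each sample into chunks and shuffling their order); the threshold, chunk count, and the exact augmentation variant used in the paper are assumptions.

```python
# Sketch of the feature pipeline: permutation augmentation, variance-threshold
# feature selection, and Yeo-Johnson optimization. Parameter values and the
# augmentation variant are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.preprocessing import PowerTransformer

def permute_augment(sample, n_chunks=4, rng=None):
    """Return a new sample by permuting contiguous chunks of the input."""
    rng = rng or np.random.default_rng()
    chunks = np.array_split(np.asarray(sample), n_chunks)
    order = rng.permutation(len(chunks))
    return np.concatenate([chunks[i] for i in order])

def build_feature_pipeline(threshold=0.01):
    """Variance-threshold selection followed by Yeo-Johnson transformation."""
    selector = VarianceThreshold(threshold=threshold)
    transformer = PowerTransformer(method="yeo-johnson", standardize=True)
    return selector, transformer

# Example on a synthetic feature matrix (n_samples x n_features)
X = np.random.rand(200, 40)
X_aug = np.vstack([X, np.apply_along_axis(permute_augment, 1, X)])
selector, transformer = build_feature_pipeline()
X_opt = transformer.fit_transform(selector.fit_transform(X_aug))
print(X_opt.shape)
```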
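Finally, the classification and evaluation stage (an MLP assessed with K-fold cross-validation) could be sketched as follows. The hidden-layer sizes, fold count, and synthetic data are placeholders; the reported 91-96% accuracies come from the Extrasensory and SHL datasets, which are not reproduced here.

```python
# Sketch of MLP classification with K-fold cross-validation, as described in
# the abstract. Architecture, folds, and data are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import KFold, cross_val_score

X = np.random.rand(300, 20)            # stand-in for the optimized features
y = np.random.randint(0, 5, size=300)  # stand-in activity labels

clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```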