Article
Robust Sensor Fusion and Biomimetic Control of a Lower-Limb Exoskeleton With Multimodal Sensors
Affiliations
- [1] Aalborg University, Denmark
Abstract
This article presents a systematic approach to robust real-time control of a lower-limb exoskeleton using multimodal sensors. The controller adopts two sensor bands, each combining an array of force-sensitive resistors (FSRs) and an inertial measurement unit (IMU), to measure both force myography (FMG) signals and limb motion. A robust sensor fusion algorithm that combines the FMG and IMU signals through artificial neural networks is developed to accurately estimate the wearer's hip and knee rotation angles. Moreover, a mathematical model of the lower-limb exoskeleton is calibrated and validated against real-time experimental data. Finally, a model-based controller is designed via linear matrix inequalities to track the position references generated by the network. The biomimetic control algorithm is tested both in simulation and on a physical setup to demonstrate the effectiveness of the proposed method.

Note to Practitioners
This work addresses the challenge of real-time control of lower-limb exoskeletons. Our approach to the trajectory generation problem is to emulate the movements of a healthy limb. This is achieved through a robust sensor fusion algorithm that integrates data from two multimodal sensors, namely FMG and IMU sensors, enabling a reliable estimation of the hip and knee angular positions. The signals are processed, and neural networks are trained with exoskeleton encoder measurements as targets. Physical tests show both the accuracy and robustness of the control method. Potential applications of the method include real-time gait analysis, control of upper-limb exoskeletons, and the study of the induction of neural plasticity.
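The sensor fusion and angle estimation step can be sketched in code. The following is an illustrative sketch only, not the authors' implementation: it shows a small feed-forward network mapping concatenated FMG and IMU features to hip and knee angle estimates. The channel counts, layer sizes, activation, and randomly initialised weights (standing in for trained parameters) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sensor dimensions (hypothetical, for illustration only):
N_FMG = 8     # FSR channels per sensor band
N_IMU = 6     # 3-axis accelerometer + 3-axis gyroscope
N_HIDDEN = 16 # hidden-layer width
N_OUT = 2     # estimated hip and knee rotation angles

# Random weights stand in for parameters trained against
# exoskeleton encoder measurements (the targets named in the text).
W1 = rng.standard_normal((N_FMG + N_IMU, N_HIDDEN)) * 0.1
b1 = np.zeros(N_HIDDEN)
W2 = rng.standard_normal((N_HIDDEN, N_OUT)) * 0.1
b2 = np.zeros(N_OUT)

def estimate_joint_angles(fmg: np.ndarray, imu: np.ndarray) -> np.ndarray:
    """Fuse one FMG frame and one IMU frame into [hip, knee] estimates."""
    x = np.concatenate([fmg, imu])  # simple feature-level fusion
    h = np.tanh(x @ W1 + b1)        # hidden layer
    return h @ W2 + b2              # linear output: [hip, knee]

angles = estimate_joint_angles(rng.random(N_FMG), rng.random(N_IMU))
print(angles.shape)  # one estimate per joint: (2,)
```

In a real-time loop, `estimate_joint_angles` would be called at the sensor sampling rate, and its output would serve as the position reference for the model-based tracking controller.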