Machine Learning to Improve Orientation Estimation in Sports Situations Challenging for Inertial Sensor Use

Marit P. van Dijk*, Manon Kok, Monique A.M. Berger, Marco J.M. Hoozemans, Dirk Jan H.E.J. Veeger

*Corresponding author for this work

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

In sports, inertial measurement units (IMUs) are often used to measure the orientation of human body segments. A Madgwick (MW) filter can be used to obtain accurate IMU orientation estimates. This filter combines two orientation estimates by correcting the (1) gyroscope-based estimate in the direction of the (2) earth frame-based estimate. However, in sports situations characterized by relatively large linear accelerations and/or nearby magnetic sources, such as wheelchair sports, obtaining accurate IMU orientation estimates is challenging. In these situations, applying the MW filter in the regular way, i.e., with the same correction magnitude at all time frames, may lead to estimation errors. Therefore, in this study, the MW filter was extended with machine learning to distinguish instances in which a small correction magnitude is beneficial from instances in which a large correction magnitude is beneficial, to eventually arrive at accurate body segment orientations in IMU-challenging sports situations. A machine learning algorithm was trained to make this distinction based on raw IMU data. Experiments on wheelchair sports were performed to assess the validity of the extended MW filter and to compare it with the original MW filter, using a motion capture-based reference system. Results indicate that the extended MW filter performs better than the original MW filter in assessing instantaneous trunk inclination (7.6 vs. 11.7° root-mean-squared error, RMSE), especially during dynamic, IMU-challenging situations with a moving athlete and wheelchair. RMSE improvements of up to 45% were obtained for the extended MW filter compared with the original MW filter. To conclude, the machine learning-based extended MW filter has acceptable accuracy and performs better than the original MW filter for assessing body segment orientation in IMU-challenging sports situations.
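The core idea described in the abstract, a gyroscope-based estimate corrected toward an earth frame-based estimate, with the correction magnitude chosen per sample, can be sketched for a single inclination angle. This is a minimal illustration, not the paper's method: the heuristic `pick_beta` below merely stands in for the learned classifier, and all function names, gains, and thresholds are illustrative assumptions rather than values from the study.

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def pick_beta(acc, small=0.01, large=0.1, tol=0.5):
    """Stand-in for a learned classifier: apply a large correction only
    when the accelerometer norm is close to gravity (low linear
    acceleration), otherwise trust the gyroscope and correct little.
    Thresholds and gains here are illustrative assumptions."""
    return large if abs(np.linalg.norm(acc) - GRAVITY) < tol else small

def fuse_inclination(theta, gyro_rate, acc, dt):
    """One filter step on a single inclination angle (rad):
    (1) integrate the gyroscope rate, then
    (2) correct toward the gravity-based angle from the accelerometer,
    with a per-sample correction magnitude beta."""
    theta_gyro = theta + gyro_rate * dt        # (1) gyroscope-based estimate
    theta_acc = np.arctan2(acc[0], acc[2])     # (2) earth frame-based estimate
    beta = pick_beta(acc)
    return (1.0 - beta) * theta_gyro + beta * theta_acc
```

In a quasi-static sample (accelerometer norm near gravity) the estimate is pulled strongly toward the gravity-based angle; during a push or collision the large linear acceleration makes the accelerometer unreliable, so the correction shrinks and the gyroscope integration dominates.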

Original language: English
Article number: 670263
Pages (from-to): 1-12
Number of pages: 12
Journal: Frontiers in Sports and Active Living
Volume: 3
Issue number: August
Early online date: 3 Aug 2021
DOIs
Publication status: Published - Aug 2021

Bibliographical note

Funding Information:
This work was supported by ZonMw under project number 546003002. This project, named "WheelPower: wheelchair sports and data science push it to the limit", is a cooperative effort between TU Delft, UMCG, THUAS, and VU Amsterdam, in cooperation with several sports federations under the umbrella of NOC*NSF.

Publisher Copyright:
© 2021 van Dijk, Kok, Berger, Hoozemans and Veeger.

Funding


Funders | Funder number
National Science Foundation |
ZonMw | 546003002
Technische Universiteit Delft |

Keywords

• inertial measurement unit
• kinematics
• machine learning
• Madgwick filter
• orientation estimation
• sports
