Will You Be My Quarantine: A Computer Vision and Inertial Sensor Based Home Exercise System. Albert, Justin; Zhou, Lin; Gloeckner, Pawel; Trautmann, Justin; Ihde, Lisa; Eilers, Justus; Kamal, Mohammed; Arnrich, Bert (2020). (Vol. 14)
The quarantine situation inflicted by the COVID-19 pandemic has left many people around the world isolated at home. Despite the large variety of mobile-device-based self-exercise tools for training plans, activity recognition, or repetition counting, it remains challenging for an inexperienced person to perform fitness workouts or learn a new sport with the correct movements at home. As a proof of concept, a home exercise system was developed in this contribution. The system takes computer vision and inertial sensor data recorded for the same type of exercise as two independent inputs, and processes the data from both sources into the same representations on the levels of raw inertial measurement unit (IMU) data and 3D movement trajectories. Moreover, a Key Performance Indicator (KPI) dashboard was developed for data import and visualization. The usability of the system was investigated with an example use case in which a learner equipped with IMUs performed a kick movement and was able to compare it to that of a coach in the video.
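The abstract does not state how the learner's and coach's 3D trajectories are compared; a common choice for movements performed at different speeds is dynamic time warping (DTW). The sketch below is a hypothetical illustration of that idea, not the system's actual method: it computes a DTW distance between two joint trajectories given as `(n, 3)` arrays of positions.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 3D trajectories.

    a, b: arrays of shape (n, 3) and (m, 3), e.g. the ankle-joint
    positions of a learner and a coach performing a kick. DTW aligns
    the two sequences in time before summing point-wise distances,
    so a slower execution of the same movement scores a low distance.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            cost[i, j] = d + min(cost[i - 1, j],      # repeat sample of a
                                 cost[i, j - 1],      # repeat sample of b
                                 cost[i - 1, j - 1])  # advance both
    return cost[n, m]
```

A duplicated-sample (slowed-down) copy of a trajectory yields a DTW distance of zero, whereas a plain Euclidean frame-by-frame comparison would penalize the timing difference.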
Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study. Albert, Justin; Owolabi, Victor; Gebel, Arnd; Brahms, Markus Clemens; Granacher, Urs; Arnrich, Bert in MDPI Sensors (2020). 20(18)
Gait analysis is an important tool for the early detection of neurological diseases and for the assessment of fall risk in elderly people. The availability of low-cost camera hardware on the market today and recent advances in Machine Learning enable a wide range of clinical and health-related applications, such as patient monitoring or exercise recognition at home. In this study, we evaluated the motion tracking performance of the latest generation of the Microsoft Kinect camera, Azure Kinect, compared to its predecessor Kinect v2 during treadmill walking, using a gold-standard Vicon multi-camera motion capture system and the 39-marker Plug-in Gait model. Five young and healthy subjects walked on a treadmill at three different velocities while data were recorded simultaneously with all three camera systems. An easy-to-administer camera calibration method developed here was used to spatially align the 3D skeleton data from both Kinect cameras and the Vicon system. With this calibration, the spatial agreement of joint positions between the two Kinect cameras and the reference system was evaluated. In addition, we compared the accuracy of certain spatio-temporal gait parameters, i.e., step length, step time, step width, and stride time calculated from the Kinect data, with the gold standard system. Our results showed that the improved hardware and the motion tracking algorithm of the Azure Kinect camera led to a significantly higher accuracy of the spatial gait parameters than the predecessor Kinect v2, while no significant differences were found between the temporal parameters. Furthermore, we explain in detail how this experimental setup could be used to continuously monitor progress during gait rehabilitation in older people.
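The abstract mentions spatially aligning the Kinect skeleton data with the Vicon reference, but not the exact procedure. A standard way to do this, shown here purely as an illustrative sketch rather than the paper's method, is the Kabsch algorithm: given corresponding 3D points from the two systems (e.g. joint centres seen by both), it finds the least-squares rigid rotation and translation between their coordinate frames.

```python
import numpy as np

def kabsch_align(P, Q):
    """Least-squares rigid transform mapping point set P onto Q.

    P, Q: (n, 3) arrays of corresponding 3D points, e.g. Kinect joint
    positions and the matching Vicon-derived joint centres.
    Returns rotation R (3x3) and translation t (3,) with Q ~ P @ R.T + t.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    # cross-covariance of the centred point clouds
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Once `R` and `t` are estimated from calibration poses, every Kinect skeleton frame can be transformed into the Vicon coordinate frame with `P @ R.T + t`, after which joint positions and gait parameters from the two systems are directly comparable.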