Xinyu Yi¹, Yuxiao Zhou², Feng Xu¹
¹Tsinghua University
²ETH Zurich
Accepted to SIGGRAPH 2024 (Conference Track)
Existing inertial motion capture techniques use the human root coordinate frame to estimate local poses and treat it as an inertial frame by default. We argue that, whenever the root undergoes linear acceleration or rotation, the root frame is theoretically non-inertial. In this paper, we model the fictitious forces, which are non-negligible in a non-inertial frame, with an auto-regressive estimator carefully designed to follow physics. With the fictitious forces, the force-related IMU measurements (accelerations) can be correctly compensated in the non-inertial frame so that Newton's laws of motion are satisfied. The relationship between the accelerations and the body motion then becomes deterministic and learnable, and we train a neural network to model it for better motion capture. Furthermore, to train the neural network on synthetic data, we develop an IMU-synthesis-by-simulation strategy that better models the noise characteristics of IMU hardware and allows parameter tuning to fit different hardware. This strategy not only makes network training on synthetic data possible but also enables calibration-error modeling to handle poor motion capture calibration, increasing the robustness of the system.
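For reference, the standard classical-mechanics expression for the fictitious acceleration at a point with position \mathbf{r} and velocity \mathbf{v}, both expressed in a frame that translates with acceleration \mathbf{a}_0 and rotates with angular velocity \boldsymbol{\omega}, is (textbook notation, not necessarily the paper's own):

\[
\mathbf{a}_{\mathrm{fict}} = -\,\mathbf{a}_0 \;-\; \dot{\boldsymbol{\omega}} \times \mathbf{r} \;-\; \boldsymbol{\omega} \times \left( \boldsymbol{\omega} \times \mathbf{r} \right) \;-\; 2\,\boldsymbol{\omega} \times \mathbf{v},
\]

i.e., the translational, Euler, centrifugal, and Coriolis terms. Once such terms are accounted for, Newton's second law holds in the non-inertial (root) frame as well, which is the consistency the compensated accelerations are meant to satisfy.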
This work was supported by the National Key R&D Program of China (2023YFC3305600, 2018YFA0704000), the NSFC (No. 62021002), and the Key Research and Development Project of Tibet Autonomous Region (XZ202101ZY0019G). This work was also supported by THUIBCS, Tsinghua University, and BLBCI, Beijing Municipal Education Commission. We thank Noitom for their extensive support with inertial sensors, and Liangdi Ma, Yunzhe Shao, and Shaohua Pan for their help with the live demos. Feng Xu is the corresponding author.
@inproceedings{yi2024pnp,
  title={Physical Non-inertial Poser (PNP): Modeling Non-inertial Effects in Sparse-inertial Human Motion Capture},
  author={Yi, Xinyu and Zhou, Yuxiao and Xu, Feng},
  booktitle={SIGGRAPH 2024 Conference Papers},
  year={2024}
}