A Mixed Reality Training System for Hand-Object Interaction in Simulated Microgravity Environments
Zhou, Kanglei; Chen, Chen; Ma, Yue; Leng, Zhiying; Shum, Hubert P.H.; Li, Frederick W.B.; Liang, Xiaohui
Authors
Kanglei Zhou
Chen Chen
Yue Ma
Zhiying Leng
Dr Hubert Shum hubert.shum@durham.ac.uk
Associate Professor
Dr Frederick Li frederick.li@durham.ac.uk
Associate Professor
Xiaohui Liang
Abstract
As human exploration of space continues to progress, the use of Mixed Reality (MR) to simulate microgravity environments and facilitate training in hand-object interaction holds immense practical significance. However, hand-object interaction in microgravity presents distinct challenges compared to terrestrial environments because of the absence of gravity, which results in heightened agility and inherent unpredictability of movements that traditional methods struggle to simulate accurately. To this end, we propose a novel MR-based hand-object interaction system for simulated microgravity environments, leveraging physics-based simulation to enhance the interaction between the user’s real hand and virtual objects. Specifically, we introduce a physics-based hand-object interaction model that combines impulse-based simulation with penetration contact dynamics, accurately capturing the intricacies of hand-object interaction in microgravity. By considering forces and impulses during contact, our model ensures realistic collision responses and enables effective object manipulation in the absence of gravity. The proposed system offers a cost-effective solution for simulating object manipulation in microgravity. It also holds promise for training space travelers, providing greater immersion to help them adapt to space missions. System reliability and fidelity tests verify the superior effectiveness of our system compared with the state-of-the-art CLAP system.
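The abstract does not reproduce the model's equations. As an illustrative point of reference only, the sketch below shows the standard impulse-based contact response used by many rigid-body engines, in our own notation; the paper's actual formulation, which additionally incorporates penetration contact dynamics, may differ.

```latex
% Illustrative sketch, not the paper's exact model.
% j        : impulse magnitude applied along the contact normal \hat{n}
% e        : coefficient of restitution
% v_rel^-  : pre-contact relative velocity at the contact point
% m_a, m_b : body masses;  I_a, I_b : inertia tensors
% r_a, r_b : contact-point offsets from each body's center of mass
j = \frac{-(1 + e)\, \vec{v}_{\mathrm{rel}}^{\,-} \cdot \hat{n}}
         {\dfrac{1}{m_a} + \dfrac{1}{m_b}
          + \hat{n} \cdot \left[ \left( I_a^{-1} (\vec{r}_a \times \hat{n}) \right) \times \vec{r}_a
          + \left( I_b^{-1} (\vec{r}_b \times \hat{n}) \right) \times \vec{r}_b \right]}
```

In a simulated microgravity setting, no gravitational term enters the velocity integration, so an object's post-contact motion is governed almost entirely by such contact impulses and the penetration-resolution forces, which is what makes released objects appear to drift freely.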
Citation
Zhou, K., Chen, C., Ma, Y., Leng, Z., Shum, H. P. H., Li, F. W. B., & Liang, X. (in press). A Mixed Reality Training System for Hand-Object Interaction in Simulated Microgravity Environments. In Proceedings of the 2023 International Symposium on Mixed and Augmented Reality.
| Conference Name | ISMAR 23: International Symposium on Mixed and Augmented Reality |
| --- | --- |
| Conference Location | Sydney, Australia |
| Start Date | Oct 16, 2023 |
| End Date | Oct 20, 2023 |
| Acceptance Date | Aug 10, 2023 |
| Deposit Date | Aug 16, 2023 |
| Publicly Available Date | Sep 27, 2023 |
| Publisher | Institute of Electrical and Electronics Engineers |
| Book Title | Proceedings of the 2023 International Symposium on Mixed and Augmented Reality |
| Public URL | https://durham-repository.worktribe.com/output/1718617 |
| Publisher URL | https://ieeexplore.ieee.org/xpl/conhome/1000465/all-proceedings |
This file is under embargo for copyright reasons.