MeCap: Whole-Body Digitization for Low-Cost VR/AR Headsets (UIST ’19)

Low-cost, smartphone-powered VR/AR headsets are becoming more popular. These basic devices – little more than plastic or cardboard shells – lack advanced features, such as controllers for the hands, limiting their interactive capability. Moreover, even high-end consumer headsets are unable to track the wearer’s body and face, and so interactive experiences like social VR remain underdeveloped. We introduce MeCap, which enables commodity VR headsets to be augmented with powerful motion capture (“MoCap”) and user-sensing capabilities at very low cost (under $5). Using only a pair of hemi-spherical mirrors and the existing rear-facing camera of a smartphone, MeCap provides real-time estimates of a wearer’s 3D body pose, hand pose, facial expression, physical appearance and surrounding environment – capabilities which are either absent in contemporary VR/AR systems or which require specialized hardware and controllers. We evaluate the accuracy of each of our tracking features, the results of which show imminent feasibility.

Download PDF

Ahuja, K., Harrison, C., Goel, M. and Xiao, R. (2019). MeCap: Whole-Body Digitization for Low-Cost VR/AR Headsets. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). ACM, New York, NY, USA, 453-462. DOI: 10.1145/3332165.3347889

MeCap uses two hemi-spherical mirrors to generate a stereoscopic view of the world
By combining the two mirror views with pose-estimation algorithms, we can recover the 3D positions of body keypoints
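
To make this step concrete, here is a minimal sketch of the triangulation idea using OpenCV. It assumes the two mirror views have already been calibrated into 3x4 projection matrices (the names P_left and P_right are hypothetical) and that corresponding 2D keypoints come from an off-the-shelf body-pose estimator; it illustrates generic stereo triangulation, not the paper's actual pipeline.

```python
import numpy as np
import cv2

def triangulate_body_points(P_left, P_right, kps_left, kps_right):
    """Triangulate 3D body keypoints from the two mirror views.

    P_left, P_right:     3x4 projection matrices for the left/right mirror
                         views (assumed known from a one-time calibration).
    kps_left, kps_right: (N, 2) arrays of corresponding 2D keypoints, e.g.
                         output by an off-the-shelf 2D body-pose estimator.
    Returns an (N, 3) array of 3D points in the camera coordinate frame.
    """
    pts_l = np.asarray(kps_left, dtype=np.float64).T    # 2xN
    pts_r = np.asarray(kps_right, dtype=np.float64).T   # 2xN
    pts_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # 4xN, homogeneous
    return (pts_h[:3] / pts_h[3]).T                      # dehomogenize -> Nx3
```

In use, each camera frame would be split into the two hemisphere regions, a 2D pose estimator run on each crop, and the matched keypoints passed to a routine like this; a full system would also need per-device calibration and filtering, which this sketch omits.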
MeCap can be used to sense body and hand pose from a head-mounted VR rig – without any external hardware
MeCap can also be used to build environment maps for improved in-world object rendering
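
One way to illustrate the environment-map idea is the classic mirror-ball (“light probe”) unwrapping: treat the crop of one mirrored hemisphere as a light probe and resample it into an equirectangular map that a renderer can use for reflections and image-based lighting. The sketch below assumes a tight square crop of the mirror and an approximately orthographic view; it demonstrates the general technique only, not MeCap's calibration or rendering pipeline.

```python
import numpy as np
import cv2

def mirror_ball_to_equirect(ball_img, out_w=512, out_h=256):
    """Unwrap a mirror-ball crop into an equirectangular environment map.

    ball_img: square BGR crop tightly bounding the mirrored hemisphere,
              assumed viewed (approximately) orthographically along -z, +y up.
    Returns an (out_h, out_w, 3) equirectangular image.
    """
    h, w = ball_img.shape[:2]

    # World direction sampled by each output pixel (equirectangular grid).
    phi = (np.arange(out_w) + 0.5) / out_w * 2.0 * np.pi - np.pi   # longitude
    theta = (np.arange(out_h) + 0.5) / out_h * np.pi               # colatitude
    theta, phi = np.meshgrid(theta, phi, indexing="ij")
    d = np.stack([np.sin(theta) * np.cos(phi),    # x
                  np.cos(theta),                  # y (up)
                  np.sin(theta) * np.sin(phi)],   # z
                 axis=-1)

    # Mirror-ball inverse mapping: the surface normal that reflects the
    # camera ray (0, 0, -1) into world direction d is the normalized
    # half-vector between d and (0, 0, 1). (The single direction directly
    # behind the mirror is a known singularity of this mapping.)
    n = d + np.array([0.0, 0.0, 1.0])
    n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8

    # The normal's (x, y) components index directly into the ball image.
    map_x = ((n[..., 0] + 1.0) * 0.5 * (w - 1)).astype(np.float32)
    map_y = ((1.0 - (n[..., 1] + 1.0) * 0.5) * (h - 1)).astype(np.float32)
    return cv2.remap(ball_img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```

The resulting map could then be supplied to a rendering engine as a skybox or reflection probe so that virtual objects pick up lighting from the wearer's real surroundings.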