PatternTrack: Multi-Device Tracking Using Infrared, Structured-Light Projections from Built-in LiDAR (CHI ’25)

As augmented reality devices (e.g., smartphones and headsets) proliferate in the market, multi-user AR scenarios are set to become more common. Co-located users will want to share coherent and synchronized AR experiences, but this is surprisingly cumbersome with current methods. In response, we developed PatternTrack, a novel tracking approach that repurposes the structured infrared light patterns emitted by VCSEL-driven depth sensors, like those found in the Apple Vision Pro, iPhone, iPad, and Meta Quest 3. Our approach is infrastructure-free, requires no pre-registration, works on featureless surfaces, and provides the real-time 3D position and orientation of other users’ devices. In our evaluation — tested on six different surfaces and with inter-device distances of up to 260 cm — we found a mean 3D positional tracking error of 11.02 cm and a mean angular error of 6.81°.
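The abstract does not spell out the pose-recovery pipeline, but the core idea — observing another device's known projected dot pattern as 3D points with one's own depth camera, then solving for that device's rigid pose — can be illustrated with a standard rigid-alignment step. The sketch below is not the authors' implementation: the 3×3 dot grid, the ground-truth motion, and the use of the Kabsch algorithm are all illustrative assumptions (real structured-light patterns are denser, correspondences must be established first, and sensor noise would call for a robust solver).

```python
import numpy as np

def estimate_pose(pattern_pts, observed_pts):
    # Kabsch algorithm: least-squares rotation R and translation t such that
    # observed_pts ≈ pattern_pts @ R.T + t, given point correspondences.
    pc = pattern_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    H = (pattern_pts - pc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ pc
    return R, t

# Hypothetical example: a 3x3 planar dot grid standing in for a depth
# sensor's known emission pattern, observed after an unknown rigid motion.
pattern = np.array([[x, y, 0.0] for x in (-1, 0, 1) for y in (-1, 0, 1)])
theta = np.deg2rad(10.0)                      # assumed yaw of the emitter
R_true = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0,           1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
t_true = np.array([0.5, 0.1, 2.0])            # assumed offset in metres
observed = pattern @ R_true.T + t_true
R_est, t_est = estimate_pose(pattern, observed)
```

With noise-free correspondences the alignment recovers the emitting device's rotation and translation exactly; in a live system, dot detection error and correspondence ambiguity would contribute to tracking error of the kind the evaluation above reports.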


Kim, D., Xiao, R. and Harrison, C. (2025). PatternTrack: Multi-Device Tracking Using Infrared, Structured-Light Projections from Built-in LiDAR. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI ’25). ACM, New York, NY, USA. Article 552, 14 pages. DOI: https://doi.org/10.1145/3706598.3713388.
