Propagate And Calibrate:
Real-time Passive Non-line-of-sight Tracking

1Shanghai AI Laboratory, 2Northwestern Polytechnical University
*Equal contribution, Corresponding author

Reconstruct the trajectory of a person out of sight in real time with PAC-Net.


Non-line-of-sight (NLOS) tracking has drawn increasing attention in recent years due to its ability to detect object motion out of sight. Most previous works on NLOS tracking rely on active illumination, e.g., lasers, and suffer from high cost and elaborate experimental conditions. Moreover, their oversimplified settings keep these techniques far from practical application.

In contrast, we propose a purely passive method to track a person walking in an invisible room by only observing a relay wall, which is more in line with real application scenarios, e.g., security.

To extract imperceptible changes from videos of the relay wall, we introduce difference frames as an essential carrier of temporal-local motion information.
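In essence, a difference frame is the subtraction of consecutive frames, which suppresses the static background and keeps only temporal-local changes. A minimal sketch (the array layout and the toy video below are illustrative assumptions, not the dataset's actual format):

```python
import numpy as np

def difference_frames(frames):
    """Frame-to-frame differences of a video clip.

    frames: float array of shape (T, H, W, C).
    Returns shape (T - 1, H, W, C); entry t is frames[t + 1] - frames[t],
    which isolates temporal-local changes on the relay wall.
    """
    return frames[1:] - frames[:-1]

# Toy clip: 4 frames of a 2x2 grayscale patch with a tiny brightness step.
video = np.zeros((4, 2, 2, 1), dtype=np.float32)
video[1:] += 0.01  # imperceptible change appearing after frame 0
diffs = difference_frames(video)
```

Note how the static level of the wall cancels out: only the frame where the change appears produces a non-zero difference.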

In addition, we propose PAC-Net, which alternates propagation and calibration steps, making it capable of leveraging both dynamic and static information at frame-level granularity.

To evaluate the proposed method, we build and publish the first dynamic passive NLOS tracking dataset, NLOS-Track, which fills the gap in realistic NLOS datasets. NLOS-Track contains thousands of NLOS video clips with corresponding trajectories, including both real-shot and synthetic data.



Propagate and Calibrate

To exploit both the raw-frame stream and the difference-frame stream, we propose a concise dual architecture, PAC-Net (Propagate And Calibrate Network). Instead of using a two-branch architecture that processes the two streams separately, PAC-Net integrates the motion-continuity prior into its workflow with a specially designed alternating recurrent architecture.
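The alternating recurrence can be sketched as follows. Everything below is an illustrative assumption, not the paper's exact layers: the real model extracts per-frame features with learned encoders, whereas here we use plain tanh cells and random weights just to show the propagate-then-calibrate update order on a shared hidden state.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_cell(h, x, W_h, W_x):
    """A minimal recurrent cell: h' = tanh(W_h h + W_x x)."""
    return np.tanh(W_h @ h + W_x @ x)

D_h, D_x = 8, 16  # hidden and feature dimensions (illustrative)
# Separate weights for the propagation (P) and calibration (C) cells.
Wp_h, Wp_x = rng.normal(size=(D_h, D_h)) * 0.1, rng.normal(size=(D_h, D_x)) * 0.1
Wc_h, Wc_x = rng.normal(size=(D_h, D_h)) * 0.1, rng.normal(size=(D_h, D_x)) * 0.1
W_out = rng.normal(size=(2, D_h)) * 0.1  # hidden state -> 2-D position

def track(raw_feats, diff_feats):
    """Alternate propagation (difference-frame cue) and calibration
    (raw-frame cue) on one shared hidden state, one position per frame."""
    h = np.zeros(D_h)  # zero-initialized hidden state
    positions = []
    for x_diff, x_raw in zip(diff_feats, raw_feats):
        h = linear_cell(h, x_diff, Wp_h, Wp_x)  # propagate: dynamic message
        h = linear_cell(h, x_raw, Wc_h, Wc_x)   # calibrate: static message
        positions.append(W_out @ h)
    return np.stack(positions)
```

The design point the sketch illustrates: rather than fusing two parallel branches at the end, each frame first advances the state with motion evidence and then corrects it with positional evidence.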

Pipeline of using PAC-Net to perform real-time NLOS Tracking

Visualization of tracking pipeline with PAC-Net


We apply zero-initialization to the hidden state, since we have no knowledge of the hidden scene before the video stream arrives. In addition, we adopt a warm-up strategy that disentangles a few early steps from the tracking procedure as a Warm-up Stage, which provides an appropriate initial hidden state for the subsequent Tracking Stage.
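Operationally, the warm-up strategy amounts to letting the recurrence run for a few steps and reporting only the later estimates. A minimal sketch (the warm-up length below is a hypothetical value, not the paper's setting):

```python
WARM_UP_STEPS = 8  # hypothetical warm-up length; a tunable design choice

def split_stages(positions):
    """Separate Warm-up Stage outputs (used only to settle the hidden
    state) from Tracking Stage estimates that are actually reported."""
    return positions[:WARM_UP_STEPS], positions[WARM_UP_STEPS:]
```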

NLOS-Track Dataset

Synthetic Data

To reduce the gap between synthetic and photo-realistic data, we use the Cycles render engine in Blender, a physically-based path tracer that excels at rendering realistic images. All 3D human-like characters and skeleton models for walking animation are acquired from Mixamo, Adobe's free animation platform.

Rendered scene in Blender
Various characters used in synthetic data

Real-shot Data

We use a consumer-grade mirrorless camera (Canon EOS RP) to capture videos of the relay wall at 25 FPS. To obtain ground-truth walking trajectories, we mount a USB camera (HIKVISION E14a) on the ceiling, which records the whole walking process from a top view, also at 25 FPS. From the top-view videos, we use ArUco markers to locate the walking person's coordinates frame by frame with sub-centimeter precision.
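Marker detection itself would be done with an ArUco library (e.g., OpenCV's aruco module). The remaining step, mapping a detected pixel position in the top-view image to floor coordinates, can be sketched with a planar homography; the calibration matrix below is a made-up example (a pure pixel-to-metre scale), not our actual camera calibration.

```python
import numpy as np

def pixels_to_floor(H, pts):
    """Map top-view pixel coordinates to floor coordinates (metres)
    with a 3x3 planar homography H estimated from known floor landmarks."""
    pts = np.asarray(pts, dtype=np.float64)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # perspective divide

# Example calibration: a pure scale of 1 px = 1 cm (no perspective).
H = np.diag([0.01, 0.01, 1.0])
floor = pixels_to_floor(H, [[250.0, 100.0]])  # -> [[2.5, 1.0]] metres
```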

Real-shot scene from outside the room
Real-shot scene from the camera view

Tracking Results

Results on Synthetic Data

We visualize the tracking results after propagation and after calibration in two rows. The red squares highlight the differences between trajectories after propagation and after calibration, which is consistent with our expectations. The Warm-up Stage is also visualized with grey dashed lines.

Visualization of tracking results on synthetic data.

Comparison between trajectories after propagation and calibration on synthetic data

Results on Real-shot Data

We compare the results of different methods and observe obvious jitters from the baseline model, slight stabilization but discontinuities from C-Net, and error accumulation from P-Net. These observations confirm that it is necessary to capture both motion and position information, as PAC-Net does.

Visualization of tracking results on real-shot data.

Comparison of tracking results with different methods on real-shot data


@article{wang2023propagate,
  author    = {Wang, Yihao and Wang, Zhigang and Zhao, Bin and Wang, Dong and Chen, Mulin and Li, Xuelong},
  title     = {Propagate And Calibrate: Real-time Passive Non-line-of-sight Tracking},
  journal   = {CVPR},
  year      = {2023},
}