Biases in Aerial Video-Based Vehicle Trajectory Generation: An Empirical Evaluation

Bo Cao et al.

IEEE Transactions on Intelligent Transportation Systems · 2026 · article
https://doi.org/10.1109/tits.2026.3658135
ABDC rating: A
Abstract

Real-world vehicle trajectory data are essential for transportation research. To generate high-precision, long-distance trajectory data from aerial videos, recent studies have combined deep learning-based detection with video stitching. While these technologies enhance data-driven studies, they may introduce biases that have often been overlooked and cannot be easily removed. To clarify these biases in aerial video-based trajectory datasets and their impacts on transportation research, this study focuses on two aspects. 1) First, we investigated the biases in the modules of a typical trajectory generation procedure and categorized them into three types: vehicle box bias, motion parameter bias, and intervehicle parameter bias. To evaluate these biases quantitatively, we conducted a field experiment using high-precision inertial navigation and a drone simultaneously, and further demonstrated the biases on a state-of-the-art long-distance trajectory dataset. The evaluation results indicate that trajectory generation from aerial videos tends to introduce biases such as elongated vehicle boxes and reduced velocities, both of which are correlated with the distance to the edges of the video frame. No similar pattern is observed in acceleration, but accurately reproducing hard acceleration and deceleration events remains challenging. Moreover, in multi-video trajectory datasets, unrealistic fluctuations in velocity, acceleration, and intervehicle parameters are observed at the video stitching areas, and these fluctuations exceed vehicle dynamics thresholds. 2) Second, we validated the impacts of these biases on traffic studies through three case analyses and discussed suggestions for mitigating them. This work can serve as a reference for research that constructs and processes trajectory data.
This study may be the first to evaluate the biases in aerial video-based trajectory generation and to clarify their impacts on downstream studies. The code and data for trajectory generation are available at https://github.com/wut-panda/long-distance_vehicle_trajectory_generation.
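The abstract notes that fluctuations at video stitching areas exceed vehicle dynamics thresholds. A minimal sketch of that kind of check, not the paper's actual method: compute finite-difference accelerations from a speed trace and flag samples beyond a plausible passenger-car bound. The function name, the 5 m/s² bound, and the data format are all illustrative assumptions.

```python
import numpy as np

def flag_implausible_accelerations(t, v, a_max=5.0):
    """Flag samples whose finite-difference acceleration exceeds a
    plausible passenger-car dynamics bound (a_max, in m/s^2).

    t : timestamps in seconds; v : speeds in m/s.
    Returns a boolean mask of length len(v) - 1.
    """
    t = np.asarray(t, dtype=float)
    v = np.asarray(v, dtype=float)
    a = np.diff(v) / np.diff(t)  # finite-difference acceleration
    return np.abs(a) > a_max

# Example: a 10 Hz speed trace with an artificial 2 m/s jump, as might
# appear at a stitching seam (0.1 s step -> 20 m/s^2 spike).
t = np.arange(0.0, 1.0, 0.1)
v = np.full_like(t, 15.0)
v[5:] += 2.0
mask = flag_implausible_accelerations(t, v)
print(mask.sum())  # 1 flagged sample at the jump
```

Real pipelines would smooth the trace first and use per-vehicle-class bounds; this only illustrates the thresholding idea.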


Cite this paper

https://doi.org/10.1109/tits.2026.3658135

Or copy a formatted citation

@article{bo2026,
  title        = {{Biases in Aerial Video-Based Vehicle Trajectory Generation: An Empirical Evaluation}},
  author       = {Bo Cao et al.},
  journal      = {IEEE Transactions on Intelligent Transportation Systems},
  year         = {2026},
  doi          = {10.1109/tits.2026.3658135},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight

0.50

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact: 0.50 × 0.40 = 0.20
M · momentum: 0.50 × 0.15 = 0.07
V · venue signal: 0.50 × 0.05 = 0.03
R · text relevance †: 0.50 × 0.40 = 0.20
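The breakdown above suggests the evidence weight is a weighted sum of the four component scores under the "Balanced mode" weights. A minimal sketch of that arithmetic, assuming a plain weighted sum (the displayed M and V contributions, 0.07 and 0.03, appear to be rounded from 0.075 and 0.025):

```python
# Component scores and "Balanced mode" weights as shown in the breakdown.
scores = {"F": 0.50, "M": 0.50, "V": 0.50, "R": 0.50}
mode_weights = {"F": 0.40, "M": 0.15, "V": 0.05, "R": 0.40}

# Blended evidence weight: sum of score x weight over the components.
weight = sum(scores[k] * mode_weights[k] for k in scores)
print(round(weight, 2))  # 0.5
```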

† Text relevance is estimated at 0.50 on the detail page — for your query’s actual relevance score, open this paper from a search result.