Hi! I'm Xue Liao.
I'm a researcher specializing in 3D Vision, 3D Reconstruction, Computer Graphics, and Generative Models. I am currently pursuing my MPhil at HKUST (GZ) under the supervision of Prof. Zeyu Wang and Prof. Pan Hui.
My current work primarily focuses on 3D Gaussian Splatting, inverse rendering, relighting, and image generation and editing. I am also interested in visualization, video generation, and world models.
I'll be joining the University of Notre Dame as a PhD student in Fall 2026. See you there!
Kang Du*, Xue Liao*, Junpeng Xia, Chaozheng Guo, Yi Gu, Yirui Guan, Sheng Huang, Zeyu Wang# (* equal contribution, # corresponding author)
CVPR 2026
Illumination inconsistency is a fundamental challenge in multi-view 3D reconstruction. Variations in sunlight direction, cloud cover, and shadows break the constant-lighting assumption underlying classical multi-view stereo (MVS) and structure-from-motion (SfM) pipelines as well as recent neural rendering methods, leading to geometry drift, color inconsistency, and shadow imprinting. This issue is especially critical in UAV-based reconstruction, where long flight durations and outdoor environments make lighting changes unavoidable. However, existing datasets either restrict capture to short time windows, thus lacking meaningful illumination diversity, or span months and seasons, where geometric and semantic changes confound the isolated study of lighting robustness. We introduce UAVLight, a controlled-yet-real benchmark for illumination-robust 3D reconstruction. Each scene is captured along repeatable, geo-referenced flight paths at multiple fixed times of day, producing natural lighting variation under consistent geometry, calibration, and viewpoints. With standardized evaluation protocols across lighting conditions, UAVLight provides a reliable foundation for developing and benchmarking reconstruction methods that are consistent, faithful, and relightable in real outdoor environments.