
Doctoral Dissertation Oral Defense, Huiqun Huang

Wednesday, November 12, 2025 11:15 AM – 12:15 PM
  • Location
    Homer Babbidge Library
  • Description
    Abstract: Accurate and efficient modeling of urban mobility, prediction of vehicle and human trajectories, and object detection in traffic are crucial for ensuring the safety and resilience of intelligent transportation systems and smart cities. However, the dynamic nature of external environments (such as weather conditions, road networks, neighboring vehicles, and passenger behaviors) or intrinsic changes in the target data can lead to significant shifts in data distribution. These shifts can invalidate trained deep learning models and foster overconfidence in model outputs. In this thesis, we introduce both learning-based and statistical methods to address these issues.
    First, we design an attention-based method for citywide anomaly event prediction. This method effectively models the spatio-temporal characteristics of urban anomaly events and quantifies the varying impacts of urban mobility on the occurrence of anomaly events. Second, we present an extreme-aware framework to predict citywide urban mobility under anomalous situations. The proposed framework decomposes regional urban mobility into spatio-temporally varying regular and extreme dynamics and minimizes the citywide urban mobility prediction loss under distribution shift. Third, we introduce a framework based on conformal prediction and Gaussian process regression to quantify the output uncertainty of existing trajectory prediction models (base models). This framework aims to improve the prediction accuracy of the base models and reduce the uncertainty of the predicted trajectories under distribution shift. Finally, we introduce an uncertainty-aware adversarial training framework that enhances the resilience of existing collaborative object detection models for autonomous driving against adversarial attacks. More specifically, this framework alleviates the impact of adversarial attacks by providing output uncertainty estimates through a learning-based module and conformal prediction-based calibration.
  • Website
    https://events.uconn.edu/engineering/event/1499711-doctoral-dissertation-oral-defense-huiqun-huang
  • Categories
    Conferences & Speakers
