Doctoral Dissertation Oral Defense of Hao Du
Monday, September 23, 2024 2:00–3:00 PM
- Description: Title: Novel Mechanisms of Chronic Kidney Diseases. Field of Study: Biomedical Science Ph.D. Program
- Website: https://events.uconn.edu/graduate-school-theses-and-dissertation-defense/event/173260-doctoral-dissertation-oral-defense-of-hao-d
More from Graduate Dissertations
- Sep 24, 3:00 PM: Doctoral Dissertation Oral Defense of Shanglin Zhou. Title: Model Sparsification on Emerging Applications and Technologies. Ph.D. Candidate: Shanglin Zhou. Major Advisor: Dr. Caiwen Ding. Co-Major Advisor: Dr. Krishna Pattipati. Associate Advisors: Dr. Cunxi Yu, Dr. Zhijie Shi. Date/Time: Tuesday, September 24, 2024, 3:00–4:00 PM. Location: Virtual. Abstract: Deep neural networks (DNNs) with higher accuracy often lead to larger models, increasing storage and energy demands. Model sparsification can reduce size but risks compromising accuracy. Balancing these factors is challenging, especially as traditional processors struggle with the requirements of low-power, real-time DNNs, highlighting the need for more efficient solutions. In this thesis, we explore model sparsification in emerging applications. We propose an optimization approach using Surrogate Lagrangian Relaxation (SLR) for weight sparsification, streamlining the typically time-consuming three-step pipeline. It enables faster convergence and maintains high accuracy, even during hard-pruning, with rapid recovery in retraining. We explore two primary emerging applications: (1) Energy-harvesting devices that require dynamic power management. We introduce EVE, an AutoML framework using SLR-based sparsification to find optimal multi-models with shared weights, reducing memory use and adapting to changing environments. (2) Diffractive Optical Neural Networks (DONNs) that are fast and energy-efficient but suffer from accuracy degradation due to interpixel interactions. We propose a physics-aware optimization framework for DONNs, incorporating SLR-based sparsification and roughness modeling to smooth phase changes and preserve accuracy. Additionally, we extend our research to multi-task learning (MTL) with DONNs, which traditionally requires manual reconfiguration. We propose LUMEN-PRO, an automated MTL framework that utilizes a flexible DONN backbone. By rotating shared layers instead of storing task-specific ones, LUMEN-PRO reduces memory usage and enhances accuracy, enabling efficient and high-performance DONNs across various tasks.
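The abstract above describes weight sparsification, i.e. zeroing out a large fraction of a network's weights while preserving accuracy. The thesis's SLR-based formulation is more involved than can be shown here; as a minimal illustration of the general idea, the sketch below implements simple magnitude-based pruning (not the SLR method), where the smallest-magnitude weights are zeroed. The function name `magnitude_prune` and the example tensor are illustrative assumptions, not artifacts of the thesis.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of weights with smallest magnitude.

    This is the baseline pruning criterion that SLR-style methods refine:
    keep the largest-magnitude weights, zero the rest, then retrain.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold     # keep only weights above threshold
    return weights * mask

# Example: prune half the entries of a random 4x4 weight matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)
print(np.count_nonzero(pruned))
```

In the full pipeline the abstract alludes to, this hard-pruning step would be followed by retraining the surviving weights to recover accuracy.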
- Sep 25, 3:00 PM: Doctoral Dissertation Oral Defense of Dillon Patterson. Title: Catching Caribou in the Age of Climate Change: How Regulation and Caribou Population Decline Threaten the Alaska Native Way of Life. Field of Study: Anthropology
- Sep 27, 10:00 AM: Doctoral Dissertation Oral Defense of Xuetong Yuan. Dissertation title: Conditions on conditionals: evaluativity, discourse sensitivity, and conditionals without 'if'. Field of study: Linguistics
- Sep 27, 1:30 PM: Doctoral Dissertation Oral Defense of Maggie Khuu. Title: Investigation of thyrotropin-releasing hormone neurons in the mouse lateral hypothalamic area. Department: Physiology and Neurobiology
- Oct 1, 12:30 PM: Doctoral Dissertation Oral Defense of Kaylee Jangula Mootz. Ph.D. English Literature Defense. "The Temporality of Kinship: Time and the Family in Native American and African American Literature"
- Oct 2, 2:00 PM: Doctoral Dissertation Oral Defense of Keliang Wang. Title: Essays on the Integration of Machine Learning and Discrete Optimization. Field of study: Operations and Information Management