Ph.D. Proposal Oral Exam - Afshin Abdi

Title:  Distributed Learning and Inference in Deep Models


Dr. Fekri, Advisor

Dr. AlRegib, Chair

Dr. Romberg


The objective of the proposed research is to develop new methods for distributed learning and inference of deep models, and to analyze their performance, especially on nodes with limited computational power. We consider two related classes of problems: 1) distributed training of deep models, and 2) compression and restructuring of deep models for efficient deployment and reduced inference time on devices with limited resources. First, we argue that distributed training can be recast as the chief executive officer (CEO) problem from information theory, and on that basis we will develop a framework for compression, communication, and parameter estimation of deep models. Next, toward efficient implementation, we observe that neural networks with sparse, locally connected structures are better suited to extensive distribution and parallel implementation because of their lower communication requirements. Hence, we propose to restructure the neural network by rearranging the neurons in each layer and partitioning the model into sub-models such that the number of connections among sub-models is minimized.
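As a toy illustration of the restructuring idea described above (not the proposal's actual algorithm), the sketch below treats the nonzero weights of a layer as edges between neurons and splits the neurons into two balanced sub-models, greedily moving single neurons whenever that reduces the number of connections crossing the split. The function names and the example sparsity pattern are hypothetical, introduced only for this illustration.

```python
# A minimal sketch of neuron partitioning: edges are (neuron, neuron) pairs
# standing in for nonzero weights; we bisect the neurons so that few
# connections cross between the two sub-models.

def cut_size(edges, side):
    """Count connections whose endpoints land in different sub-models."""
    return sum(1 for u, v in edges if side[u] != side[v])

def greedy_partition(n, edges, rounds=10):
    """Bisect neurons 0..n-1, keeping the sides within one neuron of balance
    and moving a neuron only when the move shrinks the cut."""
    side = {i: i % 2 for i in range(n)}        # alternating initial split
    best = cut_size(edges, side)
    for _ in range(rounds):
        improved = False
        for i in range(n):
            target = side[i] ^ 1
            # Skip moves that would leave the split badly unbalanced.
            if sum(1 for s in side.values() if s == target) > n // 2:
                continue
            side[i] = target
            c = cut_size(edges, side)
            if c < best:
                best, improved = c, True
            else:
                side[i] ^= 1                   # revert a non-improving move
        if not improved:
            break
    return side, best

# Hypothetical sparsity pattern: two tightly connected neuron clusters
# {0,1,2} and {3,4,5} joined by a single weight (2,3); the heuristic
# should recover the clusters, leaving a cut of one connection.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
side, cut = greedy_partition(6, edges)
```

In practice this single-move heuristic would be replaced by an established graph partitioner (e.g. Kernighan-Lin or spectral bisection), but it captures the objective stated in the abstract: minimize connections among sub-models so that distributed inference needs little communication.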

Event Details


  • Date: Wednesday, December 4, 2019
  • Time: 12:30 pm - 2:30 pm
  • Location: Room 5234, Centergy
