Federated and Fog Learning:
Federated learning has generated significant interest, with nearly all prior work focused on a "star" topology in which nodes/devices connect directly to a central server. We move beyond this architecture, extending it along the network dimension to settings where multiple layers of nodes sit between the end devices and the server. Specifically, we develop multi-stage hybrid federated learning (MH-FL), a hybrid of intra- and inter-layer model learning that treats the network as a multi-layer, cluster-based structure, where each layer consists of multiple device clusters. MH-FL accounts for the topology among the nodes within each cluster, including local networks formed via device-to-device (D2D) communications. It orchestrates devices at different network layers in a cooperative manner (i.e., through D2D interactions) to form a local consensus on the model parameters, and combines this with multi-stage parameter relaying between the layers of the tree-shaped hierarchy.
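To make the flow concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of the two-phase pattern described above: devices inside each cluster run a few rounds of D2D gossip averaging to reach a local consensus on their parameters, and each cluster's representative then relays the consensus value up the hierarchy, where the next layer (here, a single server) aggregates. Function names and the ring topology are illustrative assumptions.

```python
# Hypothetical sketch of the MH-FL flow: intra-cluster D2D consensus
# followed by inter-layer parameter relaying. Not the authors' code.

def d2d_consensus(params, rounds=30, step=0.3):
    """Approximate the cluster-wide average via gossip on a ring of devices."""
    params = list(params)
    n = len(params)
    for _ in range(rounds):
        # Each device mixes its parameter with its two ring neighbours;
        # this doubly stochastic update converges to the cluster mean.
        params = [
            params[i]
            + step * (params[(i - 1) % n] + params[(i + 1) % n] - 2 * params[i])
            for i in range(n)
        ]
    return params

def mh_fl_aggregate(clusters):
    """One upward pass: intra-cluster consensus, then relaying to the parent."""
    # Each cluster's representative forwards its (near-)consensus value;
    # the parent layer aggregates them (here, a single server averages).
    relayed = [d2d_consensus(cluster)[0] for cluster in clusters]
    return sum(relayed) / len(relayed)

# Toy example: scalar model parameters, two clusters of three devices each.
clusters = [[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]]
global_model = mh_fl_aggregate(clusters)  # ≈ (2.0 + 20.0) / 2 = 11.0
```

In a deeper hierarchy, the same consensus-then-relay step repeats at each layer of the tree, so only cluster representatives (rather than every device) communicate upward.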
- Su Wang, Seyyedali Hosseinalipour, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, Weifeng Su, and Mung Chiang, "Towards Cooperative Federated Learning over Heterogeneous Edge/Fog Networks," Accepted to IEEE Communications Magazine, Mar 2023.
- Bhargav Ganguly and Vaneet Aggarwal, "Online Federated Learning via Non-Stationary Detection and Adaptation amidst Concept Drift," Accepted to IEEE/ACM Transactions on Networking, Jul 2023.
- Seyyedali Hosseinalipour, Su Wang, Nicolo Michelusi, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, and Mung Chiang, "Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks," Accepted to IEEE/ACM Transactions on Networking, May 2023.
- Bhargav Ganguly, Seyyedali Hosseinalipour, Kwang Taik Kim, Christopher G. Brinton, Vaneet Aggarwal, David J. Love, and Mung Chiang, "Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point," Accepted to IEEE/ACM Transactions on Networking, Mar 2023.
- Seyyedali Hosseinalipour, Sheikh Shams Azam, Christopher G. Brinton, Nicolo Michelusi, Vaneet Aggarwal, David J. Love, and Huaiyu Dai, "Multi-Stage Hybrid Federated Learning over Large-Scale Wireless Fog Networks," IEEE/ACM Transactions on Networking, vol. 30, no. 4, pp. 1569-1584, Aug. 2022, doi: 10.1109/TNET.2022.3143495.
- Seyyedali Hosseinalipour, Christopher G. Brinton, Vaneet Aggarwal, Huaiyu Dai, and Mung Chiang, "From Federated Learning to Fog Learning: Towards Large-Scale Distributed Machine Learning in Heterogeneous Networks," IEEE Communications Magazine, vol. 58, no. 12, pp. 41-47, December 2020.
- Anis Elgabli, Jihong Park, Amrit S. Bedi, Chaouki Ben Issaid, Mehdi Bennis, and Vaneet Aggarwal, "Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning," IEEE Transactions on Communications, vol. 69, no. 1, pp. 164-181, Jan 2021.
- Anis Elgabli, Jihong Park, Amrit S. Bedi, Mehdi Bennis, and Vaneet Aggarwal, "GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning," Journal of Machine Learning Research, Mar 2020.
- Anis Elgabli, Jihong Park, Amrit S. Bedi, Mehdi Bennis, and Vaneet Aggarwal, "Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning," in Proc. ICASSP, May 2020.
- Anis Elgabli, Jihong Park, Amrit Bedi, Mehdi Bennis, and Vaneet Aggarwal, "Communication Efficient Framework for Decentralized Machine Learning," in Proc. CISS, Mar 2020.