Title: Enhancing Federated Learning by Sparsifying Transmitted Model Updates
Time: 9:00 a.m., June 17, 2024
Venue: Room A521, Wang Xianghao Building, Central Campus, Jilin University
Speaker: Yipeng Zhou
Speaker Bio:
Dr. Yipeng Zhou is a senior lecturer with the School of Computing, Faculty of Science and Engineering, Macquarie University. Before joining Macquarie University, he was a research fellow with the University of South Australia and a lecturer with Shenzhen University. He received his Ph.D. and M.Phil. degrees from The Chinese University of Hong Kong and his B.S. degree from the University of Science and Technology of China. He received the 2023 Macquarie University Vice-Chancellor's Research Excellence Award for Innovative Technology and the 2023 IEEE Open Journal of the Communications Society Best Editor Award, and was the recipient of a 2018 Australian Research Council Discovery Early Career Researcher Award (DECRA). His research interests include federated learning, data privacy preservation, and networking. He has published more than 100 papers in top venues, including IEEE INFOCOM, ICML, IJCAI, ICNP, IWQoS, IEEE ToN, JSAC, TPDS, TMC, and TMM.
Abstract:
Federated learning enables geographically dispersed clients to collaboratively train a machine learning model by exchanging model updates with a central server over the Internet. However, transmitting these updates between the server and a large number of decentralized clients consumes considerable bandwidth and is susceptible to malicious attacks. This talk presents our contributions to improving communication efficiency and preserving privacy in federated learning, with a focus on sparsifying the model updates transmitted between the server and the clients. By evaluating the learning value, communication cost, and privacy cost of transmitting each individual model update, we withhold low-value updates and thereby reduce both communication and privacy costs. Extensive experiments on real datasets demonstrate that our algorithms significantly reduce communication costs and strengthen privacy protection compared with state-of-the-art federated learning baselines.
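The abstract does not spell out how updates are scored, so the following is only a minimal sketch, in NumPy, of the general idea of transmitting a sparsified update: each coordinate's importance is approximated by its magnitude (plain top-k sparsification), and only the selected coordinates are sent to the server. The function names and the keep_ratio parameter are illustrative assumptions, not the speaker's algorithms, which additionally weigh communication and privacy costs.

import numpy as np

def sparsify_update(update, keep_ratio=0.1):
    """Client side: keep only the largest-magnitude coordinates of a model update.

    Returns the indices and values actually transmitted; all other
    coordinates are treated as zero by the server.
    """
    flat = update.ravel()
    k = max(1, int(keep_ratio * flat.size))
    top_idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the k largest |values|
    return top_idx, flat[top_idx]

def apply_sparse_update(global_model, idx, vals, lr=1.0):
    """Server side: add a received sparse update into the global model in place."""
    flat = global_model.ravel()          # view of the same underlying buffer
    flat[idx] += lr * vals
    return global_model

rng = np.random.default_rng(0)
global_model = np.zeros((4, 8))
client_update = rng.normal(size=(4, 8))           # e.g. a local weight delta
idx, vals = sparsify_update(client_update, 0.1)   # transmit only ~10% of coordinates
apply_sparse_update(global_model, idx, vals)
print("transmitted", idx.size, "of", client_update.size, "coordinates")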
Hosts: College of Computer Science and Technology, Jilin University
College of Software, Jilin University
Institute of Computer Science and Technology, Jilin University
Key Laboratory of Symbolic Computation and Knowledge Engineering, Ministry of Education
Key Laboratory of Simulation Technology, Ministry of Education
Engineering Research Center of Network Technology and Application Software, Ministry of Education
National Demonstration Center for Experimental Computer Education, Jilin University