Talk Title: A New Encoding Computation Method for Distributed Multi-Task Learning
Time: 15:30-16:30, September 6, 2024
Venue: 伟德bevictor中文版, Xipu Campus, Teaching Building 7, Room 7510
Speaker: 程民权 (Minquan Cheng)
Abstract: Distributed multi-task learning (MTL) enables the joint training of multiple models by leveraging correlations between tasks, which can lead to better generalization. However, MTL in distributed settings faces significant communication bottlenecks, particularly in large-scale learning scenarios involving many tasks. This talk considers a distributed MTL system in which distributed workers learn different models orchestrated by a central server. To mitigate the communication bottlenecks in both the uplink and downlink, we propose a novel approach leveraging matrix multiplication. We present achievable communication loads for both the uplink and downlink under arbitrary layouts, which improve upon known results and are close to the theoretical optimum. In particular, when N = K, the achieved uplink and downlink loads match the information-theoretic lower bounds, attaining the theoretical minimum exactly.
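To make the system model concrete for attendees, the sketch below shows one round of a plain, uncoded distributed MTL setup and how its uplink/downlink communication is counted. The names (K workers, N tasks, model dimension d, the synthetic least-squares gradient) are assumptions made for illustration only; this is not the coded matrix-multiplication scheme presented in the talk, which is designed to reduce exactly these uplink and downlink loads.

```python
import numpy as np

# Toy uncoded distributed MTL round (illustrative assumption only;
# NOT the coded scheme from the talk). K workers, N tasks, model dim d.
K, N, d = 4, 4, 8
rng = np.random.default_rng(0)

# Assumed layout: each worker holds data for every task.
tasks_of = {k: list(range(N)) for k in range(K)}

# The server maintains one model per task.
models = [np.zeros(d) for _ in range(N)]

def local_gradient(model):
    """Synthetic least-squares gradient standing in for a worker's local update."""
    X = rng.standard_normal((5, d))
    y = rng.standard_normal(5)
    return X.T @ (X @ model - y) / 5

uplink = downlink = 0            # transmitted symbols (real numbers)
for _ in range(3):               # a few training rounds
    # Downlink: server sends each worker the models of its assigned tasks.
    for k in range(K):
        downlink += len(tasks_of[k]) * d
    # Uplink: each worker sends one gradient per assigned task, uncoded.
    grads = {n: [] for n in range(N)}
    for k in range(K):
        for n in tasks_of[k]:
            grads[n].append(local_gradient(models[n]))
            uplink += d
    # Server averages the per-task gradients and updates each model.
    for n in range(N):
        models[n] -= 0.1 * np.mean(grads[n], axis=0)

print(f"uncoded uplink load:   {uplink} symbols")
print(f"uncoded downlink load: {downlink} symbols")
```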
Speaker Bio: Minquan Cheng (程民权) is a professor and doctoral supervisor at the School of Computer Science and Engineering, Guangxi Normal University. He received his Ph.D. from the University of Tsukuba (Japan) in 2012, where he was awarded the university's special presidential commendation. He was among the first group of high-level overseas talents recruited by universities in Guangxi, is a recipient of the Guangxi Distinguished Young Scholars Fund, serves on the Privacy Protection Committee of the China Association for Confidentiality, and has led multiple national and provincial/ministerial projects. His research interests include coding and cryptographic theory, combinatorial designs, and their applications to distributed systems, information security and privacy, multimedia copyright protection, and machine learning. He has published 55 SCI-indexed papers, including 11 in the top information-theory journal IEEE Transactions on Information Theory (6 of which were selected among the journal's monthly top-50 most popular articles), 1 in the top computer-networking journal IEEE/ACM Transactions on Networking, 18 in the top communications journal IEEE Transactions on Communications, and 7 in Designs, Codes and Cryptography. He has led 9 funded projects, including 2 from the National Natural Science Foundation of China and 4 from the Guangxi Natural Science Foundation.