Mingjia Shi
DLB: a dynamic load balance strategy for distributed training of deep neural networks
Q Ye, Y Zhou, M Shi, Y Sun, J Lv
IEEE Transactions on Emerging Topics in Computational Intelligence, 2022
PRIOR: Personalized Prior for Reactivating the Information Overlooked in Federated Learning
M Shi, Y Zhou, K Wang, H Zhang, S Huang, Q Ye, J Lv
NeurIPS 2023, 2023
FLSGD: free local SGD with parallel synchronization
Q Ye, Y Zhou, M Shi, J Lv
The Journal of Supercomputing 78 (10), 12410-12433, 2022
Communication-efficient Federated Learning with Single-Step Synthetic Features Compressor for Faster Convergence
Y Zhou, M Shi, Y Li, Y Sun, Q Ye, J Lv
ICCV 2023, 5031-5040, 2023
DeFTA: A Plug-and-Play Peer-to-Peer Decentralized Federated Learning Framework
Y Zhou, M Shi, Y Tian, Q Ye, J Lv
Information Sciences, 2024
A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training
K Wang*, Y Zhou*, M Shi*, Z Yuan, Y Shang, X Peng, H Zhang, Y You
arXiv preprint arXiv:2405.17403, 2024
Federated cINN Clustering for Accurate Clustered Federated Learning
Y Zhou, M Shi, Y Tian, Y Li, Q Ye, J Lv
ICASSP 2024, 2023
Unconstrained Feature Model and Its General Geometric Patterns in Federated Learning: Local Subspace Minority Collapse
M Shi, Y Zhou, Q Ye, J Lv
ICONIP, 449-464, 2023