Model compression via pattern shared sparsification in analog federated learning under communication constraints

Abstract

Recently, it has been shown that analog-transmission-based federated learning (FL) uses communication resources more efficiently than conventional digital transmission. In this paper, we propose an effective model compression strategy that enables analog FL under constrained communication bandwidth. The proposed approach is based on pattern-shared sparsification: all edge devices apply the same sparsification pattern to the parameter vectors they upload, as opposed to each edge device applying sparsification independently. In particular, we propose specific schemes for determining the sparsification pattern and characterize the convergence of analog FL under these sparsification strategies by deriving a closed-form upper bound on the convergence rate and residual error. The closed-form expression captures the effect of communication bandwidth and power budget on the performance of analog FL. In terms of convergence analysis, the model parameter obtained with the proposed schemes is proven to converge to the optimal model parameter. Numerical results show that the proposed pattern-shared sparsification consistently improves the performance of analog FL across various system-parameter settings, with the most significant gains under scarce communication bandwidth and a limited transmit power budget.
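The core idea can be illustrated with a minimal sketch. The abstract does not detail the paper's specific pattern-selection schemes, so the example below assumes a hypothetical rule in which all devices derive a common pattern from a shared seed (e.g., broadcast by the server). With a shared pattern, the non-zero coordinates of all device updates coincide, so the analog superposition of the updates occupies only k channel uses rather than up to k entries per device:

```python
import numpy as np

def shared_pattern(dim, k, seed):
    """All devices derive the same k-entry sparsification pattern
    from a common seed (a hypothetical selection rule; the paper's
    actual schemes are not described in the abstract)."""
    rng = np.random.default_rng(seed)
    return rng.choice(dim, size=k, replace=False)

def sparsify(update, pattern):
    """Keep only the entries at the shared pattern positions."""
    sparse = np.zeros_like(update)
    sparse[pattern] = update[pattern]
    return sparse

# Toy setup: 3 devices, 10-dimensional updates, pattern of size 3.
dim, k = 10, 3
pattern = shared_pattern(dim, k, seed=42)
updates = [float(i) * np.ones(dim) for i in (1, 2, 3)]

# Analog over-the-air aggregation superimposes the sparsified
# updates; because the patterns coincide, the sum is supported on
# exactly k coordinates (k channel uses for the whole round).
aggregate = sum(sparsify(u, pattern) for u in updates)
assert np.count_nonzero(aggregate) == k
```

Had each device chosen its pattern independently, the union of the supports could span up to k times the number of devices' coordinates, which is the bandwidth saving the shared pattern is meant to capture.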
