ElastiCL: Elastic Quantization for Communication Efficient Collaborative Learning in IoT
Bharath Sudharsan, Dhruv Sheth, Shailesh Arya, Federica Rollo, Piyush Yadav, Pankesh Patel, John G. Breslin, and Muhammad Intizar Ali
In Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems, 2021
Transmitting high-dimensional model updates between client IoT devices and the central aggregating server has long been a bottleneck in collaborative learning - especially in uncertain real-world IoT networks where congestion, latency, and bandwidth issues are common. In this scenario, gradient quantization is an effective way to reduce the bit count of each transmitted model update, but at the cost of an elevated error floor due to the higher variance of the stochastic gradients. In this paper, we propose ElastiCL, an elastic quantization strategy that achieves transmission efficiency together with a low error floor by dynamically altering the number of quantization levels during training on distributed IoT devices. Experiments on training ResNet-18 and a Vanilla CNN show that ElastiCL converges in far fewer transmitted bits than a fixed quantization level, with little or no compromise on training and test accuracy.
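To make the idea concrete, the abstract's core mechanism - stochastic gradient quantization with a level count that changes over training - can be sketched as below. This is a minimal illustration, not the authors' implementation: the unbiased uniform quantizer follows the standard QSGD-style construction, and the `elastic_levels` schedule (growing from a hypothetical `min_levels` to `max_levels`) is an assumed example of "dynamically altering the number of quantization levels", not the schedule proposed in the paper.

```python
import numpy as np

def quantize(grad, levels):
    """Stochastic uniform quantization (QSGD-style sketch).

    Each coordinate's magnitude, scaled by the vector norm, is mapped
    onto one of `levels` uniform levels; rounding up or down is random
    with probabilities chosen so the quantizer is unbiased.
    """
    norm = np.linalg.norm(grad)
    if norm == 0:
        return np.zeros_like(grad)
    scaled = np.abs(grad) / norm * levels      # values in [0, levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                   # P(round up) keeps E[q] = grad
    rounded = lower + (np.random.rand(*grad.shape) < prob_up)
    return np.sign(grad) * norm * rounded / levels

def elastic_levels(step, total_steps, min_levels=2, max_levels=64):
    """Hypothetical elastic schedule: few levels early in training
    (cheap to transmit, noisy) and more levels later (more bits,
    lower quantization variance, hence a lower error floor)."""
    frac = step / max(total_steps - 1, 1)
    return int(round(min_levels + frac * (max_levels - min_levels)))
```

With few levels, each coordinate needs only a few bits on the wire; raising the level count late in training trades bits for reduced gradient variance, which is the efficiency/error-floor trade-off the paper targets.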