FEDERATED LEARNING FOR MOBILE KEYBOARD PREDICTION

28 Feb 2019 | Andrew Hard, Kanishka Rao, Rajiv Mathews, Swaroop Ramaswamy, Françoise Beaufays, Sean Augenstein, Hubert Eichner, Chloé Kiddon, Daniel Ramage
This paper explores the use of federated learning to train a recurrent neural network (RNN) language model for next-word prediction in Google's Gboard virtual keyboard. The authors compare server-based training with stochastic gradient descent (SGD) against federated learning, in which models are trained on client devices and only model updates are aggregated on a server. The federated approach achieves better prediction recall than server-based training, demonstrating the feasibility of training language models on client devices without exporting sensitive user data to servers.
The paper also highlights the benefits of federated learning for user privacy and control over personal data, as well as the simplicity of incorporating privacy by default when training and aggregation are distributed across a population of client devices. The results are validated through experiments on both server-hosted logs data and client-held data, as well as live production experiments with a subset of Gboard users. The federated CIFG model outperforms both the server-trained CIFG and the n-gram baseline model on recall metrics, particularly on client cache data, which is believed to more accurately represent the true typing distribution.
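The aggregation scheme described above, federated averaging, can be sketched in a few lines: each client runs local SGD on its own examples, and the server combines the resulting weights in an average weighted by each client's example count. The sketch below is illustrative only (a toy one-parameter linear model, not the paper's RNN); all names are assumptions.

```python
def client_update(weights, examples, lr=0.1):
    """Run local SGD on one client's data; return new weights and example count."""
    new_weights = list(weights)
    for x, y in examples:
        # Toy linear model with squared loss: pred = w0 * x, grad = 2*(pred - y)*x
        pred = new_weights[0] * x
        new_weights[0] -= lr * 2.0 * (pred - y) * x
    return new_weights, len(examples)

def server_aggregate(global_weights, client_results):
    """Average client weights, weighted by how many examples each client used."""
    total = sum(n for _, n in client_results)
    averaged = [0.0] * len(global_weights)
    for weights, n in client_results:
        for i, w in enumerate(weights):
            averaged[i] += w * n / total
    return averaged

# One federated round over three simulated clients, each holding (x, y) pairs
# drawn from the relation y = 2x. No raw data ever leaves a client.
global_w = [0.0]
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(1.5, 3.0), (0.5, 1.0), (2.5, 5.0)],
]
results = [client_update(global_w, data) for data in clients]
global_w = server_aggregate(global_w, results)
```

In the real system, many such rounds are run, with a different random subset of eligible devices participating in each round.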
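The recall metric used in the comparisons above counts a prediction event as a hit when the word the user actually typed appears among the keyboard's top-K suggestions. A minimal sketch of that computation, with illustrative names and toy data:

```python
def recall_at_k(predictions, targets, k):
    """Fraction of targets found among the top-k ranked candidates.

    predictions: one ranked candidate list (best first) per prediction event.
    targets: the word the user actually typed at each event.
    """
    hits = sum(1 for cands, target in zip(predictions, targets)
               if target in cands[:k])
    return hits / len(targets)

# Toy example: three next-word prediction events.
preds = [
    ["the", "a", "to"],    # target "the" -> hit at rank 1
    ["is", "was", "be"],   # target "be"  -> hit at rank 3
    ["cat", "dog", "car"], # target "bus" -> miss
]
targets = ["the", "be", "bus"]
top1 = recall_at_k(preds, targets, 1)  # 1/3
top3 = recall_at_k(preds, targets, 3)  # 2/3
```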