Demonstration of federated learning in a resource-constrained networked environment
Abstract
Many modern applications in the area of smart computing are based on machine learning techniques. Training machine learning models usually requires a large amount of data, which is often not available at a central location. Federated learning enables models to be trained on distributed datasets residing at client devices without transmitting the data to a central server, which preserves the privacy of user data and reduces communication bandwidth. In this demonstration, we show a federated learning system deployed in an emulated wide-area communications network with dynamic, heterogeneous, and intermittent resource availability, where the network is emulated using the CORE/EMANE emulator. Our system is decentralized: each client can request assistance from other clients, and because client availability is intermittent, only clients that are currently available can provide such assistance. A graphical interface visualizes the network connections and lets the user adjust them, while a separate user interface displays the training progress and each client's contribution to it.
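The following is a minimal sketch, not the demonstrated system, of how federated averaging might proceed when only intermittently available clients can assist in a given round. The model (a linear regressor), the client data, the availability probability, and all names in the code are illustrative assumptions rather than details from the demonstration.

```python
# Sketch of federated averaging with intermittent client availability.
# All data, hyperparameters, and the availability model are hypothetical.
import numpy as np

NUM_CLIENTS = 5
DIM = 10            # model dimension (assumed)
ROUNDS = 20
AVAILABILITY = 0.6  # probability a client is reachable in a round (assumed)

rng = np.random.default_rng(0)
# Hypothetical local datasets: each client holds features X and labels y
# generated from a shared linear model plus noise.
true_w = rng.normal(size=DIM)
clients = []
for _ in range(NUM_CLIENTS):
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

def local_update(weights, X, y, lr=0.05, epochs=5):
    """One client's local training: a few gradient-descent steps on squared loss."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2.0 / len(y) * X.T @ (X @ w - y)
        w -= lr * grad
    return w

global_w = np.zeros(DIM)
for rnd in range(ROUNDS):
    # Only clients that happen to be available this round can provide assistance.
    available = [c for c in clients if rng.random() < AVAILABILITY]
    if not available:
        continue  # no assistance this round; keep the current global model
    # Each available client trains locally; the updates are then averaged.
    updates = [local_update(global_w, X, y) for X, y in available]
    global_w = np.mean(updates, axis=0)
    loss = np.mean([np.mean((X @ global_w - y) ** 2) for X, y in clients])
    print(f"round {rnd:2d}: {len(available)} clients participated, avg loss {loss:.4f}")
```

In this sketch, rounds with few or no available clients simply contribute less to training, which mirrors the intermittent-availability setting described above without modeling the emulated network itself.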