Flower: A Friendly Federated Learning Framework


Subscribers: 348,000
Video Link: https://www.youtube.com/watch?v=NaOVX-lp5Fo
Duration: 20:25
Views: 4,199

A Google TechTalk, 2020/7/29, presented by Nicholas Lane, University of Cambridge.
ABSTRACT: Full title: "Flower: A Friendly Federated Learning Framework ... and a first look into the carbon footprint of federated methods"

Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared prediction model while keeping their training data on the device, thereby decoupling the ability to do machine learning from the need to store potentially privacy-sensitive user data in the cloud. However, despite the rapid progress made in FL in recent years, it remains far too difficult to evaluate FL algorithms under a full range of realistic system constraints (viz. compute, memory, energy, wired/wireless networking) and scale (thousands of federated devices and larger). As a consequence, our understanding of how these factors influence FL performance, and how they should shape the future evolution of FL algorithms, remains very underdeveloped.
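The collaborative pattern the abstract describes — clients train locally on private data and only model updates leave the device — is commonly realized as federated averaging (FedAvg). The following is a minimal illustrative sketch of that aggregation loop on a toy one-parameter model; the function names and data are invented for illustration and are not Flower's API.

```python
# Minimal sketch of federated averaging (FedAvg): each client takes a
# local gradient step on its private data; the server averages the
# returned weights, weighted by local dataset size. Raw data never
# leaves a client -- only the updated weight does.

def local_update(weight, data, lr=0.1):
    """One gradient-descent step on a 1-D least-squares model.
    Loss per example: (w*x - y)^2, so the gradient is 2*x*(w*x - y)."""
    grad = sum(2 * x * (weight * x - y) for x, y in data) / len(data)
    return weight - lr * grad

def fed_avg(global_weight, client_datasets):
    """One federated round: broadcast the global weight, collect each
    client's locally updated weight, and average them weighted by the
    number of local examples."""
    updates = [(local_update(global_weight, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two clients whose private data both fit y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = fed_avg(w, clients)
# w has converged close to the shared optimum 2.0.
```

Real frameworks such as Flower add the pieces this sketch omits: client sampling, transport over unreliable networks, and support for heterogeneous device capabilities.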

In this talk, I will describe how we have begun to address this situation by developing Flower -- an open-source framework (http://flower.dev) built to help bridge this gap in evaluation and design. Through Flower, it becomes relatively simple to measure the impact of common real-world FL situations, such as when participating devices have limited compute resources (e.g., an embedded device), or when network speeds are highly varied and unstable. I will highlight early empirical observations, made using Flower, as to what the implications are for existing algorithms under the types of heterogeneous large-scale FL systems we anticipate will increasingly appear. Finally, to showcase the potential and flexibility of Flower, I will show how it can even be used to make assessments of the carbon footprint of FL in various settings -- to the best of our knowledge, this is the first time FL has been studied from the perspective of its environmental impact.
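The carbon-footprint assessment mentioned at the end of the talk amounts to an energy-times-grid-intensity accounting across training rounds. The sketch below shows the shape of such an estimate; every figure in it is an assumed placeholder for illustration, not a result from the talk.

```python
# Back-of-envelope shape of an FL carbon-footprint estimate:
# total energy drawn by participating devices, multiplied by the
# carbon intensity of the electricity grid powering them.
# All numbers below are assumed placeholders, not measured values.

ROUNDS = 100                    # federated training rounds (assumed)
CLIENTS_PER_ROUND = 50          # devices sampled each round (assumed)
ENERGY_PER_CLIENT_KWH = 0.002   # on-device energy per round (assumed)

CARBON_INTENSITY_KG_PER_KWH = { # illustrative grid intensities
    "low_carbon_grid": 0.05,
    "high_carbon_grid": 0.7,
}

def footprint_kg(grid):
    """Total kg CO2e for the whole training run on a given grid."""
    total_kwh = ROUNDS * CLIENTS_PER_ROUND * ENERGY_PER_CLIENT_KWH
    return total_kwh * CARBON_INTENSITY_KG_PER_KWH[grid]
```

Even this toy model makes the talk's point visible: the same training run can differ by an order of magnitude in emissions depending solely on where the participating devices draw their power.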




Other Videos By Google TechTalks


2021-09-29 A Geometric View on Private Gradient-Based Optimization
2021-09-29 BB84: Quantum Protected Cryptography
2021-09-29 Fast and Memory Efficient Differentially Private-SGD via JL Projections
2021-09-29 Leveraging Public Data for Practical Synthetic Data Generation
2021-07-13 Efficient Exploration in Bayesian Optimization – Optimism and Beyond by Andreas Krause
2021-07-13 Learning to Explore in Molecule Space by Yoshua Bengio
2021-07-13 Resource Allocation in Multi-armed Bandits by Kirthevasan Kandasamy
2021-07-13 Grey-box Bayesian Optimization by Peter Frazier
2021-06-10 Is There a Mathematical Model of the Mind? (Panel Discussion)
2021-06-04 Dataset Poisoning on the Industrial Scale
2021-06-04 Flower: A Friendly Federated Learning Framework
2021-06-04 Adaptive Federated Optimization
2021-06-04 Orchard: Differentially Private Analytics at Scale
2021-06-04 Byzantine-Resilient High-Dimensional SGD with Local Iterations on Heterogeneous Data
2021-06-04 Workshop on Federated Learning and Analytics: Pre-recorded Talks Day 2 Track 2 Q&A Privacy/Security
2021-06-04 Google Workshop on Federated Learning and Analytics: Talks from Google
2021-06-04 Workshop on Federated Learning & Analytics: Pre-recorded Talks Day 2 Track 1 Q&A Optimization/System
2021-06-04 Generative Models for Effective ML on Private, Decentralized Datasets
2021-06-04 TensorFlow Federated Tutorial Session
2021-06-04 Learning on Large-Scale Data with Security & Privacy
2021-06-04 Attack of the Tails: Yes, you Really can Backdoor Federated Learning