[PDF] Efficient Data Distribution Estimation for Accelerated Federated Learning | Semantic Scholar (2024)

  • Corpus ID: 270226467
@inproceedings{Wang2024EfficientDD,
  title  = {Efficient Data Distribution Estimation for Accelerated Federated Learning},
  author = {Yuanli Wang and Lei Huang},
  year   = {2024},
  url    = {https://api.semanticscholar.org/CorpusID:270226467}
}
  • Yuanli Wang, Lei Huang
  • Published 3 June 2024
  • Computer Science, Engineering

This work studies the overhead of client selection algorithms in large-scale FL environments and proposes an efficient data-distribution-summary calculation algorithm to reduce that overhead in real-world, large-scale FL deployments.
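
The abstract does not spell out what the data distribution summary looks like; as a rough illustration of the general idea (not the authors' algorithm), the sketch below computes a normalized label histogram per client, a common lightweight distribution summary in client selection work. The client data, class count, and all names are assumptions.

```python
# Minimal sketch (not the paper's algorithm): summarize each client's local
# label distribution as a normalized histogram so a server can reason about
# heterogeneity without seeing raw data. Clients and class count are assumed.
import numpy as np

def label_histogram(labels, num_classes):
    """Return a normalized label histogram for one client's local dataset."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / max(counts.sum(), 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_classes = 10
    # Hypothetical clients with differently sized, skewed (non-IID) datasets.
    clients = [rng.integers(0, num_classes, size=int(rng.integers(50, 500)))
               for _ in range(5)]
    summaries = np.stack([label_histogram(c, num_classes) for c in clients])
    print(summaries.round(2))
```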



13 References

HACCS: Heterogeneity-Aware Clustered Client Selection for Accelerated Federated Learning
Joel Wolfrath, N. Sreekumar, Dhruv Kumar, Yuanli Wang, A. Chandra

    Computer Science

    2022 IEEE International Parallel and Distributed…

  • 2022

HACCS is a Heterogeneity-Aware Clustered Client Selection system that identifies and exploits statistical heterogeneity by representing all distinguishable data distributions, rather than individual devices, in the training process; it can provide an 18%–38% reduction in time to convergence compared with the state of the art (a sketch of the clustered-selection idea follows below).

  • 19
  • Highly Influential
  • PDF
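
As a rough illustration of heterogeneity-aware clustered selection in the spirit of HACCS (not its exact algorithm), the hedged sketch below clusters clients by their label-distribution summaries and picks one representative per cluster each round. The cluster count, the Dirichlet-generated summaries, and the uniform within-cluster pick are assumptions.

```python
# Hedged sketch: cluster clients by label-distribution summaries and pick one
# representative per distinguishable distribution per round.
import numpy as np
from sklearn.cluster import KMeans

def select_by_cluster(summaries, num_clusters, rng):
    """summaries: (num_clients, num_classes) normalized label histograms."""
    km = KMeans(n_clusters=num_clusters, n_init=10, random_state=0).fit(summaries)
    selected = []
    for c in range(num_clusters):
        members = np.flatnonzero(km.labels_ == c)
        if members.size:
            selected.append(int(rng.choice(members)))  # one client per cluster
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Hypothetical skewed per-client label distributions.
    summaries = rng.dirichlet(alpha=np.full(10, 0.3), size=100)
    print(select_by_cluster(summaries, num_clusters=5, rng=rng))
```
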
Contextual Client Selection for Efficient Federated Learning Over Edge Devices
Qiying Pan, Hangrui Cao, Yifei Zhu, Jiangchuan Liu, Bo Li

    Computer Science, Engineering

    IEEE Transactions on Mobile Computing

  • 2024

This article introduces a client selection framework that judiciously leverages correlations across local datasets to accelerate training, and designs a novel Neural Contextual Combinatorial Bandit algorithm to relate client features to rewards, enabling intelligent selection of client combinations (a simplified bandit-style sketch follows below).

  • 2
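
The paper's algorithm is a neural contextual combinatorial bandit; as a much-simplified, hedged stand-in for the general idea, the sketch below scores clients with a per-client linear UCB over their context features and selects the top-k each round. The features, reward signal, and top-k rule are assumptions, not the authors' design.

```python
# Hedged sketch: contextual-bandit-style client scoring with per-client LinUCB.
import numpy as np

class LinUCBSelector:
    def __init__(self, num_clients, dim, alpha=1.0):
        self.alpha = alpha
        self.A = np.stack([np.eye(dim) for _ in range(num_clients)])  # per-client design matrices
        self.b = np.zeros((num_clients, dim))

    def select(self, features, k):
        """features: (num_clients, dim) context per client; pick the k highest-UCB clients."""
        scores = []
        for A_i, b_i, x in zip(self.A, self.b, features):
            A_inv = np.linalg.inv(A_i)
            theta = A_inv @ b_i
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return np.argsort(scores)[-k:]

    def update(self, client, x, reward):
        self.A[client] += np.outer(x, x)
        self.b[client] += reward * x

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    sel = LinUCBSelector(num_clients=20, dim=4)
    feats = rng.normal(size=(20, 4))          # hypothetical client features
    chosen = sel.select(feats, k=5)
    for c in chosen:                          # reward could be per-round loss reduction
        sel.update(int(c), feats[c], reward=rng.random())
    print(chosen)
```
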
Federated Multi-Task Learning with Non-Stationary and Heterogeneous Data in Wireless Networks
Hongwei Zhang, M. Tao, Yuanming Shi, Xiaoyan Bi, K. Letaief

    Computer Science, Engineering

    IEEE Transactions on Wireless Communications

  • 2024

An adaptive federated multi-task learning (FMTL) framework is developed that accelerates model training convergence and reduces computational complexity while ensuring model accuracy; it is validated in an edge learning setting.

  • 4
A Review of Client Selection Methods in Federated Learning
S. Mayhoub, Tareq M. Shami

    Computer Science

    Archives of Computational Methods in Engineering

  • 2023

This paper critically reviews recent client selection (CS) methods for FL, analyses their functionality and limitations, and compares the approaches in terms of how they are evaluated.

  • 2
Online Federated Learning via Non-Stationary Detection and Adaptation Amidst Concept Drift
Bhargav Ganguly, V. Aggarwal

    Computer Science

    IEEE/ACM Transactions on Networking

  • 2024

This paper introduces a multiscale algorithmic framework that combines the theoretical guarantees of the FedAvg and FedOMD algorithms in near-stationary settings with a non-stationarity detection and adaptation technique to improve FL generalization performance in the presence of concept drift.
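
As a hedged illustration of the non-stationarity detection idea (not the paper's multiscale framework), the sketch below compares server-side losses over two adjacent windows and flags suspected concept drift when the recent window is markedly worse. The window size and threshold are assumptions.

```python
# Hedged sketch: flag concept drift when the recent loss window is much worse
# than the preceding one; a server could then restart or re-adapt training.
from collections import deque

class DriftMonitor:
    def __init__(self, window=10, threshold=0.15):
        self.window = window
        self.threshold = threshold
        self.losses = deque(maxlen=2 * window)

    def observe(self, round_loss):
        """Return True if the recent loss window looks non-stationary."""
        self.losses.append(round_loss)
        if len(self.losses) < 2 * self.window:
            return False
        old = list(self.losses)[: self.window]
        new = list(self.losses)[self.window :]
        return (sum(new) / self.window) - (sum(old) / self.window) > self.threshold

if __name__ == "__main__":
    monitor = DriftMonitor()
    # Simulated loss stream with a jump (drift) after round 20.
    stream = [0.5 - 0.01 * t for t in range(20)] + [0.9 - 0.01 * t for t in range(20)]
    for t, loss in enumerate(stream):
        if monitor.observe(loss):
            print(f"drift suspected at round {t}")
            break
```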

Client Selection in Federated Learning: Principles, Challenges, and Opportunities
Lei Fu, Huan Zhang, Ge Gao, Mi Zhang, Xin Liu

    Computer Science

    IEEE Internet of Things Journal

  • 2023

This article systematically presents recent advances in the emerging field of FL client selection, along with its challenges and research opportunities, to help practitioners choose the most suitable client selection mechanisms for their applications and to give researchers and newcomers a better understanding of this research topic.

  • 50
  • PDF
Client Selection in Federated Learning under Imperfections in Environment
Sumit Rai, A. Kumari, Dilip K. Prasad

    Computer Science

    AI

  • 2022

A novel sampling method, the irrelevance sampling technique, selects a subset of clients based on the quality and quantity of data on edge devices and achieves 50–80% faster convergence even under highly skewed data distributions in the presence of free riders.
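
As a hedged stand-in for selection by data quality and quantity (not the paper's irrelevance score), the sketch below scores each client by a label-entropy quality proxy weighted by data volume and filters out suspected free riders with too few samples. The proxy, cutoff, and weighting are assumptions.

```python
# Hedged sketch: score clients by data quantity and a label-entropy quality
# proxy, assigning zero to suspected free riders with almost no data.
import numpy as np

def client_score(labels, num_classes, min_samples=20):
    """Score one client: 0 for suspected free riders, else log(quantity) * label entropy."""
    labels = np.asarray(labels)
    if labels.size < min_samples:
        return 0.0                      # too little data to be useful
    p = np.bincount(labels, minlength=num_classes) / labels.size
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
    return float(np.log(labels.size) * entropy)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Hypothetical clients of very different sizes; the 5-sample one is a free rider.
    clients = [rng.integers(0, 10, size=n) for n in (5, 80, 400, 1000)]
    scores = [client_score(c, num_classes=10) for c in clients]
    top_k = np.argsort(scores)[-2:]     # pick the two highest-scoring clients
    print(scores, top_k)
```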

FedScale: Benchmarking Model and System Performance of Federated Learning at Scale
Fan Lai, Yinwei Dai, Mosharaf Chowdhury

    Computer Science

    ICML

  • 2022

FedScale is presented, a federated learning benchmarking suite with realistic datasets and a scalable runtime to enable reproducible FL research and highlight potential opportunities for heterogeneity-aware co-optimizations in FL.

Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning
Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi

    Computer Science

    ICML

  • 2021

It is proved that clustered sampling leads to better client representativity and to reduced variance of the clients' stochastic aggregation weights in FL, and that it is compatible with existing methods and technologies for privacy enhancement and for communication reduction through model compression.
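
As a hedged illustration of the clustered-sampling idea (not the paper's exact scheme), the sketch below samples one client per cluster with probability proportional to its data size within the cluster, so every distribution is represented each round. The cluster assignments and data sizes are assumptions.

```python
# Hedged sketch: sample one client per cluster, weighted by within-cluster
# data size, so aggregation weights vary less across rounds.
import numpy as np

def clustered_sample(cluster_ids, data_sizes, rng):
    """cluster_ids[i]: cluster of client i; returns one sampled client per cluster."""
    cluster_ids = np.asarray(cluster_ids)
    data_sizes = np.asarray(data_sizes, dtype=float)
    chosen = []
    for c in np.unique(cluster_ids):
        members = np.flatnonzero(cluster_ids == c)
        probs = data_sizes[members] / data_sizes[members].sum()
        chosen.append(int(rng.choice(members, p=probs)))
    return chosen

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    cluster_ids = rng.integers(0, 5, size=50)   # e.g. clusters built from label histograms
    data_sizes = rng.integers(50, 500, size=50)
    print(clustered_sample(cluster_ids, data_sizes, rng))
```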

Accelerated Training via Device Similarity in Federated Learning
Yuanli Wang, Joel Wolfrath, N. Sreekumar, Dhruv Kumar, A. Chandra

    Computer Science, Engineering

    EdgeSys@EuroSys

  • 2021

This work analyses the impact of data heterogeneity on device selection, model convergence, model accuracy, and fault tolerance in a federated learning setting, and proposes three methods for identifying groups of devices with similar data distributions that can significantly improve model convergence without compromising model accuracy (a similarity-based grouping sketch follows below).

  • 11
  • PDF
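
As a hedged illustration of grouping devices by data-distribution similarity (not necessarily one of the paper's three methods), the sketch below greedily groups devices whose label histograms lie within a Hellinger-distance threshold of a group representative. The distance metric and threshold are assumptions.

```python
# Hedged sketch: greedy grouping of devices by Hellinger distance between
# their label distributions; devices in the same group look interchangeable.
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions (0 = identical)."""
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def greedy_group(summaries, max_dist=0.3):
    """Assign each device to the first group whose representative is close enough."""
    groups = []                                   # list of (representative, member indices)
    for i, s in enumerate(summaries):
        for rep, members in groups:
            if hellinger(rep, s) <= max_dist:
                members.append(i)
                break
        else:
            groups.append((s, [i]))
    return [members for _, members in groups]

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    summaries = rng.dirichlet(np.full(10, 0.5), size=30)  # per-device label histograms
    print(greedy_group(summaries))
```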

...
