IEEE Access · Article · 2025 · Peer-reviewed
License: CC BY
Data sources: Crossref, DOAJ
A Survey of Differential Privacy Techniques for Federated Learning

Authors: Wang Xin, Li Jiaqian, Ding Xueshuang, Zhang Haoji, Sun Lianshan
Abstract

Data privacy protection in the information age deserves serious attention. As a distributed machine learning technique, federated learning (FL) can effectively address both privacy security and the problem of data silos. Differential privacy (DP) is applied within federated learning: by adding noise to raw data and model parameters, it further strengthens data privacy protection. Over the years, differential privacy techniques built on the federated learning framework have developed into two branches: central differential privacy federated learning (CDPFL) and local differential privacy federated learning (LDPFL). Although differential privacy may reduce the accuracy and convergence of federated learning models while protecting data privacy, researchers have proposed a variety of optimization methods to balance privacy protection and model performance.

This paper comprehensively surveys the state of research on differential privacy techniques based on the federated learning framework. It first introduces federated learning and differential privacy in detail, and then summarizes the development of the two classes of federated learning differential privacy (DPFL) techniques. For CDPFL, the discussion covers the first proposal of CDP and typical application examples, the impact of the Gaussian mechanism on model accuracy, optimization based on asynchronous differential privacy, and insights from other scholars. For LDPFL, the discussion covers the first proposal of LDP and typical application examples, handling multidimensional data and improving model accuracy, existing methods and optimizations for reducing communication costs, balancing privacy protection and data usability, LDPFL based on the shuffle model, and insights from other scholars. The paper then summarizes the unique challenges introduced by incorporating differential privacy into federated learning and proposes solutions. Finally, building on a summary of existing optimization techniques, it outlines future directions and discusses three research ideas for enhancing the optimization of federated differential privacy: advanced optimization strategies combining Bayesian methods with the Alternating Direction Method of Multipliers (ADMM), integrating lattice-based homomorphic encryption from cryptography to achieve more efficient differential privacy protection in federated learning, and exploring the application of zero-knowledge proof techniques to privacy protection in federated learning.
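The abstract's core mechanism, adding calibrated noise to model parameters, can be illustrated with a minimal sketch of the Gaussian mechanism mentioned above. This is not the authors' method, just a standard textbook calibration (sigma = C * sqrt(2 ln(1.25/delta)) / epsilon, valid for epsilon < 1) applied to a clipped parameter update; the function name and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_mechanism(update, clip_norm, epsilon, delta, rng=None):
    """Clip a model update to L2 norm `clip_norm`, then add Gaussian noise
    calibrated for (epsilon, delta)-differential privacy (epsilon < 1)."""
    rng = rng or np.random.default_rng(0)
    update = np.asarray(update, dtype=float)
    # Clipping bounds the L2 sensitivity of the update by clip_norm.
    norm = np.linalg.norm(update)
    if norm > clip_norm:
        update = update * (clip_norm / norm)
    # Classic Gaussian-mechanism noise scale.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return update + rng.normal(0.0, sigma, size=update.shape)

# A client would privatize its local update before sending it to the server.
noisy = gaussian_mechanism([0.5, -1.2, 3.0], clip_norm=1.0,
                           epsilon=0.5, delta=1e-5)
print(noisy.shape)
```

In CDPFL this noise is typically added once by a trusted aggregator to the summed updates; in LDPFL each client runs a step like this locally before upload, which is why LDPFL generally needs more noise for the same privacy budget.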

Keywords

Differential privacy, zero-knowledge proofs, federated learning, privacy protection, lattice-based homomorphic encryption, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
