Secure Noise Sampling for DP Collaborative Learning
MPC protocol for differentially private noise sampling in federated learning, supporting arbitrary discrete distributions
Designed an MPC protocol for DP noise sampling via chained private table lookups, secure in the semi-honest model against up to n−1 corruptions.
- Achieves a 450× runtime and 13× communication improvement over the prior state of the art at 32 parties.
- Yields 3–17.7% accuracy gains over per-client noise addition under colluding clients, across four NLP benchmarks.
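The core idea behind table-lookup sampling can be illustrated in the clear: encode an arbitrary discrete distribution as a lookup table of size 2^k, where each outcome occupies a number of slots proportional to its probability, so that indexing the table with k uniform random bits draws a sample. A minimal plaintext sketch (the pmf, table size, and function names below are illustrative assumptions, not the paper's protocol — in the actual MPC setting the index bits are jointly sampled and the lookup is performed obliviously via private table accesses):

```python
import random

def build_table(pmf, k):
    """Build a lookup table of size 2^k approximating the given pmf.
    pmf: list of (value, probability) pairs.
    Each outcome fills round(p * 2^k) slots, so the table approximates
    the distribution to within the rounding granularity 2^-k."""
    size = 1 << k
    table = []
    for value, p in pmf:
        table.extend([value] * round(p * size))
    # Rounding can leave the table slightly short or long; pad with the
    # last outcome and trim to exactly 2^k entries.
    table = (table + [pmf[-1][0]] * size)[:size]
    return table

def sample(table):
    """Sampling reduces to indexing the table with uniform random bits.
    In the MPC protocol this index is secret-shared and the lookup is
    private; here everything runs in the clear for illustration."""
    idx = random.randrange(len(table))
    return table[idx]

# Hypothetical symmetric discrete pmf over {-2, ..., 2}
pmf = [(-2, 0.05), (-1, 0.2), (0, 0.5), (1, 0.2), (2, 0.05)]
table = build_table(pmf, 10)  # 1024 slots
draw = sample(table)
```

The table size 2^k trades accuracy for cost: a larger k approximates the target distribution more finely but makes each private lookup more expensive, which is what the chained-lookup construction is designed to keep manageable.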
Authors: Olive Franzese, Congyu Fang, Radhika Garg, Xiao Wang, Somesh Jha, Nicolas Papernot, Adam Dziedzic
Venue: ACM CCS 2025
Paper: ePrint 2025/1025 · Code