M-estimators under user-level local differential privacy constraints

L. Ramesh1, E. Han2, M. Avella-Medina3, C. Rush4
  • 1 GE Healthcare, Bangalore, India [lekshmi.ramesh@gehealthcare.com]
  • 2 Department of Computer Science, Columbia University, New York, USA [lh3117@columbia.edu]
  • 3 Department of Statistics, Columbia University, New York, USA [marco.avella@columbia.edu]
  • 4 Department of Statistics, Columbia University, New York, USA [cynthia.rush@columbia.edu]

Keywords: convex optimization – local differential privacy – M-estimation – noisy gradient descent – user-level privacy

1 Abstract

We consider a first-order convex optimization framework for computing M-estimators under local differential privacy (LDP) constraints. We assume a group of users communicates with an untrusted central server that learns the parameters of an underlying model, and that each user contributes multiple data points to the server. In contrast to most works, which aim to protect a single data point, we focus on user-level privacy, which aims to protect a user's entire set of data points [Levy et al., 2021, Bassily and Sun, 2023]. Our main algorithm is a noisy version of standard gradient descent, combined with a user-level LDP mean estimation procedure that privately computes the average gradient across users at each step. This extends the noisy gradient descent algorithm of Avella-Medina et al. [2023], which was developed in the standard central model of DP, where every user contributes a single data point to the server. We establish near-optimal rates of convergence for our estimator, showing that the estimation error decreases with both the number of users and the number of observations per user.
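The overall structure of the algorithm can be sketched as follows. This is a minimal illustration for squared-error loss, not the paper's method: the user-level LDP mean estimation procedure is replaced here by simple per-user clipping plus Gaussian noise, and all function names and parameter values (`clip`, `sigma`, the toy data sizes) are illustrative assumptions.

```python
import numpy as np

def user_gradient(theta, X, y):
    # Average squared-error gradient over one user's data points.
    return X.T @ (X @ theta - y) / len(y)

def privatize(g, clip, sigma, rng):
    # Each user clips its local average gradient and adds Gaussian
    # noise BEFORE release, so the untrusted server only ever sees a
    # noisy report (stand-in for the paper's LDP mean estimator).
    norm = np.linalg.norm(g)
    if norm > 0:
        g = g * min(1.0, clip / norm)
    return g + rng.normal(scale=sigma * clip, size=g.shape)

def noisy_gd(users, dim, steps=200, lr=0.1, clip=5.0, sigma=0.1, seed=0):
    # Server iterates: collect one noisy gradient per user, average,
    # take a gradient step.
    rng = np.random.default_rng(seed)
    theta = np.zeros(dim)
    for _ in range(steps):
        reports = [privatize(user_gradient(theta, X, y), clip, sigma, rng)
                   for X, y in users]
        theta -= lr * np.mean(reports, axis=0)
    return theta

# Toy setup: 50 users, each holding 20 observations of a linear model.
rng = np.random.default_rng(1)
theta_star = np.array([1.0, -2.0, 0.5])
users = []
for _ in range(50):
    X = rng.normal(size=(20, 3))
    y = X @ theta_star + 0.1 * rng.normal(size=20)
    users.append((X, y))

theta_hat = noisy_gd(users, dim=3)
```

Averaging the noisy reports across users shrinks the injected noise, which mirrors the abstract's claim that the estimation error decreases with the number of users.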

References

  • Avella-Medina et al. [2023] M. Avella-Medina, C. Bradshaw, and P.-L. Loh. Differentially private inference via noisy optimization. The Annals of Statistics, 51(5):2067–2092, 2023.
  • Bassily and Sun [2023] R. Bassily and Z. Sun. User-level private stochastic convex optimization with optimal rates. In International Conference on Machine Learning, 2023.
  • Levy et al. [2021] D. Levy, Z. Sun, K. Amin, S. Kale, A. Kulesza, M. Mohri, and A. T. Suresh. Learning with user-level privacy. In Advances in Neural Information Processing Systems, 2021.