Paper, published December 2020
Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses
Authors: Raef Bassily, Vitaly Feldman, Cristóbal Guzmán, Kunal Talwar
Uniform stability is a notion of algorithmic stability that bounds the worst-case change in the model output by the algorithm when a single data point in the dataset is replaced. An influential work of Hardt et al. (2016) provides strong upper bounds on the uniform stability of the stochastic gradient descent (SGD) algorithm on sufficiently smooth convex losses. These results led to important progress in understanding the generalization properties of SGD, and to several applications to differentially private convex optimization for smooth losses.
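The replace-one-point definition of uniform stability can be probed numerically: run the same SGD procedure on two datasets that differ in a single example, using the same sampling order, and compare the resulting models. The sketch below is purely illustrative and is not the paper's analysis; the loss (scalar least squares), the step size, the projection radius, and all function names are assumptions chosen for the demo.

```python
import numpy as np

def sgd(data, lr=0.1, epochs=5, seed=0):
    """Projected SGD on a smooth convex loss (scalar least squares)
    over `data`, a list of (x, y) pairs. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    w = 0.0
    n = len(data)
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, y = data[i]
            grad = (w * x - y) * x        # gradient of (w*x - y)^2 / 2
            w -= lr * grad
            w = float(np.clip(w, -10.0, 10.0))  # keep iterates in a bounded domain
    return w

rng = np.random.default_rng(1)
xs = rng.uniform(-1, 1, 50)
ys = 2.0 * xs + 0.1 * rng.normal(size=50)
S = list(zip(xs, ys))

# Neighboring dataset: replace exactly one data point.
S_prime = list(S)
S_prime[0] = (0.5, -1.0)

# Same seed => identical sampling order; only one example differs,
# so the final-parameter gap is an empirical proxy for stability.
gap = abs(sgd(S) - sgd(S_prime))
print(f"parameter gap between neighboring runs: {gap:.4f}")
```

A small gap on this smooth objective is consistent with the Hardt et al. (2016) bounds; the paper's subject is what happens when the smoothness assumption is dropped.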