From 178ffbfb864b5dad12f682353c61195ebda58246 Mon Sep 17 00:00:00 2001
From: Yuchen Pei
Date: Thu, 21 Mar 2019 14:32:23 +0100
Subject: minor

---
 posts/2019-03-13-a-tail-of-two-densities.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/posts/2019-03-13-a-tail-of-two-densities.md b/posts/2019-03-13-a-tail-of-two-densities.md
index 0286cd6..34d563e 100644
--- a/posts/2019-03-13-a-tail-of-two-densities.md
+++ b/posts/2019-03-13-a-tail-of-two-densities.md
@@ -16,7 +16,7 @@
 a study of [tail bounds](https://en.wikipedia.org/wiki/Concentration_inequality)
 of the divergence between two probability measures, with the end goal of
 applying it to [stochastic gradient descent](https://en.wikipedia.org/wiki/Stochastic_gradient_descent).
-It should be suitable for anyone familiar with probability theory.
+This post should be suitable for anyone familiar with probability theory.
 
 I start with the definition of $\epsilon$-differential privacy
 (corresponding to max divergence), followed by
-- 
cgit v1.2.3