| author | Yuchen Pei <me@ypei.me> | 2019-03-21 14:32:23 +0100 |
|---|---|---|
| committer | Yuchen Pei <me@ypei.me> | 2019-03-21 14:32:23 +0100 |
| commit | 178ffbfb864b5dad12f682353c61195ebda58246 (patch) | |
| tree | 75eee2bc299475645f40af79fb3957f93e51dd62 | |
| parent | 5e6670b7de3838f57e845f3c7b2d687490f950f4 (diff) | |
minor
-rw-r--r-- | posts/2019-03-13-a-tail-of-two-densities.md | 2 |
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/posts/2019-03-13-a-tail-of-two-densities.md b/posts/2019-03-13-a-tail-of-two-densities.md
index 0286cd6..34d563e 100644
--- a/posts/2019-03-13-a-tail-of-two-densities.md
+++ b/posts/2019-03-13-a-tail-of-two-densities.md
@@ -16,7 +16,7 @@
 a study of [tail bounds](https://en.wikipedia.org/wiki/Concentration_inequality)
 of the divergence between two probability measures, with the end goal of
 applying it to [stochastic gradient
 descent](https://en.wikipedia.org/wiki/Stochastic_gradient_descent).
-It should be suitable for anyone familiar with probability theory.
+This post should be suitable for anyone familiar with probability theory.
 I start with the definition of $\epsilon$-differential privacy
 (corresponding to max divergence), followed by