path: root/posts
author    Yuchen Pei <me@ypei.me>  2019-03-21 14:32:23 +0100
committer Yuchen Pei <me@ypei.me>  2019-03-21 14:32:23 +0100
commit    178ffbfb864b5dad12f682353c61195ebda58246 (patch)
tree      75eee2bc299475645f40af79fb3957f93e51dd62 /posts
parent    5e6670b7de3838f57e845f3c7b2d687490f950f4 (diff)
minor
Diffstat (limited to 'posts')
-rw-r--r--  posts/2019-03-13-a-tail-of-two-densities.md  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/posts/2019-03-13-a-tail-of-two-densities.md b/posts/2019-03-13-a-tail-of-two-densities.md
index 0286cd6..34d563e 100644
--- a/posts/2019-03-13-a-tail-of-two-densities.md
+++ b/posts/2019-03-13-a-tail-of-two-densities.md
@@ -16,7 +16,7 @@ a study of [tail bounds](https://en.wikipedia.org/wiki/Concentration_inequality)
of the divergence between
two probability measures, with the end goal of applying it to [stochastic
gradient descent](https://en.wikipedia.org/wiki/Stochastic_gradient_descent).
-It should be suitable for anyone familiar with probability theory.
+This post should be suitable for anyone familiar with probability theory.
I start with the definition of $\epsilon$-differential privacy
(corresponding to max divergence), followed by