author    Yuchen Pei <me@ypei.me>  2019-03-20 14:17:06 +0100
committer Yuchen Pei <me@ypei.me>  2019-03-20 14:17:06 +0100
commit    2c8cf1a3a027980abd078632b8efa69a4faae0d0 (patch)
tree      9adac83a29846b585bb0aca4b7f4b6cf850dc37d /posts
parent    601309e8bc8162d29b8dd3ff4ef716805f17e506 (diff)
added more wikipedia links
Diffstat (limited to 'posts')
-rw-r--r--  posts/2019-03-13-a-tail-of-two-densities.md             2
-rw-r--r--  posts/2019-03-14-great-but-manageable-expectations.md   8
2 files changed, 5 insertions, 5 deletions
diff --git a/posts/2019-03-13-a-tail-of-two-densities.md b/posts/2019-03-13-a-tail-of-two-densities.md
index 26f4ad5..37e32e5 100644
--- a/posts/2019-03-13-a-tail-of-two-densities.md
+++ b/posts/2019-03-13-a-tail-of-two-densities.md
@@ -27,7 +27,7 @@ as well as the effect of mixing mechanisms, by presenting the subsampling theore
(a.k.a. amplification theorem).
In [Part 2](/posts/2019-03-14-great-but-manageable-expectations.html), I discuss the Rényi differential privacy, corresponding to
-the Rényi divergence, a study of the moment generating functions of the
+the Rényi divergence, a study of the [moment generating functions](https://en.wikipedia.org/wiki/Moment-generating_function) of the
divergence between probability measures to derive the tail bounds.
Like in Part 1, I prove a composition theorem and a subsampling theorem.
diff --git a/posts/2019-03-14-great-but-manageable-expectations.md b/posts/2019-03-14-great-but-manageable-expectations.md
index 4412920..e2ff3c9 100644
--- a/posts/2019-03-14-great-but-manageable-expectations.md
+++ b/posts/2019-03-14-great-but-manageable-expectations.md
@@ -8,8 +8,8 @@ comments: true
This is Part 2 of a two-part blog post on differential privacy.
Continuing from [Part 1](/posts/2019-03-13-a-tail-of-two-densities.html),
I discuss the Rényi differential privacy, corresponding to
-the Rényi divergence, a study of the moment generating functions of the
-divergence between probability measures to derive the tail bounds.
+the Rényi divergence, a study of the [moment generating functions](https://en.wikipedia.org/wiki/Moment-generating_function)
+of the divergence between probability measures in order to derive the tail bounds.
Like in Part 1, I prove a composition theorem and a subsampling theorem.
@@ -79,7 +79,7 @@ functions $f$ and $g$:
$$G_\lambda(f || g) = \int f(y)^{\lambda} g(y)^{1 - \lambda} dy; \qquad \kappa_{f, g} (t) = \log G_{t + 1}(f || g).$$
For probability densities $p$ and $q$, $G_{t + 1}(p || q)$ and
-$\kappa_{p, q}(t)$ are the $t$th moment generating function and cumulant
+$\kappa_{p, q}(t)$ are the $t$th moment generating function and [cumulant](https://en.wikipedia.org/wiki/Cumulant)
of the divergence variable $L(p || q)$, and
$$D_\lambda(p || q) = (\lambda - 1)^{-1} \kappa_{p, q}(\lambda - 1).$$
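
A minimal numerical sketch of these identities (an illustration, assuming the pair of unit-variance Gaussians $p = N(\mu, 1)$ and $q = N(0, 1)$, for which the closed forms $\kappa_{p, q}(t) = t(t + 1)\mu^2 / 2$ and $D_\lambda(p || q) = \lambda \mu^2 / 2$ are standard):

```python
# Check kappa_{p,q}(t) = log G_{t+1}(p || q) and
# D_lambda = (lambda - 1)^{-1} kappa(lambda - 1) against the
# Gaussian closed forms (assumed example, not from the post).
import numpy as np
from scipy.integrate import quad

mu, lam = 1.3, 2.5
t = lam - 1

p = lambda y: np.exp(-(y - mu) ** 2 / 2) / np.sqrt(2 * np.pi)
q = lambda y: np.exp(-(y ** 2) / 2) / np.sqrt(2 * np.pi)

# G_lambda(p || q) = \int p(y)^lambda q(y)^(1 - lambda) dy, numerically
G, _ = quad(lambda y: p(y) ** lam * q(y) ** (1 - lam), -20, 20)
kappa = np.log(G)  # kappa_{p,q}(t) with t = lambda - 1

print(kappa, t * (t + 1) * mu ** 2 / 2)  # cumulant vs closed form
print(kappa / t, lam * mu ** 2 / 2)      # D_lambda both ways
```
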
@@ -112,7 +112,7 @@ Using the Chernoff bound (6.7), we can bound the divergence variable:
$$\mathbb P(L(p || q) \ge \epsilon) \le {\mathbb E \exp(t L(p || q)) \over \exp(t \epsilon)} = \exp (\kappa_{p, q}(t) - \epsilon t). \qquad (7.7)$$
-For a function $f: I \to \mathbb R$, denote its Legendre transform by
+For a function $f: I \to \mathbb R$, denote its [Legendre transform](https://en.wikipedia.org/wiki/Legendre_transformation) by
$$f^*(\epsilon) := \sup_{t \in I} (\epsilon t - f(t)).$$
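
Taking the infimum of (7.7) over $t$ gives $\mathbb P(L(p || q) \ge \epsilon) \le \exp(-\kappa_{p, q}^*(\epsilon))$. A sketch under the same hypothetical Gaussian pair as above, where $\kappa(t) = t(t + 1)\mu^2 / 2$ and $L(p || q) \sim N(\mu^2/2, \mu^2)$, so the bound can be compared with the exact tail:

```python
# Optimise the Chernoff bound (7.7) over t >= 0; the optimum is
# exp(-kappa*(eps)), kappa* being the Legendre transform of kappa.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

mu, eps = 1.3, 2.0

def kappa(t):
    # Gaussian closed form assumed above: kappa(t) = t(t+1) mu^2 / 2
    return t * (t + 1) * mu ** 2 / 2

# kappa*(eps) = sup_{t >= 0} (eps t - kappa(t)), computed numerically
res = minimize_scalar(lambda t: kappa(t) - eps * t,
                      bounds=(0, 50), method="bounded")
kappa_star = -res.fun

print(np.exp(-kappa_star))                      # optimised Chernoff bound
print(norm.sf(eps, loc=mu ** 2 / 2, scale=mu))  # exact P(L >= eps), smaller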