 posts/2019-03-14-great-but-manageable-expectations.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/posts/2019-03-14-great-but-manageable-expectations.md b/posts/2019-03-14-great-but-manageable-expectations.md
index b52a97c..2520954 100644
--- a/posts/2019-03-14-great-but-manageable-expectations.md
+++ b/posts/2019-03-14-great-but-manageable-expectations.md
@@ -56,7 +56,7 @@ The Rényi divergence is an interpolation between the max divergence and
the KL-divergence, defined as the log moment generating function /
cumulants of the divergence variable:
-$$D_\lambda(p || q) = (\lambda - 1)^{-1} \log \mathbb E \exp((\lambda - 1) L(p || q)) = (\lambda - 1)^{-1} \log \int {p(y)^\lambda \over q(y)^{\lambda - 1}} dx.$$
+$$D_\lambda(p || q) = (\lambda - 1)^{-1} \log \mathbb E \exp((\lambda - 1) L(p || q)) = (\lambda - 1)^{-1} \log \int {p(y)^\lambda \over q(y)^{\lambda - 1}} dy.$$
Indeed, when $\lambda \to \infty$ we recover the max divergence, and
when $\lambda \to 1$, by recognising $D_\lambda$ as a derivative in
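As a quick sanity check of the corrected formula, here is a minimal numerical sketch of the discrete form of $D_\lambda(p \| q)$; it is not from the post itself, and the distributions `p` and `q` are made-up examples. It also confirms the interpolation property mentioned in the hunk context: as $\lambda \to 1$ the Rényi divergence approaches the KL divergence.

```python
import numpy as np

def renyi_divergence(p, q, lam):
    """Discrete form of the corrected formula:
    D_lam(p || q) = (lam - 1)^{-1} * log sum_y p(y)^lam / q(y)^(lam - 1).
    Assumes p, q are probability vectors and lam > 0, lam != 1."""
    return np.log(np.sum(p ** lam / q ** (lam - 1))) / (lam - 1)

# Made-up example distributions over three outcomes.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# As lam -> 1, D_lam(p || q) should approach the KL divergence,
# since (lam - 1)^{-1} log E exp((lam - 1) L) tends to E[L] = KL.
kl = np.sum(p * np.log(p / q))
print(renyi_divergence(p, q, 1.0 + 1e-6))  # approximately equal to kl
print(kl)
```

With the original `dx` typo the integral would not even be over the right variable, so a numerical check like this (summing over the support of $y$) is only meaningful against the corrected `dy` form.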