From 5750cdc9c333aa58786661707169a10a865b7e27 Mon Sep 17 00:00:00 2001
From: Yuchen Pei
Date: Sun, 17 May 2020 18:56:13 +0200
Subject: minor typo fix.

---
 posts/2019-03-14-great-but-manageable-expectations.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

(limited to 'posts')

diff --git a/posts/2019-03-14-great-but-manageable-expectations.md b/posts/2019-03-14-great-but-manageable-expectations.md
index b52a97c..2520954 100644
--- a/posts/2019-03-14-great-but-manageable-expectations.md
+++ b/posts/2019-03-14-great-but-manageable-expectations.md
@@ -56,7 +56,7 @@
 The Rényi divergence is an interpolation between the max divergence
 and the KL-divergence, defined as the log moment generating function
 / cumulants of the divergence variable:
-$$D_\lambda(p || q) = (\lambda - 1)^{-1} \log \mathbb E \exp((\lambda - 1) L(p || q)) = (\lambda - 1)^{-1} \log \int {p(y)^\lambda \over q(y)^{\lambda - 1}} dx.$$
+$$D_\lambda(p || q) = (\lambda - 1)^{-1} \log \mathbb E \exp((\lambda - 1) L(p || q)) = (\lambda - 1)^{-1} \log \int {p(y)^\lambda \over q(y)^{\lambda - 1}} dy.$$
 
 Indeed, when $\lambda \to \infty$ we recover the max divergence, and
 when $\lambda \to 1$, by recognising $D_\lambda$ as a derivative in
-- 
cgit v1.2.3
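A quick sketch of the $\lambda \to 1$ limit the patched passage refers to (added here for context, not part of the patch; the shorthand $K$ is introduced for illustration): writing $K(\lambda) = \log \int p(y)^\lambda q(y)^{1 - \lambda} dy$, so that $D_\lambda(p || q) = (\lambda - 1)^{-1} K(\lambda)$, we have $K(1) = \log \int p(y) dy = 0$, hence

$$\lim_{\lambda \to 1} D_\lambda(p || q) = \lim_{\lambda \to 1} {K(\lambda) - K(1) \over \lambda - 1} = K'(1) = \int p(y) \log {p(y) \over q(y)} dy,$$

which is exactly the KL-divergence, and shows the integration variable in the corrected formula is indeed $y$.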