From c24a34b2d2bb7632c56ac5018d076aa4b1784337 Mon Sep 17 00:00:00 2001
From: Yuchen Pei
Date: Wed, 20 Feb 2019 13:51:14 +0100
Subject: addressing lucas' comments

- https://ypei.me/posts/2019-02-14-raise-your-elbo.html#isso-20
---
 posts/2019-02-14-raise-your-elbo.md | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)

(limited to 'posts/2019-02-14-raise-your-elbo.md')

diff --git a/posts/2019-02-14-raise-your-elbo.md b/posts/2019-02-14-raise-your-elbo.md
index 33149c1..34e7386 100644
--- a/posts/2019-02-14-raise-your-elbo.md
+++ b/posts/2019-02-14-raise-your-elbo.md
@@ -239,10 +239,12 @@ restricted to $\epsilon I$ is called elliptical k-means algorithm.
 As a transition to the next models to study, let us consider a
 simpler mixture model obtained by making one modification to GMM:
 change $(x; \eta_k) \sim N(\mu_k, \Sigma_k)$ to
-$\mathbb P(x = w; \eta_k) = \eta_{kw}$ so $\eta$ is a stochastic matrix.
+$\mathbb P(x = w; \eta_k) = \eta_{kw}$ where $\eta$ is a stochastic matrix
+and $w$ is an arbitrary element of the space for $x$.
 So now the space for both $x$ and $z$ are finite. We call this model
 the simple mixture model (SMM).
 
+
 As in GMM, at E-step $r_{ik}$ can be explicitly computed using Bayes\'
 theorem.
 
@@ -253,6 +255,9 @@ $$\begin{aligned}
 \eta_{k, w} &= {\sum_i r_{ik} 1_{x_i = w} \over \sum_i r_{ik}}. \qquad (2.8)
 \end{aligned}$$
 
+where $1_{x_i = w}$ is the [indicator function](https://en.wikipedia.org/wiki/Indicator_function),
+and evaluates to $1$ if $x_i = w$ and $0$ otherwise.
+
 Two trivial variants of the SMM are the two versions of probabilistic
 latent semantic analysis (pLSA), which we call pLSA1 and pLSA2.
 
--
cgit v1.2.3
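
As a reading aid for the passage being patched, here is a minimal numpy sketch of the SMM's EM iteration: the E-step computes the responsibilities $r_{ik}$ by Bayes' theorem, and the M-step updates $\eta$ via equation (2.8) using the indicator $1_{x_i = w}$. The function name `em_smm`, the integer encoding of $x$, and the array layout are assumptions made for illustration, not code from the post.

```python
# Minimal sketch of EM for the simple mixture model (SMM), assuming the
# data x is an integer array with values in {0, ..., W - 1} and z ranges
# over K components; pi, eta, r mirror the post's notation.
import numpy as np

def em_smm(x, K, W, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                   # mixture weights pi_k
    eta = rng.dirichlet(np.ones(W), size=K)    # K x W stochastic matrix eta_{kw}
    for _ in range(n_iter):
        # E-step: r_{ik} proportional to pi_k * eta_{k, x_i} (Bayes' theorem)
        r = pi[None, :] * eta[:, x].T          # n x K
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update pi and eta (equation (2.8) for eta)
        pi = r.mean(axis=0)
        onehot = np.eye(W)[x]                  # row i is the indicator 1_{x_i = w}
        eta = r.T @ onehot                     # sum_i r_{ik} 1_{x_i = w}
        eta /= eta.sum(axis=1, keepdims=True)  # divide by sum_i r_{ik}
    return pi, eta
```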