Diffstat (limited to 'posts/2019-02-14-raise-your-elbo.org')
 posts/2019-02-14-raise-your-elbo.org | 14 +++++++-------
 1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/posts/2019-02-14-raise-your-elbo.org b/posts/2019-02-14-raise-your-elbo.org
index f0de7d1..9cc79a7 100644
--- a/posts/2019-02-14-raise-your-elbo.org
+++ b/posts/2019-02-14-raise-your-elbo.org
@@ -137,7 +137,7 @@ $p(x_{1 : m}; \theta)$.
Represented as a DAG (a.k.a. the plate notations), the model looks like
this:
-[[/assets/resources/mixture-model.png]]
+[[/assets/mixture-model.png]]
where the boxes with $m$ mean repetition for $m$ times, since there are $m$
independent pairs of $(x, z)$, and the same goes for $\eta$.
@@ -309,7 +309,7 @@ $$p(d_i = u, x_i = w | z_i = k; \theta) = p(d_i ; \xi_k) p(x_i; \eta_k) = \xi_{k
The model can be illustrated in the plate notations:
-[[/assets/resources/plsa1.png]]
+[[/assets/plsa1.png]]
So the solution of the M-step is
@@ -380,7 +380,7 @@ pLSA1, $(x | z = k) \sim \text{Cat}(\eta_{k, \cdot})$.
Illustrated in the plate notations, pLSA2 is:
-[[/assets/resources/plsa2.png]]
+[[/assets/plsa2.png]]
The computation is basically adding an index $\ell$ to the computation
of SMM wherever applicable.
@@ -429,7 +429,7 @@ $$p(z_{i1}) = \pi_{z_{i1}}$$
So the parameters are $\theta = (\pi, \xi, \eta)$. And HMM can be shown
in plate notations as:
-[[/assets/resources/hmm.png]]
+[[/assets/hmm.png]]
Now we apply EM to HMM, which is called the
[[https://en.wikipedia.org/wiki/Baum%E2%80%93Welch_algorithm][Baum-Welch
@@ -592,7 +592,7 @@ later in this section that the posterior $q(\eta_k)$ belongs to the same
family as $p(\eta_k)$. Represented in plate notations, a fully
Bayesian mixture model looks like:
-[[/assets/resources/fully-bayesian-mm.png]]
+[[/assets/fully-bayesian-mm.png]]
Given this structure we can write down the mean-field approximation:
@@ -701,7 +701,7 @@ As the second example of fully Bayesian mixture models, Latent Dirichlet
allocation (LDA) (Blei-Ng-Jordan 2003) is the fully Bayesian version of
pLSA2, with the following plate notations:
-[[/assets/resources/lda.png]]
+[[/assets/lda.png]]
It is the smoothed version in the paper.
@@ -813,7 +813,7 @@ $$L(p, q) = \sum_{k = 1 : T} \mathbb E_{q(\theta_k)} \log {p(\theta_k) \over q(\
The plate notation of this model is:
-[[/assets/resources/dpmm.png]]
+[[/assets/dpmm.png]]
As it turns out, the infinities can be tamed in this case.