Published October 14, 2019
| Version v1
Technical note
Open
Notes on Semi-Supervised Expectation Maximization
Description
This work considers the Expectation Maximization (EM) algorithm in the semi-supervised setting. First, the general form of the semi-supervised maximum-likelihood objective is derived from the Latent Variable Model (LVM). Since the integrals involved are usually intractable, a surrogate objective function based on the Evidence Lower Bound (ELBO) is introduced. Next, the update equations of semi-supervised EM are derived. Finally, the concrete equations for fitting a Gaussian Mixture Model (GMM) using both labeled and unlabeled data are deduced.
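The procedure described above can be sketched in code. The following is a minimal illustration, not the note's own implementation: labeled points keep fixed one-hot responsibilities throughout, unlabeled points receive soft responsibilities in each E-step, and the M-step pools both sets. Function and variable names are assumptions made here for clarity.

```python
import numpy as np

def semi_supervised_gmm_em(X_l, y_l, X_u, K, n_iter=50):
    """Fit a K-component GMM from labeled data (X_l, y_l) and
    unlabeled data X_u via a semi-supervised EM loop (illustrative
    sketch; assumes every class appears at least once in y_l)."""
    D = X_l.shape[1]
    X = np.vstack([X_l, X_u])
    N = len(X)

    # Labeled points: responsibilities are fixed one-hot vectors.
    R_l = np.eye(K)[y_l]

    # Initialize parameters from the labeled subset.
    pi = np.full(K, 1.0 / K)
    mu = np.array([X_l[y_l == k].mean(axis=0) for k in range(K)])
    Sigma = np.array([np.eye(D) for _ in range(K)])

    def log_gauss(Z, m, S):
        # Log density of N(m, S) at each row of Z, via Cholesky.
        d = Z - m
        L = np.linalg.cholesky(S)
        z = np.linalg.solve(L, d.T)
        return (-0.5 * (z ** 2).sum(axis=0)
                - np.log(np.diag(L)).sum()
                - 0.5 * D * np.log(2 * np.pi))

    for _ in range(n_iter):
        # E-step: soft responsibilities for unlabeled points only.
        log_p = np.stack([np.log(pi[k]) + log_gauss(X_u, mu[k], Sigma[k])
                          for k in range(K)], axis=1)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        R_u = np.exp(log_p)
        R_u /= R_u.sum(axis=1, keepdims=True)

        # M-step: pool fixed labeled and soft unlabeled responsibilities.
        R = np.vstack([R_l, R_u])
        Nk = R.sum(axis=0)
        pi = Nk / N
        mu = (R.T @ X) / Nk[:, None]
        for k in range(K):
            d = X - mu[k]
            Sigma[k] = ((R[:, k, None] * d).T @ d) / Nk[k] + 1e-6 * np.eye(D)

    return pi, mu, Sigma
```

In this sketch the labeled data enter the complete-data log-likelihood with known component assignments, which is what distinguishes the semi-supervised update from the fully unsupervised EM iteration.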
Files
em.pdf (298.0 kB)
md5:a419280adb5a92c75444ccc8db8de886