Show simple item record

dc.contributor.author: Jordan, Michael (en_US)
dc.contributor.author: Xu, Lei (en_US)
dc.date.accessioned: 2004-10-20T20:49:25Z
dc.date.available: 2004-10-20T20:49:25Z
dc.date.issued: 1995-04-21 (en_US)
dc.identifier.other: AIM-1520 (en_US)
dc.identifier.other: CBCL-111 (en_US)
dc.identifier.uri: http://hdl.handle.net/1721.1/7195
dc.description.abstract: We analyze the "Expectation-Maximization" (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for this matrix. We then analyze the convergence of EM in terms of special properties of $P$, and provide new results on the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for learning Gaussian mixture models. (en_US)
dc.format.extent: 9 p. (en_US)
dc.format.extent: 291671 bytes
dc.format.extent: 476864 bytes
dc.format.mimetype: application/postscript
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.relation.ispartofseries: AIM-1520 (en_US)
dc.relation.ispartofseries: CBCL-111 (en_US)
dc.subject: learning (en_US)
dc.subject: neural networks (en_US)
dc.subject: EM algorithm (en_US)
dc.subject: clustering (en_US)
dc.subject: mixture models (en_US)
dc.subject: statistics (en_US)
dc.title: On Convergence Properties of the EM Algorithm for Gaussian Mixtures (en_US)
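The abstract concerns EM for maximum likelihood learning of finite Gaussian mixtures. As a companion illustration, here is a minimal sketch of the EM iteration the paper analyzes, for a two-component one-dimensional mixture; the synthetic data, initialization, and variable names are illustrative assumptions, not taken from the paper (assumes NumPy is available):

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture.
# E-step: compute posterior responsibilities; M-step: closed-form
# updates of mixing weights, means, and variances.
import numpy as np

rng = np.random.default_rng(0)
# Illustrative synthetic data: two well-separated Gaussians
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

pi = np.array([0.5, 0.5])     # mixing weights
mu = np.array([-1.0, 1.0])    # component means
var = np.array([1.0, 1.0])    # component variances

def gauss(x, mu, var):
    # Gaussian density, broadcasting over components
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: r[n, k] = posterior probability of component k for x[n]
    joint = pi * gauss(x[:, None], mu, var)          # shape (N, 2)
    r = joint / joint.sum(axis=1, keepdims=True)
    # M-step: maximize the expected complete-data log-likelihood
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print("estimated means:", np.sort(mu))
```

Each iteration increases the data log-likelihood; the paper's contribution is to relate this update to the gradient of that likelihood via a projection matrix $P$, which this sketch does not compute.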

