The EM Algorithm—an Old Folk-song Sung to a Fast New Tune


References (144)

Copyright
© 1997 Royal Statistical Society
ISSN
1369-7412
eISSN
1467-9868
DOI
10.1111/1467-9868.00082

Abstract

Celebrating the 20th anniversary of the presentation of the paper by Dempster, Laird and Rubin which popularized the EM algorithm, we investigate, after a brief historical account, strategies that aim to make the EM algorithm converge faster while maintaining its simplicity and stability (e.g. automatic monotone convergence in likelihood). First we introduce the idea of a ‘working parameter’ to facilitate the search for efficient data augmentation schemes and thus fast EM implementations. Second, summarizing various recent extensions of the EM algorithm, we formulate a general alternating expectation–conditional maximization (AECM) algorithm that couples flexible data augmentation schemes with model reduction schemes to achieve efficient computations. We illustrate these methods using multivariate t-models with known or unknown degrees of freedom and Poisson models for image reconstruction. We show, through both empirical and theoretical evidence, the potential for a dramatic reduction in computational time with little increase in human effort. We also discuss the intrinsic connection between EM-type algorithms and the Gibbs sampler, and the possibility of using the techniques presented here to speed up the latter. The main conclusion of the paper is that, with the help of statistical considerations, it is possible to construct algorithms that are simple, stable and fast.
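To make the abstract's running example concrete, the following is a minimal sketch (not the paper's implementation) of standard EM for a p-variate t-model with known degrees of freedom, written in NumPy. The function name em_t, the initialization, and the stopping rule are illustrative assumptions. The E-step computes each observation's conditional weight w_i = (nu + p) / (nu + delta_i), where delta_i is the squared Mahalanobis distance; the standard M-step divides the weighted scatter by n, whereas the efficient data augmentation scheme discussed in the paper (chosen via a working parameter) corresponds to dividing by the sum of the weights instead.

```python
import numpy as np

def em_t(X, nu, n_iter=200, tol=1e-8):
    """Sketch of standard EM for a p-variate t-model with known df nu.

    E-step: given current (mu, Sigma), each observation's latent mixing
    weight is w_i = (nu + p) / (nu + delta_i), with delta_i the squared
    Mahalanobis distance of x_i.
    M-step: update mu as the weighted mean and Sigma as the weighted
    scatter divided by n (standard EM). The faster variant discussed in
    the paper amounts to dividing by w.sum() instead (assumption noted
    in the lead-in, not verified here).
    """
    n, p = X.shape
    mu = X.mean(axis=0)                      # illustrative starting values
    Sigma = np.cov(X, rowvar=False)
    for _ in range(n_iter):
        diff = X - mu
        # Squared Mahalanobis distances delta_i for all observations
        delta = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
        w = (nu + p) / (nu + delta)          # E-step: conditional weights
        mu_new = (w[:, None] * X).sum(axis=0) / w.sum()
        diff = X - mu_new
        Sigma_new = (w[:, None] * diff).T @ diff / n   # standard EM M-step
        if np.max(np.abs(mu_new - mu)) < tol:
            mu, Sigma = mu_new, Sigma_new
            break
        mu, Sigma = mu_new, Sigma_new
    return mu, Sigma

# Example usage with synthetic data (hypothetical):
# rng = np.random.default_rng(0)
# X = rng.standard_t(df=5, size=(500, 2))
# mu_hat, Sigma_hat = em_t(X, nu=5)
```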

Journal

Journal of the Royal Statistical Society Series B (Statistical Methodology), Oxford University Press

Published: Jan 6, 2002

Keywords: Data augmentation; Expectation–conditional maximization algorithm; Expectation–conditional maximization either algorithm; Gibbs sampler; Incomplete data; Markov chain Monte Carlo method; Missing data; Model reduction; Multivariate t-distributions; Poisson model; Positron emission tomography; Rate of convergence; SAGE algorithm
