The EM Algorithm and Extensions

GEOFFREY J. McLACHLAN
THRIYAMBAKAM KRISHNAN

A Wiley-Interscience Publication
JOHN WILEY & SONS, INC.
New York • Chichester • Brisbane • Toronto • Singapore • Weinheim
This text is printed on acid-free paper.

Copyright © 1997 by John Wiley & Sons, Inc.

All rights reserved. Published simultaneously in Canada.

Reproduction or translation of any part of this work beyond that permitted by Section 107 or 108 of the 1976 United States Copyright Act without the permission of the copyright owner is unlawful. Requests for permission or further information should be addressed to the Permissions Department, John Wiley & Sons, Inc., 605 Third Avenue, New York, NY 10158-0012

Library of Congress Cataloging in Publication Data:

McLachlan, Geoffrey J., 1946-
The EM algorithm and extensions / G.J. McLachlan and T. Krishnan.
p. cm. - (Wiley series in probability and statistics. Applied probability and statistics)
Includes bibliographical references and index.
ISBN 0-471-12358-7 (alk. paper)
1. Expectation-maximization algorithms. 2. Estimation theory. 3. Missing observations (Statistics) I. Krishnan, T. (Thriyambakam), 1938- . II. Title. III. Series.
QA276.8.M394 1996
519.5'44-DC20
96-38417
CIP

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1
To
Beryl, Jonathan, and Robbie
Contents
Preface xiii

1. General Introduction 1

1.1 Introduction, 1
1.2 Maximum Likelihood Estimation, 3
1.3 Newton-Type Methods, 5
1.3.1 Introduction, 5
1.3.2 Newton-Raphson Method, 5
1.3.3 Quasi-Newton Methods, 6
1.3.4 Modified Newton Methods, 7
1.4 Introductory Examples, 9
1.4.1 Introduction, 9
1.4.2 Example 1.1: A Multinomial Example, 9
1.4.3 Example 1.2: Estimation of Mixing Proportions, 16
1.5 Formulation of the EM Algorithm, 21
1.5.1 EM Algorithm, 21
1.5.2 Example 1.3: Censored Exponentially Distributed Survival Times, 23
1.5.3 E- and M-Steps for Regular Exponential Family, 26
1.5.4 Example 1.4: Censored Exponentially Distributed Survival Times (Example 1.3 Continued), 27
1.5.5 Generalized EM Algorithm, 28
1.5.6 GEM Algorithm Based on One Newton-Raphson Step, 28
1.5.7 EM Gradient Algorithm, 30
1.5.8 EM Mapping, 30
1.6 EM Algorithm for Maximum a Posteriori and Maximum Penalized Likelihood Estimation, 30
1.6.1 Maximum a Posteriori Estimation, 30
1.6.2 Example 1.5: A Multinomial Example (Example 1.1 Continued), 31
1.6.3 Maximum Penalized Likelihood Estimation, 32
1.7 Brief Summary of the Properties of the EM Algorithm, 32
1.8 History of the EM Algorithm, 34
1.8.1 Work Before Dempster, Laird, and Rubin (1977), 34
1.8.2 EM Examples and Applications Since Dempster, Laird, and Rubin (1977), 36
1.8.3 Two Interpretations of EM, 37
1.8.4 Developments in EM Theory, Methodology, and Applications, 38
1.9 Overview of the Book, 41
1.10 Notation, 42
2. Examples of the EM Algorithm 45

2.1 Introduction, 45
2.2 Multivariate Data with Missing Values, 45
2.2.1 Example 2.1: Bivariate Normal Data with Missing Values, 45
2.2.2 Numerical Illustration, 49
2.2.3 Multivariate Data: Buck's Method, 50
2.3 Least Squares with Missing Data, 51
2.3.1 Healy-Westmacott Procedure, 51
2.3.2 Example 2.2: Linear Regression with Missing Dependent Values, 52
2.3.3 Example 2.3: Missing Values in a Latin Square Design, 54
2.3.4 Healy-Westmacott Procedure as an EM Algorithm, 55
2.4 Example 2.4: Multinomial with Complex Cell Structure, 57
2.5 Example 2.5: Analysis of PET and SPECT Data, 60
2.6 Example 2.6: Multivariate t-Distribution with Known Degrees of Freedom, 63
2.6.1 ML Estimation of Multivariate t-Distribution, 63
2.6.2 Numerical Example: Stack Loss Data, 67
2.7 Finite Normal Mixtures, 68
2.7.1 Example 2.7: Univariate Component Densities, 68
2.7.2 Example 2.8: Multivariate Component Densities, 71
2.7.3 Numerical Example: Red Blood Cell Volume Data, 72
2.8 Example 2.9: Grouped and Truncated Data, 74
2.8.1 Introduction, 74
2.8.2 Specification of Complete Data, 74
2.8.3 E-Step, 77
2.8.4 M-Step, 78
2.8.5 Confirmation of Incomplete-Data Score Statistic, 78