
DMLA / 2007 Academic Year / Reading Group

Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.

ISBN: 978-0-387-31073-2

Schedule

  • Starts 7/6.
  • (Summer term) In principle, every Friday from 17:00.
  • (July-September) Tuesdays will also be used for this reading group.
  • (From October) Tuesdays from 17:00.

How We Proceed

  • Understanding the equations and figures is enough. Read the main text in order to understand the italicized terms, the equations, and the figures.
  • A single chapter may be shared by several presenters.
  • Drop-in participation, attending only the parts you are interested in, is welcome.
  • In principle, presenters may pick whichever chapter they like, regardless of chapter order.
    • However, some chapters build on others, so choose with care.
    • If you notice a dependency between chapters, please add it to the end of this page.

Expected Participants

yuusaku-t, kazuo-h, masahiko-h, manab-ki, hideharu-o, shimbo, matsu, yotaro-w, masayu-a, junta-m

Assignments

(If presentations finish early, later slots may be moved up.)

  • 7/6, 10 kazuo-h (Chapter 1)
  • 7/13, 16 manab-ki, masahiko-h (Chapter 3)
  • 7/20, 24 shimbo (Chapter 9)
  • 7/27, 31, 8/3, 21 kazuo-h (Chapter 2)
  • 8/21, 24, 31 matsu (Chapter 4)
  • 8/28, 9/4, 7 shimbo (Chapter 11)
  • 9/11, 18 shimpei-m (Chapter 6)
  • 10/9, 12, 16, 23 junta-m (Chapter 7)
  • 10/23, 30, 11/6, 16 yotaro-w (Chapter 8, first half)
  • 11/27, 12/4 inui (Chapter 8, second half)
  • 12/11, 18, 25, 1/8 manab-ki and others (Chapter 10)
    • 12/11, 18, 25 manab-ki (10.1, 10.2)
    • 12/25 ikumi-s (10.3), yotaro-w(10.4)
    • 1/8 manab-ki (10.5), junta-m (10.6)
    • 1/15 yotaro-w (10.7)
  • 1/22, 29, 2/5 hideharu-o (Chapter 13)
  • 2/12 shimbo (Chapter 12)
  • 2/26, 3/4 kazuo-h (Chapter 12)

Table of Contents

Section | Title | Page | Presenter(s) (2-4 OK) | Dates (tentative)
     | Contents | xiii | - | -
1    | Introduction | 1 | kazuo-h | 7/6, 10
1.1  | Example: Polynomial Curve Fitting | 4
1.2  | Probability Theory | 12
1.3  | Model Selection | 32
1.4  | The Curse of Dimensionality | 33
1.5  | Decision Theory | 38
1.6  | Information Theory | 48
2    | Probability Distributions | 67 | kazuo-h, matsu | 7/27, 31, 8/3, 21
2.1  | Binary Variables | 68
2.2  | Multinomial Variables | 74
2.3  | The Gaussian Distribution | 78
2.4  | The Exponential Family | 113
2.5  | Nonparametric Methods | 120
3    | Linear Models for Regression | 137 | manab-ki, masahiko-h | 7/13, 16
3.1  | Linear Basis Function Models | 138
3.2  | The Bias-Variance Decomposition | 147
3.3  | Bayesian Linear Regression | 152
3.4  | Bayesian Model Comparison | 161
3.5  | The Evidence Approximation | 165
3.6  | Limitations of Fixed Basis Functions | 172
4    | Linear Models for Classification | 179 | matsu | 8/21, 24
4.1  | Discriminant Functions | 181
4.2  | Probabilistic Generative Models | 196
4.3  | Probabilistic Discriminative Models | 203
4.4  | The Laplace Approximation | 213
4.5  | Bayesian Logistic Regression | 217
5    | Neural Networks | 225 | kazuo-h | deferred
5.1  | Feed-forward Network Functions | 227
5.2  | Network Training | 232
5.3  | Error Backpropagation | 241
5.4  | The Hessian Matrix | 249
5.5  | Regularization in Neural Networks | 256
5.6  | Mixture Density Networks | 272
5.7  | Bayesian Neural Networks | 277
6    | Kernel Methods | 291 | shimpei-m | 9/11, 18
6.1  | Dual Representations | 293
6.2  | Constructing Kernels | 294
6.3  | Radial Basis Function Networks | 299
6.4  | Gaussian Processes | 303
7    | Sparse Kernel Machines | 325 | junta-m | 10/5, 12
7.1  | Maximum Margin Classifiers | 326
7.2  | Relevance Vector Machines | 345
8    | Graphical Models | 359 | yotaro-w, inui | 10/23, 30
8.1  | Bayesian Networks | 360
8.2  | Conditional Independence | 372
8.3  | Markov Random Fields | 383
8.4  | Inference in Graphical Models | 393
9    | Mixture Models and EM | 423 | shimbo | 7/20, 24
9.1  | K-means Clustering | 424
9.2  | Mixtures of Gaussians | 430
9.3  | An Alternative View of EM | 439
9.4  | The EM Algorithm in General | 450
10   | Approximate Inference | 461 | manab-ki |
10.1 | Variational Inference | 462
10.2 | Illustration: Variational Mixture of Gaussians | 474
10.3 | Variational Linear Regression | 486
10.4 | Exponential Family Distributions | 490
10.5 | Local Variational Methods | 493
10.6 | Variational Logistic Regression | 498
10.7 | Expectation Propagation | 505
11   | Sampling Methods | 523 | shimbo | 8/28, 9/4, 7
11.1 | Basic Sampling Algorithms | 526
11.2 | Markov Chain Monte Carlo | 537
11.3 | Gibbs Sampling | 542
11.4 | Slice Sampling | 546
11.5 | The Hybrid Monte Carlo Algorithm | 548
11.6 | Estimating the Partition Function | 554
12   | Continuous Latent Variables | 559 | harendra-b |
12.1 | Principal Component Analysis | 561
12.2 | Probabilistic PCA | 570
12.3 | Kernel PCA | 586
12.4 | Nonlinear Latent Variable Models | 591
13   | Sequential Data | 605 | hideharu-o |
13.1 | Markov Models | 607
13.2 | Hidden Markov Models | 610
13.3 | Linear Dynamical Systems | 635
14   | Combining Models | 653 |  |
14.1 | Bayesian Model Averaging | 654
14.2 | Committees | 655
14.3 | Boosting | 657
14.4 | Tree-based Models | 663
14.5 | Conditional Mixture Models | 666

Dependencies Between Chapters

(Please correct any mistakes.)

  • Chapter 3
    • Cites equations from Chapter 2 in at least one place.
  • Chapter 7
    • Refers to results from Chapter 4 in many places.
    • Depends on Chapter 6.
  • Chapter 8
    • Can probably be read independently of Chapters 3-7(?)
  • Chapter 9
    • Mixtures of Gaussians: Section 2.3.9.
    • Ancestral sampling: Section 8.1.2.
    • Depends on Chapter 8 (graphical model figures), though hardly enough to cause problems.
    • Section 9.3.3: for the Bernoulli distribution, see Section 2.1 (p. 69).
    • Section 9.3.4 depends on Section 3.5; its latter half uses the Relevance Vector Machine of Section 7.2.
  • Chapter 10
    • Mixtures of Gaussians first appear in Chapter 9 (or Chapter 2?).
  • Chapter 11
    • Terms such as ancestral sampling (Chapter 8) appear, but the chapter can be read almost independently of the others.
  • Chapter 12
    • Kernel PCA depends on the kernels of Chapter 6.
    • Depends on Chapter 9, since the EM algorithm appears.