HMM - Machine Learning - Fall 2019
Two weeks on hidden Markov models as part of the Machine Learning class at the Department of Computer Science, Aarhus University.
Lectures
Slides are preliminary until the day of the lecture.
Wednesday, Nov 20: Hidden Markov Models. Terminology and basic algorithms.
Reading material: Bishop, sections 13.1-13.2 (excluding 13.2.1, 13.2.4, and 13.2.6).
Supplementary material: See Rabiner for an alternative presentation of HMM methods and applications. See the note about graphical models for background on probability theory and terminology.
Slides from lecture:
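To make the terminology concrete, here is a minimal sketch of the quantities the lecture introduces, using Bishop's notation: an initial distribution pi, a transition matrix A, an emission matrix phi, and the forward algorithm for computing the likelihood of an observed sequence. This is my own illustration, not code from the course.

```python
import numpy as np

def forward(pi, A, phi, x):
    """Return alpha, an N x K matrix with alpha[n, k] = p(x_1..x_n, z_n = k)."""
    N, K = len(x), len(pi)
    alpha = np.zeros((N, K))
    alpha[0] = pi * phi[:, x[0]]                      # base case
    for n in range(1, N):
        alpha[n] = phi[:, x[n]] * (alpha[n - 1] @ A)  # recursion over time steps
    return alpha

# Toy example: 2 hidden states, 2 observation symbols.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])      # A[j, k] = p(z_n = k | z_{n-1} = j)
phi = np.array([[0.7, 0.3],
                [0.1, 0.9]])    # phi[k, s] = p(x_n = s | z_n = k)
x = [0, 0, 1, 1]
alpha = forward(pi, A, phi, x)
print("p(x) =", alpha[-1].sum())  # likelihood of the whole sequence
```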
Friday, Nov 22: Hidden Markov Models. Implementing the basic algorithms.
Reading material: Bishop section 13.2.4.
Supplementary reading: You can also take a look at Appendix B - Floating Point Numbers from A. Tanenbaum, Structured Computer Organization, to see why we do not want numbers to become too small.
Slides from lecture:
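As a concrete illustration of the underflow problem the Tanenbaum appendix motivates: the forward probabilities shrink geometrically with sequence length, so a naive implementation underflows to 0.0 on long sequences. One common remedy, sketched below with variable names matching the forward sketch above, is to run the recursion in log space (Bishop 13.2.4 instead uses scaling factors; both address the same problem). This is an illustration, not the course's code.

```python
import numpy as np
from scipy.special import logsumexp

# For intuition: a per-step factor around 0.5 underflows an IEEE double
# after roughly 1075 steps, long before many real sequences end.
print(0.5 ** 2000)           # 0.0 in double precision
print(2000 * np.log(0.5))    # about -1386.3, perfectly representable

def log_forward(log_pi, log_A, log_phi, x):
    """Forward recursion in log space: log_alpha[n, k] = log p(x_1..x_n, z_n = k)."""
    N, K = len(x), len(log_pi)
    log_alpha = np.zeros((N, K))
    log_alpha[0] = log_pi + log_phi[:, x[0]]
    for n in range(1, N):
        # logsumexp replaces the sum of the plain-space recursion without underflow.
        log_alpha[n] = log_phi[:, x[n]] + logsumexp(
            log_alpha[n - 1][:, None] + log_A, axis=0)
    return log_alpha
```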
Wednesday, Nov 27: Hidden Markov Models. Training and selecting model parameters.
Reading material: Bishop section 13.2.1.
Slides from lecture:
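For reference, a hedged sketch of one Baum-Welch (EM) iteration, the training procedure Bishop 13.2.1 derives: the backward recursion, the posteriors gamma and xi, and the resulting parameter updates. It repeats the forward function from the first sketch so it runs standalone; it illustrates the update equations and is not the course's reference implementation.

```python
import numpy as np

def forward(pi, A, phi, x):                 # as in the first sketch above
    alpha = np.zeros((len(x), len(pi)))
    alpha[0] = pi * phi[:, x[0]]
    for n in range(1, len(x)):
        alpha[n] = phi[:, x[n]] * (alpha[n - 1] @ A)
    return alpha

def backward(A, phi, x):
    """beta[n, j] = p(x_{n+1}..x_N | z_n = j)."""
    N, K = len(x), A.shape[0]
    beta = np.ones((N, K))
    for n in range(N - 2, -1, -1):
        beta[n] = A @ (phi[:, x[n + 1]] * beta[n + 1])
    return beta

def baum_welch_step(pi, A, phi, x):
    """One EM update of (pi, A, phi) from a single observation sequence x."""
    alpha, beta = forward(pi, A, phi, x), backward(A, phi, x)
    px = alpha[-1].sum()
    gamma = alpha * beta / px               # gamma[n, k] = p(z_n = k | x)
    # xi[n, j, k] = p(z_n = j, z_{n+1} = k | x)
    xi = (alpha[:-1, :, None] * A[None] *
          (phi[:, x[1:]].T * beta[1:])[:, None, :]) / px
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_phi = np.zeros_like(phi)
    for s in range(phi.shape[1]):           # expected emission counts per symbol
        new_phi[:, s] = gamma[np.array(x) == s].sum(axis=0)
    new_phi /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_phi
```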
Friday, Nov 29: Hidden Markov Models. Selecting the initial model parameters, using HMMs for (simple) gene finding, and some useful extensions.
Slides from lecture:
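A simple HMM gene finder annotates a sequence by decoding its most likely hidden state path, which is what the Viterbi algorithm computes. Below is a small log-space sketch under the same (pi, A, phi) conventions as the sketches above; again an illustration, not course code.

```python
import numpy as np

def viterbi(pi, A, phi, x):
    """Most likely state path, computed in log space to avoid underflow."""
    N, K = len(x), len(pi)
    log_A, log_phi = np.log(A), np.log(phi)
    omega = np.zeros((N, K))            # omega[n, k] = best log-prob of a path ending in k
    back = np.zeros((N, K), dtype=int)  # backpointers for path recovery
    omega[0] = np.log(pi) + log_phi[:, x[0]]
    for n in range(1, N):
        scores = omega[n - 1][:, None] + log_A   # scores[j, k]: come from j, move to k
        back[n] = scores.argmax(axis=0)
        omega[n] = scores.max(axis=0) + log_phi[:, x[n]]
    path = [int(omega[-1].argmax())]             # trace back from the best final state
    for n in range(N - 1, 0, -1):
        path.append(int(back[n][path[-1]]))
    return path[::-1]
```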
Exercises
The exercise texts are preliminary until the Friday of the week before the exercises are planned.
Week 48 (25/11 - 29/11):
- Theoretical exercises: html, ipynb (the notebook refers to the image graphical-models.png; download it and put it in the same directory as the notebook to display it)
- Practical exercises: html, ipynb
Week 49 (2/12 - 6/12):
Material
- [Bishop]: Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
- [Rabiner]: Lawrence R. Rabiner. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE, 77(2):257-286, 1989.
- [Mailund]: Thomas Mailund. Conditional probabilities and graphical models.