Postgraduate Course: Probabilistic Modelling and Reasoning (INFR11050)
Course Outline
School | School of Informatics |
College | College of Science and Engineering |
Credit level (Normal year taken) | SCQF Level 11 (Postgraduate) |
Availability | Available to all students |
SCQF Credits | 10 |
ECTS Credits | 5 |
Summary | ***PLEASE NOTE: This course has been replaced by a 20-credit version, INFR11134 Probabilistic Modelling and Reasoning; please see the entry for that course.***
When dealing with real-world data, we often need to handle uncertainty. For example, short segments of a speech signal are ambiguous, and we need to take context into account in order to make sense of an utterance. Probability theory provides a rigorous method for representing and reasoning with uncertain knowledge. The course covers two main areas: (i) the process of inference in probabilistic reasoning systems and (ii) learning probabilistic models from data. Its aim is to provide a firm grounding in probabilistic modelling and reasoning, and a basis from which students can go on to develop their interests in more specific areas, such as data-intensive linguistics, automatic speech recognition, probabilistic expert systems, statistical theories of vision, etc. |
Course description |
* Introduction
* Probability
o events, discrete variables
o joint, conditional probability
* Discrete belief networks, inference
* Continuous distributions, graphical Gaussian models
* Learning: Maximum Likelihood parameter estimation
* Decision theory
* Hidden variable models
o mixture models and the EM algorithm (an illustrative sketch follows this section)
o factor analysis
o ICA, non-linear factor analysis
* Dynamic hidden variable models
o Hidden Markov models
o Kalman filters (and extensions)
* Undirected graphical models
o Markov Random Fields
o Boltzmann machines
* Information theory
o entropy, mutual information
o source coding, Kullback-Leibler divergence
* Bayesian methods for
o inference on parameters
o model comparison
Relevant QAA Computing Curriculum Sections: Artificial Intelligence
|
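As an illustration of the syllabus item on mixture models and the EM algorithm, the sketch below fits a two-component one-dimensional Gaussian mixture by EM in Python/NumPy. It is not course material: the synthetic data, the component count and the fixed iteration budget are assumptions made purely for the example.

```python
import numpy as np

# Illustrative sketch only: EM for a two-component 1-D Gaussian mixture.
# The data, initial values and iteration count are assumptions for the example.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 100)])

pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # component means
var = np.array([1.0, 1.0])     # component variances

def gaussian(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior responsibility of each component for each data point.
    r = pi * gaussian(x[:, None], mu, var)      # shape (N, 2)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibility-weighted data.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", pi, "means:", mu, "variances:", var)
```

Each E-step computes the posterior probability that each point was generated by each component; each M-step re-estimates the weights, means and variances from those responsibilities, which is guaranteed not to decrease the likelihood.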
Entry Requirements (not applicable to Visiting Students)
Pre-requisites | |
Co-requisites | |
Prohibited Combinations | |
Other requirements | This course is open to all Informatics students, including those on joint degrees. External students whose DPT does not list this course should seek special permission from the course organiser (lecturer).
Mathematics prerequisites:
1 - Probability theory: Discrete and continuous univariate random variables. Expectation, variance. Joint and conditional distributions.
2 - Linear algebra: Vectors and matrices: definitions, addition. Matrix multiplication, matrix inversion. Eigenvectors, determinants, quadratic forms.
3 - Calculus: Functions of several variables. Partial differentiation. Multivariate maxima and minima. Integration: need to know definitions, including multivariate integration.
4 - Special functions: Log, exp are fundamental.
5 - Geometry: Basics of lines, planes and hyperplanes. Coordinate geometry of circle, sphere, ellipse, ellipsoid and n-dimensional generalizations.
6 - Graph theory: Basic concepts and definitions: vertices and edges, directed and undirected graphs, trees, paths and cycles, cliques.
Programming prerequisite: A basic level of programming is assumed and not covered in lectures. The assessed assignment will involve some programming, probably in MATLAB. |
Information for Visiting Students
Pre-requisites | None |
High Demand Course? | Yes |
Course Delivery Information
Not being delivered |
Learning Outcomes
On completion of this course, the student will be able to:
- Define the joint distribution implied by directed and undirected probabilistic graphical models, and carry out inference in graphical models from first principles by hand and by using the junction tree algorithm
- Demonstrate understanding of maximum likelihood and Bayesian methods for parameter estimation by hand derivation of estimation equations for specific problems
- Critically discuss the differences between various latent variable models for data, and derive EM updates for such models (e.g. mixture models)
- Define entropy, joint entropy, conditional entropy, mutual information and expected code length (a small worked example follows this list)
- Demonstrate the ability to design, assess and evaluate belief network models, use MATLAB code implementing probabilistic graphical models, and conduct experimental investigations and draw conclusions from them
|
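As a small worked illustration of the information-theoretic quantities named in the outcomes above (the joint distribution below is made up for the example and is not course material):

```python
import numpy as np

# Illustrative sketch only: entropy and mutual information for a small
# made-up joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def H(p):
    """Entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

H_x, H_y, H_xy = H(p_x), H(p_y), H(p_xy.ravel())
mi = H_x + H_y - H_xy    # I(X;Y) = H(X) + H(Y) - H(X,Y)
cond = H_xy - H_x        # H(Y|X) = H(X,Y) - H(X)
print(f"H(X)={H_x:.3f}  H(Y)={H_y:.3f}  I(X;Y)={mi:.3f}  H(Y|X)={cond:.3f}")
```

The identities I(X;Y) = H(X) + H(Y) - H(X,Y) and H(Y|X) = H(X,Y) - H(X) are the standard relationships these definitions satisfy.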
Reading List
* The course text is "Pattern Recognition and Machine Learning" by C. M. Bishop (Springer, 2006).
* In addition, David MacKay's book "Information Theory, Inference and Learning Algorithms" (CUP, 2003) is highly recommended.
|
Contacts
Course organiser | Dr Amos Storkey
Tel: (0131 6)51 1208
Email: |
Course secretary | Ms Katey Lee
Tel: (0131 6)50 2701
Email: |