Postgraduate Course: Information Theory (INFR11087)
Course Outline
School | School of Informatics |
College | College of Science and Engineering |
Credit level (Normal year taken) | SCQF Level 11 (Postgraduate) |
Availability | Available to all students |
SCQF Credits | 10 |
ECTS Credits | 5 |
Summary | Information theory describes the fundamental limits on our ability to store, process and communicate data, whether in natural or artificial systems. Understanding and approaching these limits is important in a wide variety of topics in informatics.
This course covers the theory introduced by Shannon in 1948, which revolutionized how we think about information and communication, and some of the practical techniques for compression and reliable communication that have been developed since. |
Course description |
- Differential entropy and information content
- Source coding theorem
- Symbol codes, Kraft-McMillan inequality, Huffman codes (see the illustrative sketch below)
- Stream codes, adaptive models, arithmetic coding
- Compression in practice
- Relative entropy, mutual information, related inequalities
- Noisy channel coding theorem, channel capacity
- Error correcting codes
- Codes robust to erasures
- Lossy compression
- Hash codes
|
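The short Python sketch below is our own illustration of the flavour of two topics above, not part of the course materials (all function names and the example distribution are our own choices): it computes the entropy of a discrete source and builds a Huffman symbol code for it.

```python
# Illustrative sketch only: Shannon entropy of a discrete source and a
# Huffman prefix code built by repeatedly merging the two least probable
# symbols. Names and the example distribution are our own choices.
import heapq
from math import log2

def entropy(probabilities):
    """H(X) = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} for a dict symbol -> probability."""
    # Heap entries: (total probability, unique tie-breaker, partial codebook).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # least probable subtree
        p1, _, code1 = heapq.heappop(heap)   # second least probable subtree
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, next_id, merged))
        next_id += 1
    return heap[0][2]

if __name__ == "__main__":
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(source)
    expected_length = sum(source[s] * len(code[s]) for s in source)
    print("H(X) =", entropy(source.values()), "bits")           # 1.75
    print("code:", code, "expected length:", expected_length)   # 1.75 bits/symbol
```

For this dyadic source the expected code length equals the entropy exactly; in general a Huffman code's expected length lies within one bit of H(X), in line with the source coding theorem.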
Entry Requirements (not applicable to Visiting Students)
Pre-requisites |
|
Co-requisites | |
Prohibited Combinations | |
Other requirements | This course is open to all Informatics students, including those on joint degrees. External students for whom this course is not listed in their DPT should seek special permission from the course organiser.
- A solid mathematical background is required.
- Essential maths knowledge: the functions log and exp are fundamental; mathematical notation (such as sums) is used throughout; some calculus.
- Probability theory will be used extensively: random variables, expectation, Bernoulli trials, the Binomial distribution, joint and conditional probabilities (a short self-check sketch follows below).
- A basic level of programming is assumed and is not covered in lectures. The assessed assignment will involve programming in a language of your choice. |
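As a rough self-check of that probability background, the following sketch is our own illustration (the numbers are arbitrary, not from the course): it computes a Binomial pmf from Bernoulli trials, its expectation, and a simple conditional probability directly from the definitions.

```python
# Rough self-check of the assumed probability background (illustrative only):
# Bernoulli trials, the Binomial distribution, expectation, and a conditional
# probability, all computed directly from their definitions.
from math import comb

def binomial_pmf(k, n, p):
    """P(K = k) for K = number of successes in n independent Bernoulli(p) trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]

expectation = sum(k * pmf[k] for k in range(n + 1))   # equals n*p = 3.0
p_at_least_one = 1 - binomial_pmf(0, n, p)            # P(K >= 1)
# Conditional probability from the definition P(A|B) = P(A and B) / P(B):
p_even_given_positive = sum(pmf[k] for k in range(2, n + 1, 2)) / p_at_least_one

print(expectation, p_at_least_one, p_even_given_positive)
```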
Information for Visiting Students
Pre-requisites | Visiting students are required to have a background comparable to that assumed by the course prerequisites listed in the Degree Regulations & Programmes of Study. If in doubt, consult the course lecturer. |
High Demand Course? | Yes |
Course Delivery Information
Not being delivered |
Learning Outcomes
On completion of this course, the student will be able to:
- Explain the source coding and noisy channel coding theorems and describe their implications for the applications covered in lectures.
- Compute information-theoretic quantities, construct bounds, and describe and implement algorithms involving high-dimensional probability distributions (see the sketch after this list).
- Describe the techniques covered in the course: identify their limitations, discuss their practical merits, and design and describe alternatives.
- For a novel data source, communication channel or application, identify relevant information-theoretic aspects to provide insight or suggest useful methods.
|
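As an example of the kind of computation the second outcome refers to, the sketch below is our own illustration (not course material): it evaluates the mutual information I(X;Y) across a binary symmetric channel and compares it with the channel capacity C = 1 - H2(f).

```python
# Illustrative sketch (our own example): mutual information I(X;Y) for a
# binary symmetric channel with flip probability f and input P(X=1) = q,
# compared with the capacity C = 1 - H2(f) attained at q = 0.5.
from math import log2

def H2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(q, f):
    """I(X;Y) = H(Y) - H(Y|X); for a BSC, H(Y|X) = H2(f) whatever the input."""
    p_y1 = q * (1 - f) + (1 - q) * f   # P(Y = 1)
    return H2(p_y1) - H2(f)

f = 0.1
capacity = 1 - H2(f)
print("I(X;Y) at q=0.3:", bsc_mutual_information(0.3, f))
print("capacity (q=0.5):", bsc_mutual_information(0.5, f), "=", capacity)
```

Sweeping q confirms that the mutual information is maximised by the uniform input, the capacity-achieving distribution for this channel.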
Reading List
ESSENTIAL: MacKay, D. J. C., "Information Theory, Inference and Learning Algorithms", Cambridge University Press, 2003. http://www.inference.phy.cam.ac.uk/mackay/itila/book.html
BACKGROUND ONLY: Cover, T. M. and Thomas, J. A., "Elements of Information Theory", 2nd edition, Wiley, 2006.
|
Contacts
Course organiser | Dr Stratis Viglas
Tel: (0131 6)50 5183
Email: |
Course secretary | Ms Sarah Larios
Tel: (0131 6)51 1514
Email: |