Postgraduate Course: Probability, Estimation Theory and Random Signals (PETARS) (MSc) (PGEE11164)
Course Outline
School | School of Engineering |
College | College of Science and Engineering |
Credit level (Normal year taken) | SCQF Level 11 (Postgraduate) |
Availability | Available to all students |
SCQF Credits | 20 |
ECTS Credits | 10 |
Summary | The Probability, Estimation Theory, and Random Signals course introduces the fundamental statistical tools that are required to analyse and describe advanced signal processing algorithms within the MSc Signal Processing and Communications programme. It provides a unified mathematical framework for describing random events and signals, and for characterising the key properties of random processes.
The course covers probability theory, considers the notion of random variables and vectors and how they can be manipulated, and provides an introduction to estimation theory. It is demonstrated that many estimation problems, and therefore signal processing problems, can be reduced to an exercise in either optimisation or integration. While these problems can be solved using deterministic numerical methods, the course introduces the notion of Monte Carlo techniques, which are the basis of powerful stochastic optimisation and integration algorithms. These methods rely on being able to sample numbers, or variates, from arbitrary distributions. The course therefore discusses the techniques needed to understand these methods and, if time permits, considers techniques for random number generation. The random signals aspect of the course considers representing real-world signals by stochastic or random processes. Statistical quantities such as autocorrelation and auto-covariance are extended from random vectors to random processes (time series), and a frequency-domain analysis framework is developed. The course also investigates the effect of systems and transformations on time series, and how they can be used to help design powerful statistical signal processing algorithms to achieve a particular task.
The course introduces the notion of representing signals using parametric models, and extends the broad topic of statistical estimation theory to determining optimal model parameters. In particular, the Bayesian paradigm for statistical parameter estimation is introduced. Emphasis is placed on relating these concepts to state-of-the-art applications and signals. This course provides the fundamental knowledge required for the advanced signal, image, and communication courses in the MSc programme.
|
Course description |
Any minor modifications to the latest syllabus and lectures are contained in the lecture handout.
Course Introduction, Motivation, Prerequisites (2 lectures):
1. Motivating the field of statistical signal processing, along with the role of probability, random variables, and estimation theory as a consistent mathematical analysis framework.
2. Examples of modern signal processing applications.
3. Function norms (signal measures), the Fourier transform, and Laplace transform (revision).
Scalar Random Variables (4 lectures):
1. Notion of a random variable and its formal definition involving experimental outcomes, sample space, probability of events, and assigned values; the concept of the cumulative distribution function (cdf), the probability density function (pdf), and their formal properties.
2. Discrete random variables (RVs), their probability mass function (pmf), the corresponding cdfs and pdfs, as well as mixtures of continuous and discrete random variables.
3. Examples of several common discrete and continuous RVs and their pdfs.
4. Introduction to the probability transformation rule through a conceptual derivation, with examples; the standard result is stated after this list.
5. Expectations, moments, central moments, and higher-order statistics and cumulants.
6. The characteristic function, the moment generating function (MGF), properties and examples.
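For reference, a standard statement of the transformation rule in item 4 (the notation is illustrative): if $Y = g(X)$ for a monotonic, differentiable function $g$, then

$$ f_Y(y) = f_X\bigl(g^{-1}(y)\bigr)\,\left|\frac{\mathrm{d}\,g^{-1}(y)}{\mathrm{d}y}\right|. $$

For example, $Y = aX + b$ with $a \neq 0$ gives $f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y-b}{a}\right)$.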
Random Vectors and Multiple Random Variables (7 lectures):
1. Generalisation of the theory on scalar random variables to multiple random variables.
2. Introduction to the concept and formal definition of a random vector, along with the notion of a joint cdf, joint pdf; cover the properties of joint cdfs and pdfs, the probability of arbitrary events, and calculating joint cdfs from joint pdfs.
3. Introduces marginal cdfs and pdfs, independence, conditional densities, and Bayes's theorem.
4. Popular examples of dependent random variables, including the Monty Hall problem.
5. The probability transformation rule, including calculating the Jacobian, auxiliary variables, and the sum of independent RVs. Examples include the Cartesian-to-polar coordinate transformation.
6. Statistical descriptions of random vectors, including the mean and correlation matrices, cross-correlation, and cross-covariance. Covers the properties of correlation and covariance matrices, and determining whether a correlation or covariance matrix is a valid one.
7. Considers the special case of linear transformations of random vectors; the effect of linear transformations on statistical properties; invariance of the expectation operator.
8. Normally distributed random vectors; derivation of the Gaussian integral identity; the two envelopes problem/paradox; properties of the multivariate Gaussian.
9. Characteristic functions and MGFs for random vectors.
10. Analysis of the sum of independent random variables, and the central limit theorem (CLT), using characteristic functions; MATLAB demonstration of the CLT (a sketch of the same demonstration follows this list).
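The CLT demonstration above is given in MATLAB in the lectures; the following is a minimal Python sketch of the same idea (illustrative only, not the course code; the choice of uniform variates and all parameter values are assumptions): standardised sums of i.i.d. uniform variates are compared against the standard normal density.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
N, trials = 30, 100_000               # N variates summed per trial

# Sum N i.i.d. Uniform(0, 1) variates; each sum has mean N/2 and variance N/12.
sums = rng.random((trials, N)).sum(axis=1)
z = (sums - N / 2) / np.sqrt(N / 12)  # standardise to zero mean, unit variance

# Compare the histogram of the standardised sums with the N(0, 1) density.
x = np.linspace(-4, 4, 200)
plt.hist(z, bins=80, density=True, alpha=0.5, label="standardised sums")
plt.plot(x, np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), label="N(0, 1) pdf")
plt.legend()
plt.show()
```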
Principles of Estimation Theory (7 lectures):
1. General introduction to parameter estimation, set in the context of repeated observations of an experiment, possibly as a function of time. Covers examples such as the taxi-cab problem.
2. Properties of estimators: bias, variance, mean-squared error (MSE), and the Cramér-Rao lower bound (CRLB). Discusses the notion of a likelihood function. Includes examples such as the sample mean and sample variance.
3. Efficiency of an estimator, consistency, and estimating multiple parameters.
4. Maximum-likelihood estimator, the principle of least squares.
5. Linear least squares and the normal equations (a worked sketch follows this list).
6. Introduction to Bayesian estimation: priors, marginalisation, posterior distributions.
7. Overview of problems of optimisation and marginalisation in practice.
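To make item 5 concrete, here is a minimal Python sketch (the data model and variable names are illustrative assumptions, not course material) of linear least squares via the normal equations $(H^\top H)\hat{\theta} = H^\top x$ for the linear model $x = H\theta + w$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: a straight line observed in additive noise, x = H @ theta + w.
n = np.arange(50, dtype=float)
H = np.column_stack([np.ones_like(n), n])   # design matrix [1, n]
theta_true = np.array([2.0, 0.5])
x = H @ theta_true + rng.normal(scale=1.0, size=n.size)

# Normal equations: (H^T H) theta_hat = H^T x.
theta_hat = np.linalg.solve(H.T @ H, H.T @ x)
print(theta_hat)   # close to [2.0, 0.5]
```

In practice a QR factorisation (e.g. `numpy.linalg.lstsq`) is preferred numerically, but solving the normal equations directly mirrors the textbook derivation.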
Review of Discrete-time Systems (Self-learning material).
1. Introduction and module overview. Role of deterministic and random signals, and the various interpretations of random processes in the different physical sciences.
2. Brief review of Fourier transform theory: a. Transforms for continuous-time, discrete-time, and periodic or aperiodic signals. b. Parseval's theorem. c. Properties of the discrete Fourier transform (DFT). d. The DFT as a linear transformation. e. Summary of frequently used transform pairs.
3. Review of discrete-time systems. a. Basic discrete-time signals. b. The z-transform and basic properties. c. Summary of frequently used transform pairs. d. Definitions of linear time-invariant (LTI) and linear time-varying (LTV) systems. e. Rational transfer functions; pole-zero models. f. Frequency response of LTI systems. g. Examples of inverse bilateral z-transforms, and different approaches to obtaining the same answer; partial fraction expansions using the cover-up rule.
Stochastic Processes (6 lectures).
1. Introduction to stochastic processes, and their definition as an ensemble of deterministic realisations indexed by the outcomes of a sample space; also covers the various interpretations of the samples of a random process.
2. Covers predictable processes with an example of harmonic processes; description of stochastic processes using probability density functions (pdfs).
3. Notion of stationary and nonstationary processes.
4. Statistical description of random processes; examples of some predictable processes through a MATLAB demonstration; second-order statistics, including mean and autocorrelation sequences, with an example of calculating the autocorrelation of a harmonic process (the closed-form result is given after this list).
5. Types of random processes, including independent, independent and identically distributed (i.i.d.), and uncorrelated and orthogonal processes.
6. Introduction to stationary processes, including order-N stationary, strict-sense stationary, and wide-sense stationary processes; example of testing whether a Wiener process is stationary; also covers wide-sense periodic and wide-sense cyclo-stationary processes.
7. Notion of ergodicity, and of time averages being equal to ensemble averages in the mean-square sense.
8. Second-order statistical descriptions, including autocorrelation and covariances; joint-signal statistics; types of joint stochastic processes; correlation matrices.
9. Basic introduction to Markov processes.
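The harmonic-process example in item 4 has a standard closed-form answer: for $x[n] = A\cos(\omega_0 n + \phi)$ with $\phi$ uniformly distributed on $[0, 2\pi)$, the mean is zero and

$$ r_x(l) = \mathrm{E}\{x[n+l]\,x[n]\} = \frac{A^2}{2}\cos(\omega_0 l), $$

which depends only on the lag $l$, so the process is wide-sense stationary.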
Frequency-Domain Description of Stationary Processes (3 lectures).
1. Introduction to random processes in the frequency domain, including the stochastic decomposition interpretation, the transform of averages interpretation, and the connections between these interpretations.
2. Formal definition of the power spectral density (PSD) and its properties (the defining relation is stated after this list); general form of the PSD, including autocorrelation sequences (ACSs) with periodic components; the PSD of a harmonic signal (as a linear summation of sinusoids).
3. The PSD of common stationary processes: white noise, harmonic processes, and complex exponentials.
4. Definition of the cross-power spectral density (CPSD), a physical overview, and the properties of the CPSD; an overview of complex spectral density functions, their relationships with PSDs, and how to find their inverses; properties of complex spectral densities.
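The defining relation for the PSD in item 2 is the discrete-time Fourier transform of the autocorrelation sequence of a wide-sense stationary process:

$$ S_x(e^{j\omega}) = \sum_{l=-\infty}^{\infty} r_x(l)\, e^{-j\omega l}. $$

For example, white noise with $r_x(l) = \sigma_x^2\,\delta(l)$ has the flat spectrum $S_x(e^{j\omega}) = \sigma_x^2$.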
Linear systems with stationary random inputs (3 lectures).
1. Considers the effect of linear systems on random processes, and the resulting output processes; discusses the linearity of the expectation operator.
2. Develops the basic relationships between the input and output for stationary random processes, including input-output cross-correlation, output autocorrelation, and output power; discusses the case of LTI systems, and notes that most real-world systems are LTV (the key LTI relations are summarised after this list).
3. System identification using cross-correlation.
4. Frequency-domain analysis of LTI systems, including input-output CPSD and output PSD.
5. Equivalence of time-domain and frequency-domain methods.
6. LTV systems with nonstationary inputs.
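For a stable LTI system with impulse response $h[n]$ and frequency response $H(e^{j\omega})$ driven by a wide-sense stationary input $x[n]$, the relations in items 2-4 can be summarised (using one common sign convention) as

$$ r_{yx}(l) = h(l) * r_x(l), \qquad r_y(l) = h(l) * h^*(-l) * r_x(l), \qquad S_y(e^{j\omega}) = |H(e^{j\omega})|^2\, S_x(e^{j\omega}). $$

In particular, for a white input with variance $\sigma_x^2$, $r_{yx}(l) = \sigma_x^2\, h(l)$, which is the basis of the cross-correlation approach to system identification in item 3.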
Linear signal models (2 lectures).
1. Introduction to the notion of parametric modelling.
2. Nonparametric vs parametric signal models.
3. Types of pole-zero models.
4. All-pole models: impulse response, autocorrelation functions, poles, minimum-phase conditions.
5. Linear prediction, autoregressive (AR) processes, and the Yule-Walker equations (a sketch of solving them follows this list).
6. All-zero models: impulse response, autocorrelation functions, zeros, and moving average (MA) processes.
7. Pole-Zero models: autocorrelation functions, autoregressive moving average (ARMA) processes.
8. Overview of extension to time-varying processes.
9. Applications and examples.
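Item 5 names the Yule-Walker equations; the following is a minimal Python sketch (the model order, coefficients, and seed are illustrative assumptions, not course material) that simulates an AR(2) process, estimates its autocorrelation, and solves the resulting Toeplitz system for the AR coefficients.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

rng = np.random.default_rng(2)

# Simulate a stable AR(2) process: x[n] = 1.5 x[n-1] - 0.7 x[n-2] + w[n].
N = 20_000
w = rng.normal(size=N)
x = np.zeros(N)
for n in range(2, N):
    x[n] = 1.5 * x[n - 1] - 0.7 * x[n - 2] + w[n]

# Biased sample autocorrelation estimates at lags 0..P.
P = 2
r = np.array([x[: N - l] @ x[l:] / N for l in range(P + 1)])

# Yule-Walker: solve the symmetric Toeplitz system R a = [r(1), ..., r(P)]^T,
# where R has first column [r(0), ..., r(P-1)].
a = solve_toeplitz(r[:P], r[1 : P + 1])
print(a)   # close to [1.5, -0.7]
```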
Estimation Theory for Random Processes (3 lectures).
1. Sample autocorrelation and auto-covariance functions (the standard biased estimator is stated after this list).
2. Least-squares for AR modelling.
3. Estimating signals in noise, using parametric signal models.
4. Bayesian estimation of sinusoids in noise, and other applications of Bayesian estimation methods to time-series analysis.
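For reference, the (biased) sample autocorrelation estimator in item 1, for observations $x[0], \ldots, x[N-1]$ and lag $0 \le l \le N-1$, is

$$ \hat{r}_x(l) = \frac{1}{N} \sum_{n=0}^{N-1-l} x[n+l]\, x^*[n]. $$

Dividing by $N$ rather than $N - l$ introduces bias but reduces variance and guarantees a positive semi-definite autocorrelation estimate.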
|
Information for Visiting Students
Pre-requisites | None |
High Demand Course? | Yes |
Course Delivery Information
|
Academic year 2022/23, Available to all students (SV1)
|
Quota: None |
Course Start | Semester 1 |
Timetable |
Learning and Teaching activities (Further Info) |
Total Hours: 200 (Lecture Hours 44, Seminar/Tutorial Hours 22, Feedback/Feedforward Hours 11, Programme Level Learning and Teaching Hours 4, Directed Learning and Independent Learning Hours 119) |
Assessment (Further Info) |
Written Exam 100%, Coursework 0%, Practical Exam 0% |
Additional Information (Assessment) |
Written Exam: 100%, Coursework: 0%, Practical Exam: 0% |
Feedback |
Examples classes, tutorials, office hours, and a mock examination. |
Exam Information |
Exam Diet | Paper Name | Hours & Minutes |
Main Exam Diet S1 (December) | | 3:00 |
Learning Outcomes
On completion of this course, the student will be able to:
- Define, understand, and manipulate scalar and multiple random variables using the theory of probability; this includes the basic tools of probability transformations, characteristic functions, moments, the sum of random variables, and the central limit theorem (CLT) and its use in estimation theory.
- Understand the principles of estimation theory and estimation techniques such as maximum-likelihood, least squares, minimum variance unbiased (MVUE), and Bayesian estimation; be able to characterise an estimator using standard metrics, including the Cramér-Rao lower bound (CRLB).
- Explain, describe, and understand the notion of a random process and statistical time series, and characterise them in terms of their statistical properties.
- Define, describe, and understand the notion of the power spectral density of stationary random processes, and be able to analyse and manipulate it; analyse, in both the time and frequency domains, the effect of transformations and linear systems on random processes, both in terms of density functions and statistical moments.
- Explain the notion of parametric signal models, and describe common regression-based signal models in terms of their statistical characteristics and their effect on random signals; apply least squares, maximum-likelihood, and Bayesian estimators to model-based signal processing problems.
|
Reading List
Recommended course textbook: Therrien C. W. and M. Tummala, Probability and Random Processes for Electrical and Computer Engineers, Second edition, CRC Press, 2011. Hardback, ISBN-10: 1439826986, ISBN-13: 978-1439826980.
Manolakis D. G., V. K. Ingle, and S. M. Kogon, Statistical and Adaptive Signal Processing: Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing, McGraw Hill, Inc., 2000.
Kay S. M., Fundamentals of Statistical Signal Processing: Estimation Theory, Prentice-Hall, Inc., 1993.
Papoulis A. and S. Pillai, Probability, Random Variables, and Stochastic Processes, Fourth edition, McGraw Hill, Inc., 2002.
|
Additional Information
Graduate Attributes and Skills |
Not entered |
Keywords | Probability, Random Variables, Estimation Theory, Random and Stochastic Signals, Numerical Methods |
Contacts
Course organiser | Dr James Hopgood
Tel: (0131 6)50 5571
Email: |
Course secretary | Miss Jo Aitkenhead
Tel: (0131 6)50 5532
Email: |