Postgraduate Course: Large Scale Optimization for Data Science (MATH11147)
Course Outline
School | School of Mathematics |
College | College of Science and Engineering |
Credit level (Normal year taken) | SCQF Level 11 (Postgraduate) |
Availability | Available to all students |
SCQF Credits | 10 |
ECTS Credits | 5 |
Summary | The detailed modelling of real-life problems requires a knowledgeable choice of the objective function and constraints, and often leads to very large optimization problems. The efficient solution of such problems is key to the success of optimization in practice.
Data Science provides numerous instances of problems which can be modelled using optimization. The amount of data in some of these models challenges existing optimization techniques and requires the development of new ones.
This course will address methods for constrained optimization under the assumption that exact (or approximate) second-order information (the Hessian of the Lagrangian) is available. The course will cover interior point methods (IPMs) for various classes of optimization problems, addressing their theory and implementation.
It will also cover the alternating direction method of multipliers (ADMM) and touch on the stochastic gradient (SG) methods used in deep learning.
The successful applications of these techniques to various Data Science problems from areas such as statistics, machine learning, engineering, energy and finance will be discussed.
The practical component of this course will consist of computing laboratory work using Matlab. These exercises will reinforce the theoretical analysis of problems, methods and their implementation. |
Course description |
- Unconstrained and Constrained Optimization (modelling issues: constraints in optimization)
- Interior Point Methods for linear, quadratic, nonlinear, second-order cone and semidefinite programming (motivation, theory, polynomial complexity, implementation). Newton Method and self-concordant barriers in optimization
- Implementational aspects of methods for very large scale optimization (sparse matrices, inexact Newton Method)
- Alternating Direction Method of Multipliers (ADMM). Stochastic Gradient (see the illustrative sketch below)
Data Science Applications:
- Statistics: regression, classification, discriminant analysis
- Machine learning: support vector machines
- Engineering: signal and image processing
- Finance: portfolio optimization, asset and liability management
|
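As an illustration of the kind of computing laboratory exercise involved, here is a minimal Matlab sketch of ADMM applied to the lasso problem min_x 0.5*||A*x - b||^2 + lambda*||x||_1; the penalty parameter rho, the weight lambda and the random data are illustrative assumptions rather than prescribed course material.

% Minimal ADMM sketch for the lasso problem (illustrative parameter values)
%   minimize 0.5*||A*x - b||^2 + lambda*||z||_1   subject to x = z
rng(1);
m = 200; n = 50;
A = randn(m, n);                                     % synthetic data for illustration
xtrue = zeros(n, 1); xtrue(1:5) = randn(5, 1);
b = A*xtrue + 0.01*randn(m, 1);
lambda = 0.1; rho = 1.0;                             % assumed values, not tuned
x = zeros(n, 1); z = zeros(n, 1); u = zeros(n, 1);   % u is the scaled dual variable
R = chol(A'*A + rho*eye(n));                         % factorize once, reuse every iteration
for k = 1:100
    x = R \ (R' \ (A'*b + rho*(z - u)));             % x-update: quadratic subproblem
    v = x + u;
    z = max(0, v - lambda/rho) - max(0, -v - lambda/rho);   % z-update: soft-thresholding
    u = u + x - z;                                   % dual update
end
fprintf('nonzero entries in z: %d\n', nnz(abs(z) > 1e-4));

Caching the Cholesky factor of A'*A + rho*I is what keeps each iteration cheap; for the very large instances discussed in the course, this dense factorization would give way to the sparse or inexact linear algebra covered under implementational aspects.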
Entry Requirements (not applicable to Visiting Students)
Pre-requisites |
Students MUST have passed:
Fundamentals of Optimization (MATH11111)
|
Co-requisites | |
Prohibited Combinations | |
Other requirements | None |
Information for Visiting Students
Pre-requisites | Visiting students are advised to check that they have studied the material covered in the syllabus of each prerequisite course before enrolling. |
High Demand Course? | Yes |
Course Delivery Information
Academic year 2022/23, Available to all students (SV1) |
Quota: None |
Course Start | Semester 2 |
Timetable | Timetable |
Learning and Teaching activities (Further Info) | Total Hours: 100 (Lecture Hours 18, Seminar/Tutorial Hours 5, Supervised Practical/Workshop/Studio Hours 4, Programme Level Learning and Teaching Hours 2, Directed Learning and Independent Learning Hours 71) |
Assessment (Further Info) | Written Exam 50 %, Coursework 50 %, Practical Exam 0 % |
Additional Information (Assessment) | Written Exam 50 %, Coursework 50 % |
Feedback |
Written feedback will be provided on coursework assignments. |
Exam Information |
Exam Diet | Paper Name | Hours & Minutes |
Main Exam Diet S2 (April/May) | Large Scale Optimization for Data Science (MATH11147) | 2:00 |
Learning Outcomes
On completion of this course, the student will be able to:
- Model real-life problems as optimization problems.
- Choose a solution method appropriate to the characteristics of a given problem and obtain a solution using Matlab-based utilities.
- Explain how complexity analysis can be used to assess the efficiency of optimization techniques.
- Demonstrate the action of optimization methods by solving illustrative problems on paper.
- Explain how the implementation of optimization methods yields problems in numerical linear algebra.
|
Reading List
Numerical Optimization, J. Nocedal and S. Wright, Springer, 2nd edition. ISBN-10: 038730303
Primal-Dual Interior-Point Methods, S. Wright, SIAM, Philadelphia. ISBN 0-89871 |
Additional Information
Graduate Attributes and Skills |
Not entered |
Keywords | ODS, Data Science |
Contacts
Course organiser | Prof Jacek Gondzio
Tel: (0131 6)50 8574
Email: |
Course secretary | Miss Gemma Aitchison
Tel: (0131 6)50 9268
Email: |