# Core Courses

## Advanced Topics in Stochastic Analysis

Dr Andreas Sojmark
(Imperial College)

1. Stochastic Differential Equations
(a) Strong existence and uniqueness under general Lipschitz assumptions
(b) Results for càdlàg integrands and general semimartingales
(c) The strong Markov property

2. Weak convergence: Skorokhod and Wasserstein
(a) Skorokhod spaces; J1 topology
(b) Wasserstein metrics on probability measures
(c) Weak/u.c.p. stability of stochastic integrals and SDEs

3. McKean-Vlasov systems

(a) Laws of large numbers: De Finetti and Birkhoff
(b) Wasserstein arguments and propagation of chaos
(c) Nonlinear PDEs and martingale approach
(d) Common noise and connections to Zakai-type SPDEs
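As a toy illustration of topics 1 and 3, here is a minimal sketch (not part of the course materials) of an interacting particle system approximating a McKean-Vlasov SDE via Euler-Maruyama; the mean-reverting drift b(x, m) = -(x - m), coupling each particle to the empirical mean, is an illustrative assumption chosen for simplicity:

```python
import numpy as np

def simulate_mckean_vlasov(n_particles=1000, n_steps=200, T=1.0, sigma=0.5, seed=0):
    """Euler-Maruyama for dX_t = -(X_t - E[X_t]) dt + sigma dW_t,
    approximated by N particles coupled through their empirical mean
    (a simple illustration of the propagation-of-chaos setup)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = rng.standard_normal(n_particles)      # i.i.d. initial condition
    for _ in range(n_steps):
        mean = x.mean()                       # empirical measure enters via its mean
        x = x - (x - mean) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_particles)
    return x

particles = simulate_mckean_vlasov()
print(particles.mean(), particles.std())
```

As n_particles grows, each particle's law approaches the law of the limiting nonlinear SDE, which is the propagation-of-chaos statement in part 3(b).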

## Stochastic Partial Differential Equations

Dr Ajay Chandra & Dr Giuseppe Cannizzaro
(Imperial College)

This course will be an introduction to stochastic partial differential equations (SPDEs). The beginning of the class will provide the necessary background on functional analysis and infinite dimensional Gaussian measures. We will then turn to stochastic integration in infinite dimensions and show how to establish local existence and uniqueness for a class of linear and semi-linear SPDEs.

Prerequisites: analysis, functional analysis, probability, stochastic processes, and Itô calculus.

• Examples of SPDEs (particle systems, filtering, biology).
• Infinite dimensional Gaussian measures (definition, covariance operators, Fernique’s Theorem).
• Cameron-Martin theory.
• Cylindrical Wiener processes and stochastic integration in infinite dimensions.
• Semigroup theory, interpolation spaces, and stochastic convolutions.
• Local (and global) existence for linear and semilinear SPDEs.
• Long time behaviour of linear SPDEs and related questions.
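A minimal sketch of one of the simplest examples in this area: the stochastic heat equation du = u_xx dt + dW on (0, π) with Dirichlet boundary conditions, discretised by a spectral Galerkin truncation. Each Fourier sine mode solves an Ornstein-Uhlenbeck equation and can be stepped exactly; all parameter choices below are illustrative assumptions, not from the course:

```python
import numpy as np

def stochastic_heat_modes(n_modes=64, n_steps=100, T=0.1, seed=0):
    """Spectral Galerkin truncation of du = u_xx dt + dW: the k-th sine
    mode a_k solves the OU equation da_k = -k^2 a_k dt + d(beta_k),
    which admits the exact one-step update used below."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    k = np.arange(1, n_modes + 1)
    decay = np.exp(-k**2 * dt)                             # exact OU mean decay
    noise_std = np.sqrt((1 - np.exp(-2 * k**2 * dt)) / (2 * k**2))
    a = np.zeros(n_modes)                                  # start from u = 0
    for _ in range(n_steps):
        a = decay * a + noise_std * rng.standard_normal(n_modes)
    return a

modes = stochastic_heat_modes()
print(modes[:5])
```

At stationarity mode k has variance 1/(2k^2), so the coefficients decay in k; this invariant measure is exactly the kind of object studied in the last bullet point.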

## Introduction to Stochastic PDEs Term 1 - 2019 Course Info

## Theories of Deep Learning

Prof Jared Tanner (Oxford)

Deep learning is the dominant method for machines to perform classification tasks at reliability rates exceeding those of humans, as well as for outperforming world champions in games such as Go. Alongside the proliferating application of these techniques, practitioners have developed a good understanding of the properties that make deep nets effective: initial layers learn weights similar to those in dictionary learning, while deeper layers instantiate invariance to transforms such as dilation, rotation, and modest diffeomorphisms. A number of theories are now being developed to provide a mathematical foundation for these observations; this course will explore these varying perspectives.

Students will become familiar with the variety of architectures for deep nets, including the scattering transform and ingredients such as types of nonlinear transforms, pooling, convolutional structure, and how nets are trained. Students will focus their attention on learning a variety of theoretical perspectives on why deep networks perform as observed, with examples such as: dictionary learning and transferability of early layers, energy decay with depth, Lipschitz continuity of the net, how depth overcomes the curse of dimensionality, constructing adversarial examples, geometry of nets viewed through random matrix theory, and learning of invariance.
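One of the listed perspectives, Lipschitz continuity of the net, admits a very short illustration (a hedged sketch, not taken from the course): since ReLU is 1-Lipschitz, the product of the layer spectral norms bounds the Lipschitz constant of the whole network, which is one standard starting point for reasoning about adversarial examples:

```python
import numpy as np

def relu_net(weights, x):
    """Forward pass of a fully connected ReLU network x -> W_L relu(... relu(W_1 x))."""
    for W in weights[:-1]:
        x = np.maximum(W @ x, 0.0)
    return weights[-1] @ x

def lipschitz_upper_bound(weights):
    """ReLU is 1-Lipschitz, so the product of the layer spectral norms
    upper-bounds the Lipschitz constant of the composed network."""
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 8)), rng.standard_normal((4, 16))]
bound = lipschitz_upper_bound(weights)
print(bound)
```

The bound is generally loose, and sharper estimates for specific architectures are one of the questions the theoretical literature addresses.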

## Simulation Methods and Stochastic Algorithms

Prof Mike Giles (Oxford)

This course will be an introduction to simulation methods for a wide range of stochastic models. The emphasis is on the construction of the numerical approximations with a limited discussion of their accuracy and computational cost, and almost no stochastic numerical analysis.

The topics covered include:

• random and quasi-random number generation;
• basics of Monte Carlo and Quasi-Monte Carlo simulation;
• Multilevel Monte Carlo and Multilevel QMC methods;
• sensitivity analysis;
• stochastic differential equations;
• continuous-time Markov processes;
• stochastic PDEs;
• stochastic approximation;
• estimation of invariant measures;
• nested estimation.
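The first two bullets can be sketched in a few lines of plain Monte Carlo (the target quantity E[exp(W_1)] = exp(1/2) is an illustrative assumption, not from the course), including the standard-error estimate that exhibits the O(n^{-1/2}) convergence rate:

```python
import numpy as np

def mc_estimate(n_samples=100_000, seed=0):
    """Plain Monte Carlo estimate of E[exp(W_1)] = exp(1/2) ~ 1.6487,
    with a standard error illustrating the O(n^{-1/2}) rate."""
    rng = np.random.default_rng(seed)
    samples = np.exp(rng.standard_normal(n_samples))
    est = samples.mean()
    se = samples.std(ddof=1) / np.sqrt(n_samples)
    return est, se

est, se = mc_estimate()
print(f"estimate = {est:.4f} +/- {1.96 * se:.4f} (exact: {np.exp(0.5):.4f})")
```

Multilevel Monte Carlo, covered in the third bullet, reduces the cost of reaching a given accuracy by combining many cheap coarse-level samples with few expensive fine-level corrections.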