Objectives
You will gain detailed knowledge of
- Current understanding of generalization in neural networks vs classical statistical models.
- Optimization procedures for neural network models, such as stochastic gradient descent (SGD) and Adam (see the sketch after this list).
- Automatic differentiation and at least one software framework (PyTorch or TensorFlow), as well as an overview of other software approaches.
- Architectures deployed to handle different data types, such as images or sequences, including (a) convolutional networks and (b) recurrent networks.
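To give a flavour of these tools, the sketch below is a minimal, illustrative example (assuming a working PyTorch installation; the toy linear model and hyperparameters are placeholders, not course code). Automatic differentiation computes the gradients, and torch.optim.SGD applies the update; torch.optim.Adam is a drop-in alternative.

```python
import torch

# Toy data: y = 3x + 1, so the fitted parameters are easy to check.
x = torch.randn(100, 1)
y_true = 3.0 * x + 1.0

# Parameters tracked by autograd.
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# Swap in torch.optim.Adam([w, b], lr=0.1) to compare optimizers.
optimizer = torch.optim.SGD([w, b], lr=0.1)

for step in range(200):
    optimizer.zero_grad()                       # clear gradients from the last step
    loss = ((w * x + b - y_true) ** 2).mean()   # mean squared error
    loss.backward()                             # automatic differentiation
    optimizer.step()                            # gradient-based update of w and b

print(w.item(), b.item())  # approaches 3.0 and 1.0
```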
In addition, you will gain knowledge of more advanced topics reflecting recent research in machine learning, chosen from the following list.
- Approaches to unsupervised learning, including autoencoders and generative adversarial networks (see the sketch after this list).
- Techniques for deploying models in low-data regimes, such as transfer learning and meta-learning.
- Techniques for propagating uncertainty, such as Bayesian neural networks.
- Deployment of neural network models in hardware systems.
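As a similar pointer to the advanced material, here is a minimal, illustrative autoencoder in PyTorch (the layer sizes and the random input batch are placeholders, not course code): an encoder compresses the input to a low-dimensional code, a decoder reconstructs it, and training minimizes the reconstruction error.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Encoder compresses the input to a small code; decoder reconstructs it."""
    def __init__(self, dim=784, code=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, code), nn.ReLU())
        self.decoder = nn.Linear(code, dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.rand(64, 784)               # placeholder batch, e.g. flattened images
optimizer.zero_grad()
loss = ((model(x) - x) ** 2).mean()   # reconstruction error
loss.backward()
optimizer.step()                      # one training step
```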
Teaching Style
The course begins with the latest understanding of the theory of neural networks, contrasting it with classical accounts of generalization performance. We then move to practical examples of network architectures and their deployment, and end with more advanced topics reflecting current research.
Schedule
Week 1
Two lectures: Generalization and Neural architectures.
Week 2
Two lectures: Optimization: Stochastic Gradient Descent and Adam.
Week 3
Two lectures: Background: Automatic differentiation and GPU acceleration.
Week 4
Two lectures: Neural architectures: Convolutional neural networks.
Week 5
Two lectures: Neural architectures: Recurrent Neural Networks and LSTMs.
Weeks 6-8
Lectures from the following list of special topics.
Special topics
- Neural architectures: Autoencoders and Generative Adversarial Networks
- Hardware Implementations
- Reinforcement learning
- Transfer learning and meta-learning
- Uncertainty and Bayesian Neural Networks