Teaching language

English

Prerequisites

Signal processing, optimization, basic linear algebra, basic probability theory, mathematical analysis, and programming. The main programming language used during the course is Python (with various libraries, e.g. Keras with a TensorFlow backend), although students are also allowed to use Matlab or Octave.

Recommended prerequisites

Recommended previous knowledge: IKT 720 Optimization and IKT 719 Advanced Optimization (or similar courses elsewhere).

Course contents

In this course, we cover the mathematical and algorithmic foundations of deep neural networks (DNNs) and develop skills in designing and training them for different problems, placing deep learning in the context of Electrical Engineering. Deep learning has had a major impact on many applications, such as internet search, speech recognition, face recognition, computer vision, wireless communications, sensor networks, and self-driving cars.


The course covers the following main topics:

- Review of estimation and detection methods based on statistical models/descriptions, regression and classification, motivation for data-driven machine learning approaches.

- Types of machine learning problems and approaches, statistical learning, motivation for deep neural networks, how deep learning relates to the Electrical Engineering curriculum.

- Optimization and training in deep learning: LMS algorithm, stochastic gradient descent with mini-batches, loss functions and regularizers, activation functions, back-propagation, trainable parameters and hyper-parameters, universal approximation, vanishing gradients, momentum and optimizers (e.g. AdaGrad, RMSProp, Adam), dropout, batch normalization, hyper-parameter optimization.

- Deep learning design flow: collection, contamination, cleaning, and augmentation of data; synthetic data generation; designing features and dimensionality reduction (e.g. PCA, LDA); best practices for designing the overall data-processing flow.

- Convolutional deep neural networks: 1D and 2D CNNs, pooling and kernel selection, design architectures, complexity reduction methods, applications.

- Recurrent deep neural networks: concept of RNNs, general gating and filtering concepts, gated recurrent units (GRUs), Long Short-Term Memory (LSTM) networks, backpropagation through time (BPTT), recurrent networks for non-linear filtering.

- Other topics (time permitting): autoencoders and generative models, generative adversarial networks (GANs), Transformers, BERT, basics of deep reinforcement learning.

- Computing material: working with Python, training feed-forward DNNs using NumPy, training CNNs and RNNs using Keras, other deep learning computing frameworks (time permitting).
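As a taste of the NumPy-based feed-forward training listed above, the topics of mini-batch stochastic gradient descent and hand-derived back-propagation can be sketched in a few lines. The network size, learning rate, and synthetic sine-regression data below are illustrative assumptions, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: learn y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, size=(512, 1))
y = np.sin(X)

# One hidden layer of 16 tanh units (illustrative size).
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

lr, batch = 0.05, 32
for epoch in range(200):
    idx = rng.permutation(len(X))
    for s in range(0, len(X), batch):
        xb, yb = X[idx[s:s + batch]], y[idx[s:s + batch]]
        # Forward pass.
        h = np.tanh(xb @ W1 + b1)
        pred = h @ W2 + b2
        # Backward pass for the mean-squared-error loss.
        g = 2 * (pred - yb) / len(xb)
        gW2, gb2 = h.T @ g, g.sum(0)
        gh = g @ W2.T * (1 - h ** 2)      # tanh derivative
        gW1, gb1 = xb.T @ gh, gh.sum(0)
        # Mini-batch SGD update.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

In the course, the same forward/backward structure extends to deeper networks before moving to Keras, which automates the gradient computation.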

Learning outcomes

Upon successful completion of the course, the students should:

- Understand the basics and limitations of the classical estimation, detection and regression methods, as a motivation for the modern data-driven machine learning and deep learning methods.

- Understand the different machine learning methods and when deep learning approaches are the most appropriate.

- Understand the underlying concepts and properties related to learning by Deep Neural Networks (DNNs), including stochastic gradient methods, back-propagation equations, different training methods, concepts of trainable parameters and hyper-parameters, activation functions, losses, regularizers, and optimizers, and understand which choices are appropriate for a given problem.

- Be able to both generate data sets and work with publicly available online data sets using standard Python tools.

- Be able to train and evaluate the performance of common DNN architectures/types, such as deep feedforward, deep convolutional and deep recurrent neural networks (especially GRUs and LSTMs), analysing the impact of different design choices.

- Know how to formulate and apply the deep learning framework to solve practical problems in different application domains.
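The train-and-evaluate workflow named in the outcomes above follows a standard pattern: hold out a test split, fit on the training split, and report performance on the held-out data. A minimal sketch with NumPy, using a plain logistic-regression classifier and synthetic two-class Gaussian data as stand-ins for the course's DNN architectures and data sets (all names and sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data: Gaussian clouds centred at -1 and +1 in 2-D.
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(+1, 1, (200, 2))])
y = np.repeat([0, 1], 200)

# 80/20 train/test split.
idx = rng.permutation(len(X))
tr, te = idx[:320], idx[320:]

# Logistic regression (a "network" with no hidden layer),
# trained by full-batch gradient descent on the training split only.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X[tr] @ w + b)))   # sigmoid output
    g = p - y[tr]                            # cross-entropy gradient signal
    w -= 0.1 * X[tr].T @ g / len(tr)
    b -= 0.1 * g.mean()

# Evaluate only on the held-out test split.
acc = float(np.mean((X[te] @ w + b > 0) == (y[te] == 1)))
print(f"held-out accuracy: {acc:.2f}")
```

The same split/fit/evaluate discipline carries over unchanged when the classifier is replaced by the deep feedforward, convolutional, or recurrent models covered in the course.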

Examination requirements

Compulsory attendance is the only requirement.

Teaching methods

Lectures, homework exercises, final project or exam, self-study.

Assessment methods and criteria

The final grade is pass (A or B) or fail, based on 60% homework grade plus 40% final project/examination grade. Passing the course is contingent upon attending all lectures, successfully completing the homework problems, and completing the final project or examination. The final project grade is based on two equally weighted parts: a project report and an oral presentation. Both the report and the presentation are assessed on technical correctness and clarity of exposition.

Last updated from FS (Common Student System) June 30, 2024 9:38:39 PM