MAI4CAREU - Deep Learning - Attention and Transformers
The University of Cyprus's MSc in Deep Learning is part of the Master Programmes in Artificial Intelligence 4 Careers in Europe (MAI4CAREU). One of the Master's programme's courses, MAI642 - Deep Learning, is split into several lectures and is taught by Associate Professor Theocharis Theocharides. The ninth lecture, on Transformers & Attention, is the final lecture of the course's section on 'Deep Neural Networks – Deep Into Deep Learning'.
Learning outcomes
The lecture dives into RNNs and Long Short-Term Memory (LSTM) networks. Taking you step by step through LSTM, it also illustrates successful applications of LSTMs, such as sequence-to-sequence machine translation. For example, the slides provide a comprehensive walk-through of a seq2seq translation model with a biLSTM encoder and an LSTM decoder for French-to-English translation.
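To make the encoder side of such a model concrete, here is a minimal NumPy sketch of an LSTM cell and a bidirectional encoder pass, as used in the seq2seq setup the slides describe. All names, dimensions, and the random toy inputs are illustrative assumptions, not taken from the lecture material, and the decoder and training loop are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: input, forget, output gates plus candidate cell update."""
    z = W @ x + U @ h + b              # pre-activations for all four gates
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g                  # new cell state
    h = o * np.tanh(c)                 # new hidden state
    return h, c

def make_params(d_in, d_h):
    # Toy random initialisation (illustrative only)
    return (rng.normal(0, 0.1, (4 * d_h, d_in)),
            rng.normal(0, 0.1, (4 * d_h, d_h)),
            np.zeros(4 * d_h))

def bilstm_encode(xs, fwd, bwd, d_h):
    """Run one LSTM left-to-right and another right-to-left,
    concatenating their hidden states at each time step."""
    h_f = np.zeros(d_h); c_f = np.zeros(d_h)
    h_b = np.zeros(d_h); c_b = np.zeros(d_h)
    fwd_states, bwd_states = [], []
    for x in xs:
        h_f, c_f = lstm_step(x, h_f, c_f, *fwd)
        fwd_states.append(h_f)
    for x in reversed(xs):
        h_b, c_b = lstm_step(x, h_b, c_b, *bwd)
        bwd_states.append(h_b)
    bwd_states.reverse()
    return [np.concatenate([f, b]) for f, b in zip(fwd_states, bwd_states)]

d_in, d_h = 8, 16
xs = [rng.normal(size=d_in) for _ in range(5)]   # toy source-word embeddings
states = bilstm_encode(xs, make_params(d_in, d_h), make_params(d_in, d_h), d_h)
print(len(states), states[0].shape)   # 5 time steps, each a 32-dim context vector
```

An LSTM decoder would then consume these concatenated context vectors (typically via attention over them) to emit the target sentence one token at a time.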
The lecture focuses in particular on:
- Transformers
- Transformer architecture
- Encoders and decoders
- Attention: multi-head attention, self-attention
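The core operation behind these topics can be sketched compactly. Below is a minimal NumPy implementation of scaled dot-product self-attention and a multi-head wrapper; the variable names and toy dimensions are my own illustrative choices, not notation from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: every position attends to all positions."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq, seq) similarity matrix
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # weighted mix of value vectors

def multi_head(X, heads):
    """Run several attention heads in parallel and concatenate their outputs."""
    return np.concatenate([self_attention(X, *h) for h in heads], axis=-1)

seq, d_model, n_heads = 4, 16, 2
d_head = d_model // n_heads
X = rng.normal(size=(seq, d_model))           # toy token representations
heads = [tuple(rng.normal(0, 0.1, (d_model, d_head)) for _ in range(3))
         for _ in range(n_heads)]
out = multi_head(X, heads)
print(out.shape)   # (4, 16): same shape as the input, as in a Transformer block
```

In a full Transformer, this block would be followed by an output projection, residual connections, layer normalisation, and a position-wise feed-forward network.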
The lecture explores how these can be applied to language modelling, computer vision, speech, and reinforcement learning. By illustrating the theory with practical examples, case studies, and interactive diagrams, the slides offer students a comprehensive understanding of the course material. Professor Theocharides also provides a bibliography for further learning.
Course content and schedule
The MAI642 - Deep Learning course spans 13 weeks, with one lecture per week, covering 4 broader topics:
- Week 1-3: Basics
- Week 4: Fundamentals of Deep (Convolutional) Learning
- Week 5-9: Deep Neural Networks – Deep into deep learning
- Week 10-13: Emerging Deep Learning research and applications
The course then wraps up with a comprehensive final exam.