Artificial Neural Networks and Deep Learning
The following are last-minute news items you should be aware of ;-)
- 19/09/2019: No lecture on 20/09/2019 ... check the detailed schedule.
- 19/09/2019: The course starts today!
Course Aim & Organization
Neural networks are mature, flexible, and powerful non-linear data-driven models that have been successfully applied to solve complex tasks in science and engineering. The advent of the deep learning paradigm, i.e., the use of (neural) networks to simultaneously learn an optimal data representation and the corresponding model, has further boosted neural networks and the data-driven paradigm.
Nowadays, deep neural networks can outperform traditional hand-crafted algorithms, achieving human-level performance in solving many complex tasks such as natural language processing, text modeling, gene expression modeling, and image recognition. The course provides a broad introduction to neural networks (NN), starting from the traditional feedforward (FFNN) and recurrent (RNN) neural networks, up to the most successful deep learning models such as convolutional neural networks (CNN) and long short-term memory networks (LSTM).
The major goal of the course is to provide students with the theoretical background and the practical skills to understand and use NN, and at the same time to become familiar with Deep Learning for solving complex engineering problems.
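To give a flavor of the models mentioned above, here is a minimal sketch of a small CNN image classifier; the choice of Keras, the architecture, and all hyperparameters are illustrative assumptions, not course material.

import tensorflow as tf
from tensorflow.keras import layers, models

# Minimal sketch (illustrative assumptions, not course code): a small CNN
# that jointly learns a data representation (convolutional layers) and the
# corresponding model (dense classifier head), as described above.
model = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    layers.MaxPooling2D(),                    # spatial downsampling
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),      # learned representation
    layers.Dense(10, activation="softmax"),   # probabilities over 10 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=10)  # x_train, y_train: your dataset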
Teachers
The course consists of a blend of lectures and exercises given by the course teachers and a teaching assistant.
- Matteo Matteucci: the course teacher
- Giacomo Boracchi: the course teacher
- Francesco Lattari: the course teaching assistant
Course Program
This goal is pursued in the course by:
- Presenting major theoretical results underpinning NN (e.g., universal approximation, vanishing/exploding gradients, etc.)
- Describing the most important algorithms for NN training (e.g., backpropagation, adaptive gradient algorithms, etc.; see the sketch after this list)
- Illustrating the best practices to successfully train and use these models (e.g., dropout, data augmentation, etc.)
- Providing an overview of the most successful Deep Learning architectures (e.g., CNNs, sparse and dense autoencoders, LSTMs for sequence-to-sequence learning, etc.)
- Providing an overview of the most successful applications, with particular emphasis on models for solving visual recognition tasks.
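To make the training-algorithms item above concrete, the following is a minimal sketch of backpropagation with plain gradient descent on a one-hidden-layer network; the toy data, network size, and learning rate are illustrative assumptions, not course code.

import numpy as np

# Toy dataset: classify points by the sign of x0 * x1 (XOR-like problem).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer: 2 inputs -> 8 tanh units -> 1 sigmoid output.
W1 = 0.5 * rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 1)); b2 = np.zeros(1)
lr = 0.5  # learning rate (illustrative)

for epoch in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    # Backward pass: gradients of the binary cross-entropy loss.
    dz2 = (p - y) / len(X)            # error at the output unit
    dW2 = h.T @ dz2; db2 = dz2.sum(axis=0)
    dh = (dz2 @ W2.T) * (1 - h**2)    # chain rule through tanh
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Plain gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

Adaptive gradient algorithms such as those mentioned above replace the plain update step with per-parameter step sizes.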
Detailed course schedule
A detailed schedule of the course can be found here; topics are indicative, while days and teachers are correct up to last-minute changes (I will notify you by email). Please note that we do not have lectures every day!
Note: lecture timetable interpretation
- On Thursday, in room L26.12, lectures start at 16:15 and end at 18:15
- On Friday, in room 6.0.1, lectures start at 14:15 and end at 17:15
Date | Day | Time | Room | Teacher | Type | Topic |
19/09/2019 | Thursday | 16:15 - 17:15 | L26.12 | Giacomo Boracchi | Lecture | Course Introduction |
20/09/2019 | Friday | 14:15 - 17:15 | 6.0.1 | -- | -- | No lecture today |
26/09/2019 | Thursday | 16:15 - 18:15 | L26.12 | Matteo Matteucci | Lecture | Introduction to Machine Learning |
27/09/2019 | Friday | 14:15 - 17:15 | 6.0.1 | Matteo Matteucci | Lecture | Perceptron and Hebbian Learning |
Course Evaluation
Course evaluation is composed of two parts:
- A written examination covering the whole program, graded up to 27/32
- A home project in the form of a Kaggle-style competition practicing the topics of the course, graded up to 5/32
The final score is the sum of the written exam grade and the home project grade, for a maximum of 32/32.
Teaching Material (the textbook)
Lectures will be based on material from different sources; teachers will provide their slides to students as soon as they are available. As a general reference, you can check the following text, but keep in mind that teachers will not follow it strictly:
- Deep Learning. Ian Goodfellow, Yoshua Bengio, and Aaron Courville. MIT Press, 2016. http://www.deeplearningbook.org/
Course Slides
Slides from the lectures by Matteo Matteucci
- [2019/2020] Course Introduction: introductory slides with useful information about the syllabus, grading, and course logistics.
Slides from the lectures by Giacomo Boracchi
- ... coming soon ...
Additional material from the teachers
- ... coming soon ...