The following are last minute news you should be aware of ;-)

* 21/02/2018: Added slides on Word Embedding, Attention Mechanism, and Deep Convolutional Architectures
* 16/02/2018: Added slides about Neural Networks Training and Recurrent Neural Networks
* 12/02/2018: Added slides about Course Intro, Deep Learning Intro, and Feedforward Neural Networks
* 12/02/2018: The course starts today!
* 09/02/2018: First version of this website ...

==Course Aim & Organization==
  
Deep Learning is becoming the dominant approach for building cognitive systems able to recognize patterns in data (e.g., images, text, and sounds) or to perform end-to-end learning of complex behaviors. This doctoral course introduces the basics of deep learning as well as its applications, with a hands-on approach where students are challenged with the practical issues of data collection, model design, model training, and performance evaluation.
  
Starting from the foundations of Neural Networks and Deep Learning, the course introduces the most successful models, algorithms, and tools for image understanding, sequence prediction, and sequence-to-sequence translation through deep neural models.
In the first part of the course we provide attendees with the theory and the computational tools to approach deep learning hands-on. We cover the theory of neural networks, including feed-forward and recurrent models, with a specific focus on Feed Forward Neural Networks (FFNNs), Convolutional Neural Networks (CNNs), and Long Short-Term Memory networks (LSTMs). These models are presented both from a theoretical and a practical point of view, with simple self-contained examples of applications. In this context two deep learning frameworks, TensorFlow and PyTorch, are introduced in practical sessions, with a special emphasis on the TensorFlow library; a minimal sketch of such a self-contained example is shown below.
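To give a flavor of what these examples look like, here is a minimal sketch (our illustration, not official course material) of a feed-forward classifier trained by backpropagation, assuming a working TensorFlow installation with the tf.keras API available; the dataset, layer sizes, and optimizer are arbitrary choices for the example.

<source lang="python">
# Minimal sketch (not official course material): a small feed-forward
# classifier on MNIST using TensorFlow's Keras API.
import tensorflow as tf

# Load MNIST digits (28x28 grayscale images, 10 classes) and normalize.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A feed-forward network: flatten, one hidden layer, softmax output.
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Training by backpropagation: gradient-based optimization (Adam)
# of the cross-entropy loss, with accuracy reported during training.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
print(model.evaluate(x_test, y_test))
</source>

The same few lines translate almost directly to PyTorch, which will also be covered in the practical sessions.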
  
In the second part we focus on different application domains, presenting a selection of state-of-the-art results obtained by applying deep learning techniques to domains such as (but not limited to) pattern recognition, speech recognition and language modeling, and geometric reasoning.
For the final evaluation, groups of 3 to 5 students (depending on the number of attendees) will be required to implement, using the TensorFlow library, one of the models from the papers analyzed during the course.
  
 
===Course Program===

To avoid overlap between the "Deep Learning, Theory Techniques and Applications" (DL) PhD course and the "Image Classification: modern approaches" (IC) PhD course, and to leverage the specific competencies of the teachers, topics presented in DL classes will not be covered by the IC classes. Similarly, topics in IC won't be covered in DL. Please refer to the detailed program and the course logistics to see which classes from the Image Classification course are mandatory.
  
The topics covered by the course are:

* Introduction to neural networks and feed-forward architectures
* Backpropagation, training, and overfitting in neural networks (see the from-scratch sketch after this list)
* Recurrent Neural Networks and other classical architectures, e.g., Radial Basis Functions, Neural Autoencoders, etc.
* Introduction and basics of image handling
* Basic approaches to image classification
* Data-driven features: Convolutional Neural Networks
* TensorFlow and PyTorch introduction with examples
* Structural learning, Long Short-Term Memories, and applications to text and speech
* Extended problems in image classification
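To make the backpropagation bullet concrete, here is a from-scratch sketch (ours, not course material) of backpropagation and gradient descent for a one-hidden-layer network on a toy regression task; the network size, learning rate, and target function are arbitrary illustrations.

<source lang="python">
# From-scratch backpropagation for a 1-16-1 tanh network, NumPy only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))      # toy inputs
y = np.sin(3 * X)                          # toy regression targets

# Parameters of a one-hidden-layer network with 16 tanh units.
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)               # hidden activations
    y_hat = h @ W2 + b2                    # linear output
    loss = np.mean((y_hat - y) ** 2)       # mean squared error

    # Backward pass: chain rule from the loss back to each parameter.
    g_out = 2 * (y_hat - y) / len(X)       # dLoss/dy_hat
    gW2 = h.T @ g_out; gb2 = g_out.sum(0)
    g_h = g_out @ W2.T * (1 - h ** 2)      # through the tanh nonlinearity
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)

    # Gradient descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", loss)
</source>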
===Teachers===

The course combines theoretical lectures, practical exercises, and seminars:

* [http://www.deib.polimi.it/ita/personale/dettagli/267262 Matteo Matteucci]: the official teacher for the Deep Learning course
* [http://www.deib.polimi.it/eng/people/details/1001112 Marco Ciccone]: co-teacher for the Deep Learning course
* [http://home.deib.polimi.it/boracchi/ Giacomo Boracchi]: co-teacher from the Image Classification course (shared course code)
* [http://www.leet.it/home/giusti/website/doku.php Alessandro Giusti]: co-teacher from the Image Classification course (shared course code)
* [http://www.luigimalago.it/ Luigi Malagò]: Guest Speaker from the Machine Learning and Optimization Group at the [https://rist.ro/en.html Romanian Institute of Science and Technology (RIST)]
* [http://www.people.usi.ch/mascij/ Jonathan Masci]: Guest Speaker from [https://nnaisense.com/ NNAISENSE]
* [https://www.linkedin.com/in/francescovisin/ Francesco Visin]: Guest Speaker from [https://deepmind.com/ DeepMind]
* ...

===Websites===

Please refer to the following websites for specific course materials and detailed calendars:

* DL: http://chrome.ws.dei.polimi.it/index.php/Deep_Learning_Course
* IC: http://home.deib.polimi.it/boracchi/teaching/ImageClassification.htm
  
 
===Detailed course schedule===
 
A detailed schedule of the course can be found here; topics are indicative, while days and teachers are correct up to last minute changes. Please note that on some days you have a lecture both in the morning and in the afternoon.
  
Please remember that the Deep Learning PhD course has some shared lectures with the Image Classification PhD course; students are required to attend the shared lectures listed below, since those topics will not be covered by the Deep Learning classes although they are part of the course program. You will also find the [OPTIONAL] lectures from the Image Classification course, which you might want to attend either because you are also enrolled in that course or out of personal interest.
 
 
Note: lecture timetable interpretation

* Morning lectures start at 9:30 sharp and end at 13:00
* Afternoon lectures start at 14:15 sharp and end at 17:45
 
 
{| border="1" align="center" style="text-align:center;"
|-
|Date || Day || Time || Room || Teacher || Topic
|-
|12/02/2018 || Monday || 09:30 - 13:00 || S.0.2 - Building 3 || Matteo Matteucci || Course introduction, Machine Learning vs. Deep Learning introduction, the perceptron, the feed-forward neural network architecture, backpropagation and gradient descent, error measures for regression and classification
|-
|12/02/2018 || Monday || 14:15 - 17:45 || S.0.2 - Building 3 || Giacomo Boracchi || Introduction and basics of image handling in Python
|-
|14/02/2018 || Wednesday || 09:30 - 13:00 || S.0.2 - Building 3 || Matteo Matteucci || Overfitting, Early Stopping, Weight Decay, Regularization
|-
|14/02/2018 [OPTIONAL] || Wednesday || 14:15 - 17:45 || S.0.2 - Building 3 || Giacomo Boracchi || Hand-crafted features for image classification
|-
|16/02/2018 || Friday || 09:30 - 13:00 || S.0.5 - Building 3 || Matteo Matteucci || Recurrent Neural Networks, Backpropagation Through Time, Vanishing Gradient, Long Short-Term Memories, Gated Recurrent Units
|-
|16/02/2018 [OPTIONAL] || Friday || 14:15 - 17:45 || S.0.5 - Building 3 || Giacomo Boracchi || Computer Vision features for image classification
|-
|19/02/2018 || Monday || 09:30 - 13:00 || S.0.5 - Building 3 || Marco Ciccone || TensorFlow and PyTorch Tutorial
|-
|19/02/2018 || Monday || 14:15 - 17:45 || S.0.5 - Building 3 || Alessandro Giusti || Data-driven features: Convolutional Neural Networks
|-
|21/02/2018 || Wednesday || 09:30 - 10:30 || S.0.2 - Building 3 || Marco Ciccone || Common deep architectures for image classification: LeNet, AlexNet, GoogLeNet, ResNet, ...
|-
|21/02/2018 || Wednesday || 10:45 - 11:45 || S.0.2 - Building 3 || Matteo Matteucci || Word Embedding
|-
|21/02/2018 || Wednesday || 12:00 - 13:00 || S.0.2 - Building 3 || Alberto Mario Pirovano || Attention Mechanisms
|-
|21/02/2018 || Wednesday || 14:15 - 17:45 || S.0.2 - Building 3 || Alessandro Giusti || Advanced CNNs and best practices in image classification
|-
|23/02/2018 || Friday || 09:30 - 10:30 || C.I.1 - Building 6 || Luigi Malagò || Variational Autoencoders
|-
|23/02/2018 || Friday || 10:45 - 11:45 || C.I.1 - Building 6 || Jonathan Masci || Deep Learning on graphs and manifolds: going beyond Euclidean data
|-
|23/02/2018 || Friday || 12:00 - 13:00 || C.I.1 - Building 6 || Francesco Visin || Graph Networks
|-
|23/02/2018 [OPTIONAL] || Friday || 14:10 - 17:45 || S.0.5 - Building 3 || Alessandro Giusti || An overview of extended problems in image classification
|}
 
  
 
===Course Evaluation===
 
Coming soon ...
  
==Course Logistics==
  
 
We would like to thank all the students for their enthusiastic participation. It has made course logistics quite challenging, but we have followed the motto “no student left behind” and we have done our best to accommodate all of you.
  
 
===Classrooms===
  
 
Both DL and IC will be held in the following rooms, so take note and spread the word in case you know people who are going to attend and might not have received this email.
  
* February 12th, Aula S.0.2, Ed. 3, 260 seats
* February 14th, Aula S.0.2, Ed. 3, 260 seats
* February 16th, Aula S.0.5, Ed. 3, 174 seats
* February 19th, Aula S.0.5, Ed. 3, 174 seats
* February 21st, Aula S.0.2, Ed. 3, 260 seats
* February 23rd, Aula N.1.2, Ed. 2, 168 seats
  
 
These classrooms should be able to fit all of you; if not, we will look for alternative rooms and notify you.
  
 
===Course hours===
  
 
Starting hours are sharp, i.e., they already include the “academic quarter”:
  
* DL will be in the morning: from 9:30 to 13:00
* IC will be in the afternoon: from 14:15 to 17:45
===Attendance===

These courses are open to Master students, PhD students, and a few external participants. Everybody interested in getting credits (CFU) or attendance certificates will be asked to sign an attendance list at each class (a class is a block of hours in the morning or in the afternoon).
  
In particular, DL students have to attend IC on February 12th, February 19th, and February 21st. This makes the DL schedule:

* February 12th Morning
* February 12th Afternoon
* February 14th Morning
* February 16th Morning
* February 19th Morning
* February 19th Afternoon
* February 21st Morning
* February 21st Afternoon
* February 23rd Morning
  
 
Even if you are attending just one of the courses (e.g., because as a Master student you can get credits for only one of them), we warmly suggest attending all the lectures from both courses.
  
 
===Programming Environment===
  
 
The reference programming language for both courses is Python. In both DL and IC there will be sessions where you’ll be asked to implement a few lines of code yourselves, and we will possibly give some simple assignments. For the first lecture, bring your laptop and make sure to have Python 3.6 installed; install the Miniconda or Anaconda framework from conda.io. More information will be provided later regarding additional tools and facilities; a quick sanity check for your setup is sketched below.
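Before the first lecture you can verify your setup with a few lines of Python (this snippet is our suggestion, not an official course requirement; it only checks the interpreter version and which frameworks import).

<source lang="python">
# Hypothetical environment sanity check: interpreter version and imports.
import sys

# The course asks for Python 3.6; fail early if the interpreter is older.
assert sys.version_info >= (3, 6), "Python 3.6+ expected"

# Report which relevant packages are importable, without crashing.
for name in ("numpy", "tensorflow", "torch"):
    try:
        module = __import__(name)
        print(name, getattr(module, "__version__", "?"), "OK")
    except ImportError:
        print(name, "not installed")
</source>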
  
==Teaching Material==

Lectures will be mostly based on presentations from the teachers and invited speakers. These slides are taken from several sources: tutorials, summer schools, papers, and so on. In case you are interested in a reference book, you can read:

* [http://www.deeplearningbook.org/ Deep Learning]. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, MIT Press, 2016.
===Teachers Slides===

In the following you can find the lecture slides used by the course teachers during classes.

====Deep Learning Classes====
  
These are the slides presented during Matteo Matteucci's lectures:

* [[Media:DL-00-IntroCourse.pdf | Course introduction]]: introduction to the course with details about the logistics, the grading, the topics, the teachers, and so on ...
* [[Media:DL-01-IntroDeepLearning.pdf | Deep Learning introduction]]: introduction to Deep Learning and learning representations from data.
* [[Media:DL-02-Perceptron2NeuralNetworks.pdf | From Perceptrons to Neural Networks]]: the Perceptron and its learning algorithm, Feed Forward Neural Networks, and Backpropagation.
* [[Media:DL-03-NeuralNetworksTraining.pdf | Neural Networks Training]]: dealing with overfitting and optimization in Feed Forward Neural Networks.
* [[Media:DL-04-RecurrentNeuralNetworks.pdf | Recurrent Neural Networks]]: vanishing and exploding gradients, Long Short-Term Memory cells.
* [[Media:DL-05-WordEmbedding.pdf | Word Embedding]]: Deep Unsupervised Learning, embeddings, language models, and word2vec.
  
Slides for the tutorials by Marco Ciccone are published here:

* [[Media:DL-E1-TensorFlow101.pdf | TensorFlow 101]]: tutorial on TensorFlow by Marco Ciccone
* [[Media:DL-E2-PyTorch101.pdf | PyTorch 101]]: tutorial on PyTorch by Marco Ciccone
  
====Image Classification Classes====

These are the slides presented during Giacomo Boracchi's and Alessandro Giusti's lectures:

* ...
* ...
* ...
  
====Seminars====

These are the presentations given by course special guests:

* [[Media:DL-G1-AdvancedCNNArchitectures.pdf | Advanced CNN Architectures]]: seminar on the importance of depth in convolutional neural networks, by Marco Ciccone
* [[Media:DL-G2-AttentionMechanisms.pdf | Attention Mechanisms]]: seminar on attention mechanisms in sequence-to-sequence learning, by Alberto Mario Pirovano
* [https://drive.google.com/file/d/1pOCBc_PxMKsuEiDT7D82vQ4bX2sTg6T8/view?usp=sharing Deep Learning on Graphs and Manifolds]: seminar on deep learning beyond Euclidean data, by Jonathan Masci
* [[Media:DL-G4-VariationAutoencoders.pdf | Variational Autoencoders]]: seminar on variational autoencoder principles and perspectives, by Luigi Malagò
* [[Media:DL-G5-GraphNetworks.pdf | Graph Networks]]: seminar on deep learning and graphs, i.e., graph networks, by Francesco Visin
  
 
===Additional Resources===

Papers and links useful to integrate the slides from lecturers and guests:

* [http://binds.cs.umass.edu/papers/1995_Siegelmann_Science.pdf Computation Beyond the Turing Limit], Hava T. Siegelmann, Science, 268:545-548, 1995.
  
==F.A.Q.==

===About attendance===
  
'''''I heard you take signatures and attendance is mandatory, but I have an exam and I will miss one or two lessons. Do I still get the credits?'''''
  
* PhD students are required to attend 70% of the lectures, i.e., at least (9d x 4h x 0.7) ~= 25h. This means that with 4h blocks you have to attend at least 6 of the 9 blocks foreseen. For Master students we are a little more flexible in case they have exams ... I suggest they do not push me to write a number here, since then it stays ;-)
  
 