Deep Learning Course

Revision as of 10:56, 9 February 2018


The following are last-minute news items you should be aware of ;-)

* 12/02/2018: The course starts today!
* 09/02/2018: First version of this website ...

Course Aim & Organization

Deep Learning is nowadays becoming the dominant approach to building cognitive systems that are able to recognize patterns in data (e.g., images, text, and sounds) or to perform end-to-end learning of complex behaviors. This doctoral course will introduce the basics of deep learning as well as its applications, with a hands-on approach in which students will be challenged with the practical issues of data collection, model design, model training, and performance evaluation.

Starting from the foundations of Neural Networks and Deep Learning, the course will introduce the most successful models, algorithms, and tools for image understanding, sequence prediction, and sequence-to-sequence translation with deep neural models.

In the first part of the course we will provide attendees with the theory and the computational tools to approach deep learning through hands-on experience. In this part, we will cover the theory of neural networks, including feed-forward and recurrent models, with a specific focus on Feed-Forward Neural Networks (FFNNs), Convolutional Neural Networks (CNNs), and Long Short-Term Memory networks (LSTMs). These models will be presented both from a theoretical and a practical point of view, with simple self-contained examples of applications. In this context, two deep learning frameworks, TensorFlow and PyTorch, will be introduced in practical sessions, with special emphasis on the TensorFlow library.
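As a taste of the simple self-contained examples used in the course, here is a minimal NumPy sketch of the forward pass of a feed-forward network; the layer sizes and variable names are illustrative assumptions, not course material, and the practical sessions themselves will use TensorFlow and PyTorch:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# A tiny FFNN: 4 inputs -> 8 hidden units -> 3 output classes
W1 = 0.1 * rng.standard_normal((4, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.standard_normal((8, 3)); b2 = np.zeros(3)

def forward(x):
    h = relu(x @ W1 + b1)        # hidden layer activations
    return softmax(h @ W2 + b2)  # per-class probabilities

x = rng.standard_normal((5, 4))  # a batch of 5 examples
probs = forward(x)
print(probs.shape)               # (5, 3); each row sums to 1
```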

In the second part we will focus on different application domains, presenting a selection of state-of-the-art results obtained by applying deep learning techniques to domains such as (but not limited to) pattern recognition, speech recognition, language modeling, and geometric reasoning.

For the final evaluation, groups of 3 to 5 students (depending on the number of attendees) will be required to implement one of the models from the papers analyzed during the course, using the TensorFlow library.

Course Program

To avoid overlap between the "Deep Learning, Theory Techniques and Applications" (DL) PhD course and the "Image Classification: modern approaches" (IC) PhD course, and to leverage the specific competencies of the teachers, topics presented in the DL classes will not be covered in the IC classes, and vice versa. Please refer to the detailed program and the course logistics to see which classes of the Image Classification course are mandatory.

The topics which will be covered by the course are:

  • Introduction to neural networks and the feed-forward architectures
  • Backpropagation, training and overfitting in neural networks
  • Recurrent Neural Networks and other classical architectures, e.g., Radial Basis Functions, Neural Autoencoders, etc.
  • Introduction and basics of image handling
  • Basic approaches to image classification
  • Data-driven features: Convolutional Neural Networks
  • TensorFlow and PyTorch introduction with examples
  • Structural learning, Long Short-Term Memories, and applications to text and speech
  • Extended problems in image classification
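To give a flavor of the first two topics (feed-forward architectures and backpropagation), here is a minimal NumPy sketch that trains a tiny two-layer network on the XOR problem with hand-written gradients; the architecture and hyperparameters are illustrative assumptions, not taken from the course:

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: a classic problem a single-layer network cannot solve
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.standard_normal((2, 4)); b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for step in range(3000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))  # mean squared error

    # backward pass (chain rule, written out by hand)
    dp = 2 * (p - y) / len(X)
    dz2 = dp * p * (1 - p)            # sigmoid'
    dW2 = h.T @ dz2; db2 = dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)           # tanh'
    dW1 = X.T @ dz1; db1 = dz1.sum(axis=0)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")  # loss should decrease
```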

Teachers

The course is a blend of theoretical lectures, practical exercises, and seminars.

Detailed course schedule

A detailed schedule of the course can be found here; topics are indicative, while days and teachers are correct up to last-minute changes. Please note that on some days you have a lecture both in the morning and in the afternoon.


Course Evaluation

Coming soon ...


Course Logistics

We would like to thank all the students for their enthusiastic participation. It has made course logistics quite challenging, but we have followed the motto “no student left behind” and done our best to accommodate all of you.

Classrooms

Both DL and IC will be held in the following rooms, so take note and spread the word in case you know people who are going to attend and might not have received this email.

  • February 12th, Aula S.0.2. Ed 3
  • February 14th, Aula S.0.2. Ed 3
  • February 16th, Aula S.0.5. Ed 3
  • February 19th, Aula S.0.5. Ed 3
  • February 21st, Aula S.0.2. Ed 3
  • February 23rd, Aula N.1.2. Ed 2

These classrooms should be able to fit all of you; if not, we will look for alternative rooms and notify you.

Course hours

Starting hours are sharp, i.e., they already include the “academic quarter”:

  • DL will be in the morning: from 9.30 to 13.00,
  • IC will be in the afternoon: from 14.15 to 17.45,

Attendance

These courses are open to Master students, PhD students, and a few external participants. Everybody who is interested in getting credits (CFU) or attendance certificates will be asked to sign an attendance list at each class (a class is a block of hours in the morning or in the afternoon).

In particular, DL students have to attend IC on: February 12th, February 19th, and February 21st. This makes the DL schedule:

  • February 12th Morning
  • February 12th Afternoon
  • February 14th Morning
  • February 16th Morning
  • February 19th Morning
  • February 19th Afternoon
  • February 21st Morning
  • February 21st Afternoon
  • February 23rd Morning

Similarly, IC students have to attend DL on: February 12th, February 14th, and February 19th. This makes the IC schedule:

  • February 12th Morning
  • February 12th Afternoon
  • February 14th Morning
  • February 14th Afternoon
  • February 16th Afternoon
  • February 19th Morning
  • February 19th Afternoon
  • February 21st Afternoon
  • February 23rd Afternoon

Even if you are attending just one of the courses (e.g., because as a Master student you can get credits for only one of them), we warmly suggest attending all the lectures from both courses.

Programming Environment

The reference programming language for both courses is Python. In both DL and IC there will be sessions where you will be asked to implement a few lines of code yourself, and possibly we will give some simple assignments. For the first lecture, bring your laptop and make sure you have Python 3.6 installed; install the Miniconda or Anaconda framework from conda.io. More information about additional tools and facilities will be provided later on.
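As a quick self-check before the first lecture, something like the following can verify your Python setup; the function name and the package list here are our own illustration, not an official course requirement:

```python
import importlib.util
import sys

def check_environment(min_version=(3, 6), packages=("numpy",)):
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    if sys.version_info[:2] < min_version:
        problems.append("Python %d.%d or newer is required" % min_version)
    for name in packages:
        if importlib.util.find_spec(name) is None:
            problems.append("package '%s' is not installed" % name)
    return problems

print(check_environment() or "Environment looks fine")
```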

Websites

Please refer to the following websites for specific course materials and detailed calendars:

Teaching Material

Lectures will be mostly based on presentations by the teachers and invited speakers. These slides are taken from several sources: tutorials, summer schools, papers, and so on. In case you are interested in a reference book, you can read:

* [http://www-bcf.usc.edu/~gareth/ISL/ An Introduction to Statistical Learning with Applications in R] by Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani

Teacher Slides

Additional Resources

Coming soon ...


F.A.Q.

...