Methodologies for Intelligent Systems

The following are last-minute news items you should be aware of ;-)

07/08/2010: Summer grades are out, details are [[Media:Grades_Summer_2010.pdf|here]]!

23/07/2010: the exam of 26/07/2010 will be in room 1.4 starting at 9:00 am

29/06/2010: homework EXTENSION, new deadline 10/07/2010

16/06/2010: the homework is out! Turn it in by 05/07/2010!!!

09/03/2010: the course starts today!
  
 
==Course Aim & Organization==

The objective of this course is to give an advanced presentation of the techniques most commonly used in artificial intelligence and machine learning for pattern recognition, knowledge discovery, and data analysis/modeling.

===Teachers===

The course consists of a blend of lectures and exercises given by the course teacher and some teaching assistants.

* [http://www.dei.polimi.it/people/matteucci Matteo Matteucci]: the course teacher
* [http://www.dei.polimi.it/people/eynard Davide Eynard]: the teaching assistant on clustering
* [http://www.dei.polimi.it/people/tognetti Simone Tognetti]: the teaching assistant on feature selection/projection
  
 
===Course Program and Schedule===

These techniques are presented from both a theoretical perspective (i.e., statistics and information theory) and a practical one, through the description of algorithms, their implementation, and their applications. The course consists of a set of self-contained lectures on specific techniques such as decision trees, decision rules, Bayesian networks, clustering, etc. Supervised and unsupervised learning are discussed in the framework of classification and clustering problems. The course outline is:

* Machine Learning and Pattern Classification: in this part of the course the general concepts of Machine Learning and Pattern Recognition are introduced, with a brief review of statistics and information theory;
* Unsupervised Learning Techniques: the most common approaches to unsupervised learning are described, mostly focusing on clustering techniques, rule induction, Bayesian networks, and density estimators using mixture models;
* Supervised Learning Techniques: in this part of the course the most common techniques for Supervised Learning are described: decision trees, decision rules, Bayesian classifiers, hidden Markov models, lazy learners, etc.;
* Feature Selection and Reduction: techniques for data reduction and feature selection will be presented with theory and applications;
* Model Validation and Selection: model validation and selection are orthogonal issues to the previous techniques; during the course the fundamentals are described and discussed (e.g., AIC, BIC, cross-validation, etc.).

A detailed schedule of the course can be found here; topics are just indicative, while days and teachers are correct "up to some last minute change".

===Course Evaluation===

The course evaluation is composed of two parts:

* A homework with exercises covering the whole program that counts for 30% of the course grade
* An oral examination covering the whole program that counts for 70% of the course grade

There is just one homework per year; it will be published at the end of the course and you will have 15 days to turn it in. It is not mandatory; however, if you do not turn it in you lose 30% of the course grade. There is the option of substituting the homework with a practical project, but this has to be discussed and agreed upon with the course professor.
  
 
==Teaching Material==

In the following you can find the lecture slides used by the teacher and the teaching assistants during classes. Some additional material that could be used to prepare for the oral examination is provided as well, together with the homework.
 
===Machine Learning and Pattern Recognition===
  
* [[Media:Mis-handout-lecture-1.pdf | Lecture 1]]: Introduction to Machine Learning
* [[Media:Mis-handout-lecture-2.pdf | Lecture 2]]: Probability for Dataminers
* [[Media:Mis-handout-lecture-3.pdf | Lecture 3]]: Decision Trees
* [[Media:Mis-handout-lecture-4.pdf | Lecture 4]]: Decision Rules
* [[Media:Mis-handout-lecture-5.pdf | Lecture 5]]: Bayesian Classifiers (see the sketch after this list)
* [[Media:Mis-handout-lecture-6.pdf | Lecture 6]]: Bayesian Networks
* [[Media:Mis-handout-lecture-7.pdf | Lecture 7]]: Markov Chains and Hidden Markov Models
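
The Bayesian classifiers of Lecture 5 lend themselves to a compact worked example. Below is a minimal naive Bayes sketch in Matlab (the same language as the course's example files); the toy dataset, its integer coding, and the query record are invented here for illustration and are not course material:

<pre>
% Naive Bayes with categorical attributes and Laplace smoothing.
% Made-up balloon-style toy set; columns: Size (1=Small, 2=Large),
% Act (1=Stretch, 2=Dip), Age (1=Adult, 2=Child); y is the class label.
X = [1 1 1; 1 1 2; 1 2 1; 2 1 1; 2 1 2; 2 2 2];
y = [1; 0; 0; 1; 0; 0];
query = [1 1 1];                          % a record like {Small, Stretch, Adult}

classes = unique(y);
post = zeros(numel(classes), 1);
for c = 1:numel(classes)
    Xc = X(y == classes(c), :);           % training records of class c
    post(c) = size(Xc, 1) / size(X, 1);   % class prior P(c)
    for j = 1:size(X, 2)
        % Laplace-smoothed P(x_j = query(j) | c); 2 values per attribute
        post(c) = post(c) * (sum(Xc(:, j) == query(j)) + 1) / (size(Xc, 1) + 2);
    end
end
post = post / sum(post);                  % normalize into posteriors
[pmax, imax] = max(post);
fprintf('Predicted class %d with posterior %.2f\n', classes(imax), pmax);
</pre>

The per-attribute factorization in the inner loop is exactly the naive conditional-independence assumption; with strongly correlated attributes the estimated posterior can be overconfident.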
  
 
===Clustering===
  
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e1.pdf Clustering Lecture 1]: Introduction
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e2.pdf Clustering Lecture 2]: K-Means and Hierarchical (see the sketch after this list)
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e3.pdf Clustering Lecture 3]: Fuzzy, Gaussians, and SOM
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e4.pdf Clustering Lecture 4]: Vector Space Model and PDDP
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e5.pdf Clustering Lecture 5]: DBSCAN and Jarvis-Patrick
* [http://davide.eynard.it/teaching/2010_msi/handout-lecture-e6.pdf Clustering Lecture 6]: Evaluation measures
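
As a companion to Clustering Lecture 2, here is a minimal k-means sketch in Matlab. The two-blob toy data, the fixed K=2, and the convergence threshold are assumptions made for illustration:

<pre>
% Plain k-means on made-up 2-D data: two Gaussian blobs, K fixed to 2.
X = [randn(50, 2); randn(50, 2) + 4];
K = 2;
idx = randperm(size(X, 1));
C = X(idx(1:K), :);                        % random initial centroids
for iter = 1:100
    % Assignment step: squared distance of every point to every centroid
    D = zeros(size(X, 1), K);
    for k = 1:K
        d = X - repmat(C(k, :), size(X, 1), 1);
        D(:, k) = sum(d .^ 2, 2);
    end
    [dmin, labels] = min(D, [], 2);
    % Update step: move each centroid to the mean of its assigned points
    Cold = C;
    for k = 1:K
        if any(labels == k)
            C(k, :) = mean(X(labels == k, :), 1);
        end
    end
    if max(abs(C(:) - Cold(:))) < 1e-8     % stop when centroids settle
        break
    end
end
fprintf('k-means converged in %d iterations\n', iter);
</pre>

The result depends on the random initialization; in practice the algorithm is restarted several times and the solution with the lowest within-cluster sum of squares is kept.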
  
 
===Dimensionality Reduction and Feature Selection===
  
* [[Media:01-Introduction.pdf | Feature Lecture 1]]: Dimensionality reduction intro and feature extraction
** [[Media:Lezione01.txt | Matlab example for the first lecture]] (rename it as lezione.m)
* [[Media:02-FeatureProjection.pdf | Feature Lecture 2]]: Feature projection, PCA and LDA (see the PCA sketch after this list)
** [[Media:LDA.txt | Matlab LDA example]] (rename it as LDA.m)
* [[Media:03-FeatureSelection.pdf | Feature Lecture 3]]: Feature selection methods
* [[Media:04-GeneticAlgorithms.pdf | Genetic Algorithms]]: Lecture about genetic algorithms
* [[Media:05-Crossvalidation.pdf | Cross Validation]]: Lecture about cross-validation and model evaluation techniques
** [[Media:Lezione04a.txt | Matlab example for the fourth lecture]] (rename it as lezione04a.m)
** [[Media:Lezione04.txt | Matlab example for the fourth lecture]] (rename it as lezione04.m)
** [[Media:Lez_weka.txt | Matlab example for the fourth lecture]] (rename it as lez_weka.m)
** [[Media:Matlab2weka.txt | Matlab example for the fourth lecture]] (rename it as matlab2weka.m)
** [[Media:Plot_distribution2.txt | Matlab example for the fourth lecture]] (rename it as plot_distribution2.m)
** [[Media:Test_weka.txt | Matlab example for the fourth lecture]] (rename it as test_weka.m)
** [[Media:Train_weka_classif_affective.txt | Matlab example for the fourth lecture]] (rename it as train_weka_classif_affective.m)
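
To complement Feature Lecture 2, here is a minimal PCA sketch in Matlab via the eigendecomposition of the sample covariance matrix. The synthetic data, the induced correlation, and the choice to retain two components are illustrative assumptions, not lecture material:

<pre>
% PCA: project made-up 3-D data onto its top-2 principal components.
X = randn(200, 3);
X(:, 3) = 0.8 * X(:, 1) + 0.2 * randn(200, 1);   % make dimension 3 redundant
Xc = X - repmat(mean(X, 1), size(X, 1), 1);      % center the data
[V, D] = eig(cov(Xc));                           % eigenvectors of covariance
[evals, order] = sort(diag(D), 'descend');       % sort by explained variance
W = V(:, order(1:2));                            % top-2 principal directions
Z = Xc * W;                                      % projected data (scores)
fprintf('Variance retained: %.1f%%\n', 100 * sum(evals(1:2)) / sum(evals));
</pre>

In practice the number of retained components is chosen from the explained-variance profile rather than fixed in advance.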
  
 
===Homeworks===
  
 
The homework, although not mandatory, counts for 30% of the course grade (i.e., if you do not turn it in you lose 30% of the final grade). You have 15 days to turn it in to the teacher. '''This year the homework is due by the 5th of July!''' Please turn in a digital (or digitized) copy of your homework.
  
* [[Media:Homework_2009-2010.pdf | Homework for the academic year 2009/2010]]
  
 
Past years' course homeworks; you can use them to practice and prepare for this year's homework ;-)

* [http://home.dei.polimi.it/matteucc/lectures/MIS/Homework_2008-2009.pdf Homework for the academic year 2008/2009]
* [http://home.dei.polimi.it/matteucc/lectures/MIS/Homework_2007-2008_1.pdf Homework for the academic year 2007/2008 Part 1] and [http://home.dei.polimi.it/matteucc/lectures/MIS/Homework_2007-2008_2.pdf Part 2]
* [http://home.dei.polimi.it/matteucc/lectures/MIS/Homework_2006-2007.pdf Homework for the academic year 2006/2007]
===Additional Lecture Notes and Bibliography===

* [http://people.cs.ubc.ca/~murphyk/Bayes/Charniak_91.pdf Bayesian Networks without tears]: a useful introduction to Bayesian Networks.
* [http://home.dei.polimi.it/matteucc/lectures/MIS/FundamentalIssuesHMM.pdf Fundamental Problems for HMM]: a document introducing Hidden Markov Models and the three fundamental questions about them (see the forward-algorithm sketch after this list).
* [http://home.dei.polimi.it/matteucc/lectures/MIS/BayesianSolution.pdf An exercise on modeling and reasoning with Bayesian Networks].
* Pearl's message passing algorithm deserves some extra thought:
** [http://en.wikipedia.org/wiki/Belief_propagation The Wikipedia article and related references]
** [http://www.cs.pitt.edu/~tomas/cs3750/pearl.ppt Powerpoint slides by Tomas Singliar]
** [http://whatdafact.com/data_kittipat/Note_on_Pearls_message_passing.pdf Some handwritten notes by Kittipat Kampa]
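
The "Fundamental Problems for HMM" document above discusses, among its three questions, the evaluation problem: computing the probability of an observation sequence given the model. Here is a minimal Matlab sketch of the forward algorithm that solves it; the two-state model, its probability tables, and the observation coding are made up for illustration:

<pre>
% Forward algorithm for the HMM evaluation problem: compute P(O | model).
A   = [0.7 0.3; 0.4 0.6];       % A(i,j) = P(state j at t+1 | state i at t)
B   = [0.9 0.1; 0.2 0.8];       % B(i,k) = P(observation k | state i)
pi0 = [0.6; 0.4];               % initial state distribution
O   = [1 1 2 1];                % an example observation sequence

T = numel(O);
alpha = zeros(2, T);
alpha(:, 1) = pi0 .* B(:, O(1));                        % initialization
for t = 2:T
    alpha(:, t) = (A' * alpha(:, t - 1)) .* B(:, O(t)); % induction step
end
fprintf('P(O | model) = %g\n', sum(alpha(:, T)));
</pre>

Summing the last column of alpha gives P(O | model); on long sequences the recursion is usually rescaled at each step to avoid numerical underflow.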
