Methodologies for Intelligent Systems
The following are last-minute news items you should be aware of ;-)

01/08/2009: Grades for the homework (and exams) are out! You can find them [http://home.dei.polimi.it/matteucc/lectures/MIS/ here]!
07/07/2009: The 20/06/2009 exam will be from 9:30 to 13:30 in room A3.8
26/06/2009: The 29/06/2009 exam will be from 9:30 to 13:30 in room A1.1; the 03/07/2009 exam will be from 12:30 to 16:30 in room A1.4.
18/06/2009: The last point of the first exercise should be: "Classify the record: {Small,Stretch,Adult} with the previous classifiers"
16/06/2009: I've just published the homework for this year! Check it out further down the page ...
Course Aim & Organization
The objective of this course is to give an advanced presentation of the techniques most used in artificial intelligence and machine learning for pattern recognition, knowledge discovery, and data analysis/modeling.
Teachers
The course is composed of a blend of lectures and exercises given by the course teacher and some teaching assistants.
- Matteo Matteucci: the course teacher
- Davide Eynard: the teaching assistant on clustering
- Rossella Blatt: the teaching assistant on feature selection/projection
Course Program and Schedule
These techniques are presented from a theoretical (i.e., statistics and information theory) and practical perspective through the description of algorithms, their implementation, and applications. The course is composed of a set of self-contained lectures on specific techniques such as decision trees, decision rules, Bayesian networks, clustering, etc. Supervised and unsupervised learning are discussed in the framework of classification and clustering problems. The course outline is:
- Machine Learning and Pattern Classification: in this part of the course the general concepts of Machine Learning and Pattern Recognition are introduced with a brief review of statistics and information theory;
- Unsupervised Learning Techniques: the most common approaches to unsupervised learning are described, mostly focusing on clustering techniques, rule induction, Bayesian networks, and density estimators using mixture models;
- Supervised Learning Techniques: in this part of the course the most common techniques for supervised learning are described: decision trees, decision rules, Bayesian classifiers, hidden Markov models, lazy learners, etc.;
- Feature Selection and Reduction: techniques for data reduction and feature selection will be presented with theory and applications;
- Model Validation and Selection: model validation and selection are orthogonal to the previous techniques; during the course the fundamentals are described and discussed (e.g., AIC, BIC, cross-validation, etc.), as in the sketch after this list.
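As a companion to the last point, here is a minimal sketch of k-fold cross-validation in Python; the toy dataset and the majority-class "classifier" are illustrative assumptions, not course material.

<pre>
# Minimal k-fold cross-validation: split the data into k folds, train on
# k-1 of them, measure accuracy on the held-out fold, and average.
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle 0..n-1 and deal the indices into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(X, y, train_fn, predict_fn, k=5):
    """Mean accuracy of the train/predict pair over k held-out folds."""
    accuracies = []
    for fold in k_fold_indices(len(X), k):
        test = set(fold)
        X_tr = [x for i, x in enumerate(X) if i not in test]
        y_tr = [t for i, t in enumerate(y) if i not in test]
        model = train_fn(X_tr, y_tr)
        hits = sum(predict_fn(model, X[i]) == y[i] for i in fold)
        accuracies.append(hits / len(fold))
    return sum(accuracies) / k

# Toy majority-class "classifier", just to make the sketch runnable.
train = lambda X, y: max(set(y), key=y.count)
predict = lambda model, x: model

X = [[i] for i in range(20)]
y = [0] * 12 + [1] * 8
print(cross_validate(X, y, train, predict, k=5))   # 0.6: 12 of 20 labels are 0
</pre>

The same skeleton works with any train/predict pair; AIC and BIC instead trade goodness of fit against model complexity analytically.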
A detailed schedule of the course can be found here; topics are just indicative, while days and teachers are correct "up to some last minute change".
Course Evaluation
The course evaluation is composed of two parts:
- A homework with exercises covering the whole program that counts for 30% of the course grade
- An oral examination covering the whole program that counts for 70% of the course grade
There is just one homework per year; it will be published at the end of the course and you will have 15 days to turn it in. It is not mandatory; however, if you do not turn it in you lose 30% of the course grade (the weighting is sketched below).
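For concreteness, the weighting above amounts to the following sketch (the rounding policy is not specified on this page, so none is applied):

<pre>
# Weighted course grade: 30% homework + 70% oral, on a common scale.
def final_grade(homework, oral):
    return 0.30 * homework + 0.70 * oral

print(final_grade(28, 30))   # 29.4
print(final_grade(0, 30))    # 21.0: skipping the homework forfeits its 30%
</pre>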
Teaching Material
In the following you can find the lecture slides used by the teacher and the teaching assistants during classes. Some additional material that can be used to prepare for the oral examination is provided as well, together with the homework.
Machine Learning and Pattern Recognition
- Lecture 1: Introduction to Machine Learning
- Lecture 2: Probability for Dataminers
- Lecture 3: Decision Trees
- Lecture 4: Decision Rules
- Lecture 5: Bayesian Classifiers
- Lecture 6: Bayesian Networks
- Lecture 7: Markov Chains and Hidden Markov Models
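To give the flavour of Lecture 5, here is a minimal categorical naive Bayes classifier with add-one (Laplace) smoothing; the toy weather data is an assumption for illustration, not taken from the slides.

<pre>
# Categorical naive Bayes: pick the class c maximizing
# P(c) * prod_j P(x_j | c), with add-one smoothing on the counts.
from collections import Counter, defaultdict

def train_nb(X, y):
    """Collect class counts and per-(feature, class) value counts."""
    classes = Counter(y)
    cond = defaultdict(Counter)   # (feature index, class) -> value counts
    for x, c in zip(X, y):
        for j, v in enumerate(x):
            cond[(j, c)][v] += 1
    values = [set(row[j] for row in X) for j in range(len(X[0]))]
    return classes, cond, values, len(y)

def predict_nb(model, x):
    classes, cond, values, n = model
    best, best_p = None, -1.0
    for c, nc in classes.items():
        p = nc / n   # prior P(c)
        for j, v in enumerate(x):
            p *= (cond[(j, c)][v] + 1) / (nc + len(values[j]))
        if p > best_p:
            best, best_p = c, p
    return best

X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]]
y = ["no", "no", "yes", "no"]
model = train_nb(X, y)
# -> "no": the prior (3/4) outweighs the smoothed likelihoods here.
print(predict_nb(model, ["rain", "mild"]))
</pre>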
Clustering
- Clustering Lecture 1: Introduction
- Clustering Lecture 2: K-Means and Hierarchical
- Clustering Lecture 3: Fuzzy, Gaussians, and SOM
- Clustering Lecture 4: Vector Space Model and PDDP
- Clustering Lecture 5: DBSCAN and Jarvis-Patrick
- Clustering Lecture 6: Evaluation measures
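As a taste of Clustering Lecture 2, the following is a bare-bones K-Means (Lloyd's algorithm) sketch; random-sample initialization and the synthetic two-blob data are illustrative choices, not the lecture's.

<pre>
# K-Means (Lloyd's algorithm): alternate nearest-centroid assignment
# and centroid recomputation until the centroids stop moving.
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Squared Euclidean distance of every point to every centroid.
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(30, 2)), rng.normal(size=(30, 2)) + 5.0])
centroids, labels = kmeans(X, k=2)
print(centroids)   # two centers, near (0, 0) and (5, 5)
</pre>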
Dimensionality Reduction and Feature Selection
- Dimensionality Reduction Lecture 1: Dimensionality reduction Intro, Feature extraction, PCA and LDA
- Dimensionality Reduction Lecture 2: Feature selection
- Genetic Algorithms: a rather comprehensive tutorial
- Algorithm Evaluation: from cross-validation to confidence intervals
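To illustrate the feature-extraction side of the first lecture above, here is a compact PCA sketch based on the eigendecomposition of the sample covariance matrix; the random correlated data is an assumption for demonstration.

<pre>
# PCA: center the data, take the covariance eigenvectors with the
# largest eigenvalues, and project onto them.
import numpy as np

def pca(X, n_components):
    Xc = X - X.mean(axis=0)               # center the data
    cov = np.cov(Xc, rowvar=False)        # sample covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)  # eigh: covariance is symmetric
    order = np.argsort(eigval)[::-1]      # sort by decreasing variance
    W = eigvec[:, order[:n_components]]   # top-k principal directions
    return Xc @ W, W

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))  # correlated features
Z, W = pca(X, n_components=2)
print(Z.shape)   # (100, 2)
</pre>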
Homeworks
The homework, although not mandatory, counts for 30% of the course grade (i.e., if you do not turn it in you lose 30% of the final grade). You have 15 days to turn it in to the teacher. This year the homework is due by the 3rd of July!
Here are past years' course homeworks; you can use them to get some practice and to prepare this year's homework ;-)
- Homework for the academic year 2007/2008 Part 1 and Part 2
- Homework for the academic year 2006/2007
- Homework for the academic year 2005/2006
Additional Lecture Notes and Bibliography
- Fundamental Problems for HMM: a document introducing Hidden Markov Models and the three fundamental questions about them (a minimal forward-algorithm sketch follows this list).
- An exercise on modeling and reasoning with Bayesian Networks.
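As a complement to the HMM notes above, here is a minimal forward-algorithm sketch answering the first fundamental question (the probability of an observation sequence given the model); the toy parameters are assumptions for illustration.

<pre>
# Forward algorithm: alpha_t(i) = P(o_1..o_t, state_t = i | model),
# computed by the usual initialization / induction / termination steps.
import numpy as np

def forward(pi, A, B, obs):
    alpha = pi * B[:, obs[0]]          # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # induction step
    return alpha.sum()                 # termination: P(obs | model)

pi = np.array([0.6, 0.4])              # initial state distribution
A  = np.array([[0.7, 0.3],             # state transition probabilities
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],             # emission probabilities
               [0.2, 0.8]])
print(forward(pi, A, B, obs=[0, 1, 0]))
</pre>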