Methodologies for Intelligent Systems
 
Revision as of 09:39, 16 June 2009


Course Aim & Organization

The objective of this course is to give an advanced presentation of the techniques most used in artificial intelligence and machine learning for pattern recognition, knowledge discovery, and data analysis/modeling.

These techniques are presented from both a theoretical perspective (i.e., statistics and information theory) and a practical one, through descriptions of algorithms, their implementation, and applications. The course is composed of a set of self-contained lectures on specific techniques such as decision trees, decision rules, Bayesian networks, clustering, etc. Supervised and unsupervised learning are discussed in the framework of classification and clustering problems. The course outline is:

  • Machine Learning and Pattern Classification: in this part of the course the general concepts of Machine Learning and Pattern Recognition are introduced, with a brief review of statistics and information theory;
  • Unsupervised Learning Techniques: the most common approaches to unsupervised learning are described, mostly focusing on clustering techniques, rule induction, Bayesian networks, and density estimators using mixture models;
  • Supervised Learning Techniques: in this part of the course the most common techniques for supervised learning are described: decision trees, decision rules, Bayesian classifiers, hidden Markov models, lazy learners, etc.;
  • Feature Selection and Reduction: techniques for data reduction and feature selection are presented with theory and applications;
  • Model Validation and Selection: model validation and selection are issues orthogonal to the previous techniques; during the course the fundamentals are described and discussed (e.g., AIC, BIC, cross-validation, etc.).
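
As a quick illustration of the model-validation point above, here is a minimal k-fold cross-validation sketch in plain Python; the toy majority-class learner and all the helper names are just illustrative, not part of the course material.

```python
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle indices 0..n-1 and split them into k (nearly) equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(xs, ys, k, train, predict):
    """Average accuracy of a learner over k train/test splits."""
    folds = k_fold_indices(len(xs), k)
    accs = []
    for i, test_fold in enumerate(folds):
        # Train on all folds except the i-th, test on the i-th.
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        model = train([xs[j] for j in train_idx], [ys[j] for j in train_idx])
        correct = sum(predict(model, xs[j]) == ys[j] for j in test_fold)
        accs.append(correct / len(test_fold))
    return sum(accs) / k

# Toy learner: always predict the majority class of the training labels.
def train_majority(xs, ys):
    return max(set(ys), key=ys.count)

def predict_majority(model, x):
    return model
```

Any real classifier can be plugged in through the same train/predict pair; the structure of the loop does not change.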

A detailed schedule of the course can be found here; topics are indicative, while days and teachers are correct "up to some last-minute change".

Teaching Material

Lecture Slides on Machine Learning and Pattern Recognition

  1. Lecture 1: Introduction to Machine Learning
  2. Lecture 2: Probability for Dataminers
  3. Lecture 3: Decision Trees
  4. Lecture 4: Decision Rules
  5. Lecture 5: Bayesian Classifiers
  6. Lecture 6: Bayesian Networks
  7. Lecture 7: Markov Chains and Hidden Markov Models

Lecture Slides on Clustering

  1. Clustering Lecture 1: Introduction (03/04/2008)
  2. Clustering Lecture 2: K-Means and Hierarchical (03/04/2008)
  3. Clustering Lecture 3: Fuzzy, Gaussians, and SOM (03/04/2008)
  4. Clustering Lecture 4: Vector Space Model and PDDP
  5. Clustering Lecture 5: DBSCAN and Jarvis Patrick
  6. Clustering Lecture 6: Evaluation measures
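
To give a flavour of the K-Means lecture, here is a minimal sketch of Lloyd's algorithm in plain Python (the function names are illustrative, and the initialization is just the first k points; real implementations add random restarts):

```python
import math

def kmeans(points, k, iters=100):
    """Lloyd's algorithm: alternate nearest-centroid assignment and mean update."""
    centroids = list(points[:k])
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: each centroid moves to the mean of its assigned points.
        new = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:  # converged: no centroid moved
            break
        centroids = new
    return centroids, clusters
```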

Lecture Slides on Dimensionality Reduction and Feature Selection

  1. Dimensionality Reduction Lecture 1: Dimensionality reduction Intro
  2. Dimensionality Reduction Lecture 2: Feature extraction, PCA and LDA
  3. Dimensionality Reduction Lecture 3: Feature selection
  4. Genetic Algorithms: a rather comprehensive tutorial
  5. Algorithm Evaluation: from cross-validation to confidence intervals
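
As a small companion to the PCA lecture, here is a sketch that extracts the first principal component of 2-D data: the eigenvector of the sample covariance matrix with the largest eigenvalue, computed in closed form. This pure-Python illustration is ours; any real feature-extraction pipeline would use a linear-algebra library.

```python
import math

def principal_direction(points):
    """First principal component of 2-D data (unit eigenvector of the
    sample covariance matrix for its largest eigenvalue)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    # Sample covariance matrix [[sxx, sxy], [sxy, syy]].
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    # Largest eigenvalue of a symmetric 2x2 matrix, in closed form.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # An eigenvector for lam; handle the axis-aligned case sxy == 0.
    if sxy != 0:
        v = (lam - syy, sxy)
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)
```

Projecting each centred point onto this direction gives the 1-D feature with maximal variance.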

A few additional lecture notes

  1. Fundamental Problems for HMM: a document to introduce Hidden Markov Models and the three fundamental questions about them.
  2. An exercise on modeling and reasoning with Bayesian Networks.
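
The first of the three fundamental HMM questions, evaluation (the likelihood of an observation sequence), can be sketched with the forward algorithm. The weather/umbrella model in the test below is a standard textbook toy, not taken from the course notes.

```python
def forward_likelihood(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: P(observation sequence | HMM) by dynamic programming.
    alpha[s] = P(o_1..o_t, state_t = s), updated one observation at a time."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())
```

Summing over all states at each step (rather than maximizing, as Viterbi does for the decoding question) is what makes this the likelihood computation.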

Homework from past years; you can use it to get some practice and to prepare this year's homework ;-)

  1. Homework for the academic year 2008/2009
  2. Homework for the academic year 2007/2008
  3. Homework for the academic year 2006/2007
  4. Homework for the academic year 2005/2006