3D Structure From Visual Motion
Revision as of 15:24, 13 September 2013
Recent news you should be aware of ...
- A new version of the course is scheduled for the year 2013/2014
This is the description page for the PhD course on 3D Structure from Visual Motion: Novel Techniques in Computer Vision and Autonomous Robots/Vehicles. The course can also be taken by Computer Engineering students in the Laurea Magistrale track.
Course Aim & Organization
Simultaneously estimating the unknown motion of a camera (or of the vehicle the camera is mounted on) while reconstructing the 3D structure of the observed world is a challenging task that has been studied in depth in the recent literature. The PhD course on 3D Structure from Visual Motion: Novel Techniques in Computer Vision and Autonomous Robots/Vehicles presents modern techniques for this joint estimation problem, with applications in fields such as 3D reconstruction, autonomous robot navigation, aerial/field surveying, and unmanned vehicle maneuvering.
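One of the basic building blocks behind this joint estimation is triangulation: once the camera poses for two views are known (or have been estimated), each pair of corresponding image points yields a 3D point. The following is a minimal NumPy sketch of linear (DLT) triangulation on made-up cameras and a made-up point; it is an illustration under those assumptions, not course code.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: 2D image points (inhomogeneous) in each view.
    Returns the 3D point in inhomogeneous coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X:
    #   x * (P[2] @ X) - P[0] @ X = 0   and   y * (P[2] @ X) - P[1] @ X = 0
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A, i.e. the last right-singular vector.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with camera P, returning inhomogeneous pixel coords."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two hypothetical calibrated cameras: identity pose, and a unit translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# A known 3D point, projected into both views and then recovered.
X_true = np.array([0.5, -0.2, 4.0])
x1, x2 = project(P1, X_true), project(P2, X_true)
X_est = triangulate(P1, P2, x1, x2)
print(np.allclose(X_est, X_true, atol=1e-6))  # True
```

With noise-free correspondences the DLT recovers the point exactly (up to numerical precision); with real, noisy matches the same linear system is solved in a least-squares sense.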
Teachers
Although formally assigned to just one of the teachers (myself), the course is also held by (in order of appearance)
Course Schedule
Below you find a tentative schedule for the course; the lecturer for each specific topic is given in brackets.
- 3D Vision Basics
- Course introduction
- Feature extraction, matching and tracking
- Projection model and projection matrix
- Fundamental and Essential matrices
- Structure from Motion and Visual Odometry
- Optical flow
- Combined estimation of 3D structure and camera egomotion
- Motion extraction and 3D reconstruction
- Unconventional Visual Odometry
- Uncalibrated visual odometry
- Omnidirectional odometry
- Simultaneous Localization and Mapping
- From Bayesian Filtering to SLAM
- EKF-Based SLAM
- Visual SLAM
- EKF-based Monocular SLAM
- Stereo and Omnidirectional visual SLAM
- Why filters? PTAM and FrameSLAM
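As a concrete taste of one of the topics above (fundamental and essential matrices), the classical normalized eight-point algorithm can be sketched in plain NumPy. The cameras, intrinsics, and synthetic points below are made up for illustration; this is a sketch of the standard technique, not material from the lectures.

```python
import numpy as np

def normalize(pts):
    """Similarity transform taking points to zero mean, mean distance sqrt(2)."""
    c = pts.mean(axis=0)
    d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / d
    T = np.array([[s, 0, -s * c[0]],
                  [0, s, -s * c[1]],
                  [0, 0, 1.0]])
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ T.T
    return ph, T

def eight_point(x1, x2):
    """Estimate F such that x2_h^T F x1_h = 0 for corresponding points."""
    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    # Each correspondence gives one row of the linear system A f = 0,
    # with f the row-major entries of F.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1)),
    ])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 (singularity) constraint on F.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    # Undo the normalization and fix the scale.
    F = T2.T @ F @ T1
    return F / np.linalg.norm(F)

# Synthetic test: two cameras with hypothetical intrinsics, related by a translation.
rng = np.random.default_rng(0)
K = np.array([[500.0, 0, 320], [0, 500, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[1.0], [0.2], [0.0]])])
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))
Xh = np.hstack([X, np.ones((20, 1))])
x1 = Xh @ P1.T; x1 = x1[:, :2] / x1[:, 2:]
x2 = Xh @ P2.T; x2 = x2[:, :2] / x2[:, 2:]

F = eight_point(x1, x2)
# Epipolar residuals x2_h^T F x1_h should be ~0 for noise-free correspondences.
h1 = np.hstack([x1, np.ones((20, 1))])
h2 = np.hstack([x2, np.ones((20, 1))])
res = np.abs(np.einsum('ij,jk,ik->i', h2, F, h1))
print(res.max())
```

Normalizing the points before building the linear system (Hartley's conditioning step) is what makes the eight-point algorithm numerically usable on pixel coordinates; in practice the same estimator is wrapped in RANSAC to reject bad matches.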
Course Material & References
The following is some suggested material to follow the course lectures.
Slides and lecture notes
- Correspondence analysis and RANSAC (2011-2012 ed.)
- Camera geometry, single view, and two view geometry material
- Two view geometry and visual odometry material
- Optical flow tracking and egomotion estimation
- Structure from Motion
- Bayesian Filtering, Kalman Filtering, and SLAM
- Simultaneous Localization and Mapping a.k.a. SLAM!
- Monocular SLAM
- Stereo and Omnidirectional SLAM
- Panoramic Visual Odometry
- Parallel Tracking and Mapping
- 3D without 3D
Suggested Bibliography
- R. Hartley, A. Zisserman. Multiple View Geometry in Computer Vision, Cambridge University Press, March 2004.
- S. Thrun, W. Burgard, D. Fox. Probabilistic Robotics, MIT Press, September 2005.
- Papers you might find useful to deepen your study:
- Simultaneous Localization and Mapping (SLAM): Part I, The Essential Algorithms. H. Durrant-Whyte, T. Bailey [1]
- Unified Inverse Depth Parametrization for Monocular SLAM by J.M.M. Montiel, Javier Civera, and Andrew J. Davison [2]
- Parallel Tracking and Mapping for Small AR Workspaces by Georg Klein and David Murray [3]
- FrameSLAM: from Bundle Adjustment to Realtime Visual Mapping by Kurt Konolige and Motilal Agrawal [4]
Libraries and Demos
TBC
Course Evaluation
The course evaluation will be based on a project, which may also be completed in groups of two. For PhD students, this project could (and should) be related to their research interests.