Machine Learning Carnegie Mellon University
Textbook: Machine Learning: A Guide to Current Research, Tom M. Mitchell, Jaime G. Carbonell
- Machine Learning, Tom Mitchell. (optional)
- Pattern Recognition and Machine Learning, Christopher Bishop. (optional)
- The Elements of Statistical Learning: Data Mining, Inference and Prediction, Trevor Hastie, Robert Tibshirani, Jerome Friedman. (optional)
Course Description: Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., programs that learn to recognize human faces, recommend music and movies, and drive autonomous robots). This course covers the theory and practical algorithms for machine learning from a variety of perspectives. We cover topics such as Bayesian networks, decision tree learning, Support Vector Machines, statistical learning methods, unsupervised learning and reinforcement learning. The course covers theoretical concepts such as inductive bias, the PAC learning framework, Bayesian learning methods, margin-based learning, and Occam’s Razor. Short programming assignments include hands-on experiments with various learning algorithms, and a larger course project gives students a chance to dig into an area of their choice. This course is designed to give a graduate-level student a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in machine learning.
Machine Learning - CIS520 (2014)
Topics covered: Supervised learning: least squares regression, logistic regression, perceptron, naive Bayes, support vector machines. Model and feature selection, ensemble methods, boosting. Learning theory: bias/variance tradeoff. Online learning. Unsupervised learning: clustering, k-means, EM, mixture of Gaussians, PCA. Graphical models: HMMs, Bayesian and Markov networks; inference; variable elimination.
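As a concrete illustration of one of the listed topics, here is a minimal sketch of k-means clustering (Lloyd's algorithm) on toy 1-D data; the data and function here are illustrative, not course material:

```python
# Toy sketch of k-means clustering (Lloyd's algorithm) on 1-D data.
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct data points at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each point joins its nearest centroid.
        labels = np.argmin(np.abs(X[:, None] - centroids[None, :]), axis=1)
        # Update step: each centroid moves to the mean of its points.
        centroids = np.array([X[labels == j].mean() for j in range(k)])
    return centroids, labels

X = np.array([0.0, 0.1, 0.2, 9.8, 9.9, 10.0])
centroids, labels = kmeans(X, k=2)
print(sorted(centroids.round(1)))  # two well-separated cluster centers
```

Real implementations also handle empty clusters and check for convergence; this sketch omits both for brevity.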
UNIVERSITY OF CAMBRIDGE Department of Engineering
Introduction to Machine Learning, the concept of a model, linear-in-the-parameters regression; probability basics, Bayesian inference and prediction, marginal likelihood, Gaussian processes, probabilistic ranking, text and discrete distributions, mixture models for text, the Latent Dirichlet Allocation (LDA) model.
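To give a flavor of the Bayesian inference and prediction topic, here is a minimal conjugate-prior sketch; the coin-flip example and all numbers are assumptions for illustration, not course code:

```python
# Minimal sketch of Bayesian inference for a coin's bias theta,
# using a conjugate Beta prior (illustrative example only).
# Prior: theta ~ Beta(a, b); likelihood: heads ~ Bernoulli(theta).
a, b = 1.0, 1.0           # uniform prior over theta
heads, tails = 7, 3       # observed data

# Conjugacy: the posterior is Beta(a + heads, b + tails).
a_post, b_post = a + heads, b + tails

# Posterior mean, which for the Bernoulli model also equals the
# posterior predictive probability of heads on the next flip.
post_mean = a_post / (a_post + b_post)
print(post_mean)
```

The same update-the-counts pattern underlies the discrete-distribution and text-modeling material listed above.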
Machine Learning – CS 229
- Lecture notes 1 (ps) (pdf) Supervised Learning, Discriminative Algorithms
- Lecture notes 2 (ps) (pdf) Generative Algorithms
- Lecture notes 3 (ps) (pdf) Support Vector Machines
- Lecture notes 4 (ps) (pdf) Learning Theory
- Lecture notes 5 (ps) (pdf) Regularization and Model Selection
- Lecture notes 6 (ps) (pdf) Online Learning and the Perceptron Algorithm
- Lecture notes 7a (ps) (pdf) Unsupervised Learning, k-means Clustering
- Lecture notes 7b (ps) (pdf) Mixture of Gaussians
- Lecture notes 8 (ps) (pdf) The EM Algorithm
- Lecture notes 9 (ps) (pdf) Factor Analysis
- Lecture notes 10 (ps) (pdf) Principal Components Analysis
- Lecture notes 11 (ps) (pdf) Independent Components Analysis
- Lecture notes 12 (ps) (pdf) Reinforcement Learning and Control
- Handout #1: Course Information (HTML)
- Handout #2: Course Schedule (HTML)
- Handout #3: Cover Sheet
- Problem Set 1 (pdf)
- Handout #4: Final Project Guidelines (PDF)
- Midterm Solutions
- Practice Midterm, Solutions
- Andrew’s Deep Learning Presentation Slides
- Other resources
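As a small companion to the principal components analysis material (lecture notes 10), here is a hedged sketch of PCA via the singular value decomposition on synthetic 2-D data; all data and thresholds are illustrative, not taken from the notes:

```python
# Sketch of principal components analysis via the SVD on toy data.
import numpy as np

rng = np.random.default_rng(0)
# 2-D points that vary mostly along the direction (1, 1)/sqrt(2).
t = rng.normal(size=(200, 1))
X = t @ np.array([[1.0, 1.0]]) + 0.05 * rng.normal(size=(200, 2))

Xc = X - X.mean(axis=0)                  # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                              # first principal direction
explained = S[0]**2 / np.sum(S**2)       # fraction of variance explained

print(np.abs(pc1))       # close to [0.707, 0.707]
print(explained > 0.99)  # one direction dominates
```

Using the SVD of the centered data avoids forming the covariance matrix explicitly, which is the numerically preferred route in practice.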
This course introduces the fundamental concepts and methods of machine learning, including the description and analysis of several modern algorithms, their theoretical basis, and the illustration of their applications. Many of the algorithms described have been successfully used in text and speech processing, bioinformatics, and other areas in real-world products and services. The main topics covered are:
- Lecture 00: Introduction to convex optimization.
- Lecture 01: Introduction to machine learning, basic definitions, probability tools.
- Lecture 02: PAC model, guarantees for learning with finite hypothesis sets.
- Lecture 03: Rademacher complexity, growth function, VC-dimension, learning bounds for infinite hypothesis sets.
- Lecture 04: Support vector machines (SVMs), margin bounds.
- Lecture 05: Kernel methods.
- Lecture 06: Boosting.
- Lecture 07: Density estimation, Maxent models, multinomial logistic regression.
- Lecture 08: On-line learning.
- Lecture 09: Ranking.
- Lecture 10: Multi-class classification.
- Lecture 11: Regression.
- Lecture 12: Reinforcement learning.
- Lecture 13: Learning languages.
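To give a flavor of the on-line learning topic (Lecture 08), here is a minimal sketch of the perceptron's mistake-driven update on toy linearly separable data; the data and the omission of a bias term are simplifying assumptions, not course code:

```python
# Sketch of the classic perceptron update on separable toy data.
import numpy as np

X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])        # labels in {-1, +1}
w = np.zeros(2)                     # weight vector (no bias, for simplicity)

for _ in range(10):                 # a few passes over the data
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:      # mistake (or on the boundary):
            w = w + yi * xi         # perceptron update

preds = np.sign(X @ w)
print(preds)
```

The on-line analysis bounds the total number of such mistakes in terms of the margin of the data, which connects this lecture back to the margin bounds of Lecture 04.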
Machine Learning
Introduction to machine learning and statistical pattern recognition. Topics include: supervised learning (generative/discriminative learning, parametric/non-parametric learning, neural networks, support vector machines); unsupervised learning (clustering, dimensionality reduction, kernel methods); learning theory (bias/variance tradeoffs, VC theory, large margins); reinforcement learning and adaptive control. The course will also discuss recent applications of machine learning to robotic control, data mining, autonomous navigation, bioinformatics, speech recognition, and text and web data processing.
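As an illustration of a discriminative learner from the topic list, here is a minimal logistic-regression sketch trained by batch gradient ascent on the log-likelihood; the toy 1-D data, learning rate, and iteration count are assumptions for illustration, not course code:

```python
# Sketch of logistic regression via gradient ascent on toy 1-D data.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w, b = 0.0, 0.0
lr = 0.5

for _ in range(500):
    p = sigmoid(X[:, 0] * w + b)         # predicted P(y = 1 | x)
    grad_w = np.sum((y - p) * X[:, 0])   # gradient of the log-likelihood
    grad_b = np.sum(y - p)
    w += lr * grad_w                     # ascent step
    b += lr * grad_b

preds = (sigmoid(X[:, 0] * w + b) > 0.5).astype(float)
print(preds)
```

Modeling P(y | x) directly, rather than the joint P(x, y), is what distinguishes this discriminative approach from a generative one such as naive Bayes.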