Machine Learning

Machine Learning, Carnegie Mellon University

Textbook: Machine Learning: A Guide to Current Research, Tom M. Mitchell, Jaime G. Carbonell

Machine Learning, Tom Mitchell

  • Machine Learning, Tom Mitchell. (optional)
  • Pattern Recognition and Machine Learning, Christopher Bishop. (optional)
  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Trevor Hastie, Robert Tibshirani, Jerome Friedman. (optional)

Homework Solutions | Lecture Notes | Video Lectures

Course Description: Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., programs that learn to recognize human faces, recommend music and movies, and drive autonomous robots). This course covers the theory and practical algorithms for machine learning from a variety of perspectives. We cover topics such as Bayesian networks, decision tree learning, Support Vector Machines, statistical learning methods, unsupervised learning and reinforcement learning. The course covers theoretical concepts such as inductive bias, the PAC learning framework, Bayesian learning methods, margin-based learning, and Occam's Razor. Short programming assignments include hands-on experiments with various learning algorithms, and a larger course project gives students a chance to dig into an area of their choice. This course is designed to give a graduate-level student a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in machine learning.
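As a flavor of the Bayesian learning methods named in the description, here is a minimal multinomial naive Bayes sketch. The toy word-count data and function names below are illustrative assumptions, not material from the course:

```python
from collections import Counter, defaultdict
import math

def train_nb(samples, labels):
    """Fit multinomial naive Bayes: class counts and per-class word counts."""
    priors = Counter(labels)
    word_counts = defaultdict(Counter)
    for words, y in zip(samples, labels):
        word_counts[y].update(words)
    return priors, word_counts

def predict_nb(priors, word_counts, words, alpha=1.0):
    """Pick the label maximizing log P(y) + sum_w log P(w | y), Laplace-smoothed."""
    vocab = {w for counts in word_counts.values() for w in counts}
    best, best_score = None, float("-inf")
    for y in priors:
        total = sum(word_counts[y].values())
        score = math.log(priors[y] / sum(priors.values()))
        for w in words:
            score += math.log((word_counts[y][w] + alpha) /
                              (total + alpha * len(vocab)))
        if score > best_score:
            best, best_score = y, score
    return best

# Hypothetical spam/ham word lists, just to exercise the functions.
samples = [["buy", "cheap"], ["cheap", "pills"],
           ["project", "meeting"], ["meeting", "notes"]]
labels = ["spam", "spam", "ham", "ham"]
priors, counts = train_nb(samples, labels)
print(predict_nb(priors, counts, ["cheap"]))
```

Laplace smoothing (`alpha`) keeps unseen words from driving a class probability to zero, which is the usual practical fix for the maximum-likelihood estimate.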

Machine Learning 2014, CIS520

Textbook: C. Bishop, Pattern Recognition and Machine Learning, 2007

Lecture Notes | Self-test

Topics covered: Supervised learning: least squares regression, logistic regression, perceptron, naive Bayes, support vector machines; model and feature selection; ensemble methods and boosting. Learning theory: bias/variance tradeoff; online learning. Unsupervised learning: clustering, K-means, EM, mixtures of Gaussians, PCA. Graphical models: HMMs, Bayesian and Markov networks, inference, variable elimination, …

Useful Links and Resources
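The K-means topic listed above can be sketched in a few lines of Python (Lloyd's algorithm on toy 2-D points; a sketch under those assumptions, not course code):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and centroid update."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign p to its closest current center (squared Euclidean distance)
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # recompute each center as the mean of its cluster (keep old if empty)
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated toy blobs.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, clusters = kmeans(pts, 2)
print(sorted(centers))
```

Each iteration can only decrease the within-cluster squared distance, so the loop converges; the result still depends on initialization, which is why practical implementations restart from several random seeds.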


Machine Learning 2014
University of Cambridge, Department of Engineering

Textbooks: Machine Learning: A Probabilistic Perspective (Kevin Murphy); Bayesian Reasoning and Machine Learning (David Barber).

Topics: Introduction to machine learning; the concept of a model; linear-in-the-parameters regression; probability basics; Bayesian inference and prediction; marginal likelihood; Gaussian processes; probabilistic ranking; text and discrete distributions; mixture models for text; the Latent Dirichlet Allocation (LDA) model.

Lectures and slides


Machine Learning, CS 229
Lecture notes




Foundations of Machine Learning

This course introduces the fundamental concepts and methods of machine learning, including the description and analysis of several modern algorithms, their theoretical basis, and the illustration of their applications. Many of the algorithms described have been successfully used in text and speech processing, bioinformatics, and other areas in real-world products and services. The main topics covered are:

  • Lecture 00: Introduction to convex optimization.
  • Lecture 01: Introduction to machine learning, basic definitions, probability tools.
  • Lecture 02: PAC model, guarantees for learning with finite hypothesis sets.
  • Lecture 03: Rademacher complexity, growth function, VC-dimension, learning bounds for infinite hypothesis sets.
  • Lecture 04: Support vector machines (SVMs), margin bounds.
  • Lecture 05: Kernel methods.
  • Lecture 06: Boosting.
  • Lecture 07: Density estimation, Maxent models, multinomial logistic regression.
  • Lecture 08: On-line learning.
  • Lecture 09: Ranking.
  • Lecture 10: Multi-class classification.
  • Lecture 11: Regression.
  • Lecture 12: Reinforcement learning.
  • Lecture 13: Learning languages.
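As a small illustration of the on-line learning setting in this list, a mistake-driven perceptron can be sketched as follows (toy linearly separable data with labels in {-1, +1}; an assumption-laden sketch, not lecture code):

```python
def perceptron(stream, epochs=10):
    """On-line perceptron: update the weights only when a mistake is made."""
    w = [0.0] * len(stream[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in stream:
            # a mistake is a non-positive margin y * (w . x + b)
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

# Hypothetical separable stream: negatives on the left, positives on the right.
data = [((-2.0, 1.0), -1), ((-1.0, -1.0), -1),
        ((2.0, 1.0), 1), ((1.0, 2.0), 1)]
w, b = perceptron(data)
print(w, b)
```

The classical mistake bound says the number of updates is at most (R/gamma)^2 for data of radius R and margin gamma, independent of the stream length, which is the flavor of guarantee the on-line learning lecture formalizes.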

Homework solution reference


Machine Learning

Handouts | Lecture Notes | Section Notes | Other Resources

Introduction to machine learning and statistical pattern recognition. Topics include: supervised learning (generative/discriminative learning, parametric/non-parametric learning, neural networks, support vector machines); unsupervised learning (clustering, dimensionality reduction, kernel methods); learning theory (bias/variance tradeoffs; VC theory; large margins); reinforcement learning and adaptive control. The course will also discuss recent applications of machine learning, such as to robotic control, data mining, autonomous navigation, bioinformatics, speech recognition, and text and web data processing.
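As an example of the discriminative supervised learning covered here, a minimal logistic regression trained by stochastic gradient ascent on the log-likelihood might look like this (toy 1-D data; the learning rate and epoch count are arbitrary choices, not values from the course):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=200):
    """Stochastic gradient ascent for logistic regression with labels in {0, 1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = yi - p  # gradient of the per-example log-likelihood
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

# Hypothetical 1-D data: class 0 below the boundary, class 1 above it.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]
w, b = train_logreg(X, y)
print(w, b)
```

The update `y - p` falls out of differentiating the Bernoulli log-likelihood, which is why the same code shape reappears for other generalized linear models.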

Engineering Mechanics-Dynamics

Textbook: Engineering Mechanics: Dynamics, 11th Edition, Hibbeler.

Exam Solutions | Homework | Video Lectures | Notes

Introduction; rectilinear motion; circular motion; projectile motion; x-y parametric equations; path known; n-t (non-circular) coordinates; dependent and relative motion; plane relative motion; fixed-axis rotation; absolute and relative motion analysis; relative velocity; instantaneous center of zero velocity; relative acceleration; equations of motion for a particle and a rigid body (F = ma); translation; fixed-axis rotation; general plane motion; review of particle and rigid-body dynamics; work-energy for particles; work-energy for rigid bodies; linear impulse-momentum; conservation of momentum.
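The projectile-motion topic above reduces to constant-acceleration kinematics; here is a small sketch, assuming launch from ground level with no air resistance and g = 9.81 m/s²:

```python
import math

def projectile(v0, angle_deg, g=9.81):
    """Flight time, horizontal range, and peak height for a ground-level launch.

    Uses t = 2*v0*sin(theta)/g, R = vx*t, h = vy^2/(2g).
    """
    theta = math.radians(angle_deg)
    vx = v0 * math.cos(theta)
    vy = v0 * math.sin(theta)
    t_flight = 2.0 * vy / g
    return {"time": t_flight, "range": vx * t_flight, "height": vy ** 2 / (2.0 * g)}

# Example: 20 m/s at 45 degrees, the angle that maximizes range.
print(projectile(20.0, 45.0))
```

At 45 degrees the range formula collapses to v0²/g, a quick sanity check on the numbers the function returns.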



