Machine Learning

Offered by: edX

Overview

Machine Learning is the basis for the most exciting careers in data analysis today. You’ll learn the models and methods and apply them to real-world situations ranging from identifying trending news topics to building recommendation engines, ranking sports teams, and plotting the path of movie zombies.

Major perspectives covered include:

  • probabilistic versus non-probabilistic modeling
  • supervised versus unsupervised learning

Topics include: classification and regression, clustering methods, sequential models, matrix factorization, topic modeling and model selection.

Methods include: linear and logistic regression, support vector machines, tree classifiers, boosting, maximum likelihood and MAP inference, EM algorithm, hidden Markov models, Kalman filters, k-means, Gaussian mixture models, among others.

In the first half of the course we will cover supervised learning techniques for regression and classification. In this framework, we possess an output or response that we wish to predict based on a set of inputs. We will discuss several fundamental methods for performing this task and algorithms for their optimization. Our approach will be practically motivated: we will fully develop a mathematical understanding of each algorithm, but only briefly touch on abstract learning theory.
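
To make the supervised setup concrete, here is a minimal least-squares linear regression sketch in Python with NumPy on synthetic data; it is illustrative only, not code provided by the course.

```python
import numpy as np

# Synthetic supervised-learning data: inputs X and a noisy linear response y.
rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=n)

# Least squares: choose w minimizing ||y - Xw||^2,
# solved here via the normal equations (X^T X) w = X^T y.
w_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Use the fitted weights to predict the response for a new input.
x_new = rng.normal(size=d)
print("estimated weights:", np.round(w_hat, 3))
print("prediction for x_new:", x_new @ w_hat)
```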

In the second half of the course we shift to unsupervised learning techniques. In these problems, the end goal is less clear-cut than predicting an output based on a corresponding input. We will cover three fundamental problems of unsupervised learning: data clustering, matrix factorization, and sequential models for order-dependent data. Applications of these models include object recommendation and topic modeling.
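
For a taste of the unsupervised setting, here is a minimal k-means clustering sketch on synthetic data (k-means appears in Week 7 of the syllabus below); again an illustrative sketch, not course material.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain k-means: alternate between assigning each point to its
    nearest centroid and recomputing each centroid as a cluster mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest centroid index for every point.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its points
        # (keep the old centroid if a cluster ends up empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated synthetic blobs in 2-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
print("centroids:\n", centroids)
```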

Syllabus

Week 1: maximum likelihood estimation, linear regression, least squares
Week 2: ridge regression (sketched after this syllabus), bias-variance, Bayes rule, maximum a posteriori inference
Week 3: Bayesian linear regression, sparsity, subset selection for linear regression
Week 4: nearest neighbor classification, Bayes classifiers, linear classifiers, perceptron
Week 5: logistic regression, Laplace approximation, kernel methods, Gaussian processes
Week 6: maximum margin, support vector machines, trees, random forests, boosting
Week 7: clustering, k-means, EM algorithm, missing data
Week 8: mixtures of Gaussians, matrix factorization
Week 9: non-negative matrix factorization, latent factor models, PCA and variations
Week 10: Markov models, hidden Markov models
Week 11: continuous state-space models, association analysis
Week 12: model selection, next steps
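
As a taste of how these weekly topics translate into code, here is a minimal ridge regression sketch (Week 2), whose closed-form solution can be read as the maximum a posteriori estimate under a Gaussian prior on the weights; this is an illustrative sketch rather than course-provided code.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge regression: solve (lam * I + X^T X) w = X^T y.
    Equivalently, the MAP estimate of w with a zero-mean Gaussian
    prior on the weights and Gaussian observation noise."""
    d = X.shape[1]
    return np.linalg.solve(lam * np.eye(d) + X.T @ X, X.T @ y)

# As lam grows, the estimated weights shrink toward zero.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 5))
y = X @ np.array([2.0, -1.0, 0.0, 0.5, 3.0]) + 0.1 * rng.normal(size=50)
for lam in (0.0, 1.0, 100.0):
    print(lam, np.round(ridge(X, y, lam), 3))
```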

Taught by

Professor John W. Paisley

Details

  • Platform: edX
  • Price: Free
  • Language: English
  • Certificate: Not available
  • Schedule: Certain days
  • Level: Advanced