The purpose of this course is to address how to algorithmically generate models and hypotheses that usefully describe data. Doing so requires understanding what is meant by data, models, hypotheses, and usefulness.

The material draws on a range of topics, from mathematics and probability to statistics, algorithms, and data science. This will generally be a fairly math-intensive course, and familiarity with probability is a must. I repeat, familiarity with probability is a must. A rough outline of topics for the class is as follows:

  • Probability Theory and Estimation
  • Decision Lists and Trees
  • Classification
  • Support Vector Machines
  • Regression: Ridge and Lasso
  • Statistical Learning Theory
  • Boosting and Ensemble Methods
  • Graphical Models
  • Optimization
  • Semi-Supervised Learning
  • Reinforcement Learning
  • Deep Learning
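
To give a taste of the first topic above, here is a minimal sketch of a basic estimation task: the maximum-likelihood estimate of a Bernoulli coin's bias, which is simply the sample mean. This is an illustrative example, not material from the course notes.

```python
import random

def bernoulli_mle(samples):
    """Maximum-likelihood estimate of the Bernoulli parameter p.

    For i.i.d. 0/1 observations, the likelihood p^k (1-p)^(n-k) is
    maximized at p = k/n, i.e. the sample mean.
    """
    return sum(samples) / len(samples)

# Simulate 10,000 flips of a coin with (hypothetical) true bias 0.3.
random.seed(0)
true_p = 0.3
samples = [1 if random.random() < true_p else 0 for _ in range(10_000)]

print(bernoulli_mle(samples))  # should be close to 0.3
```

With enough samples the estimate concentrates around the true parameter; quantifying how fast, and with what guarantees, is exactly the kind of question the estimation and PAC-learning parts of the course take up.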
Notes:
  • Probability Notes (Coming Soon)
  • Estimation Notes (Coming Soon)
  • Hypothesis Notes (Coming Soon)
  • PAC Notes (Coming Soon)
  • Logistic Regression Notes (Coming Soon)
  • Optimization Notes (Coming Soon)
  • Decision Tree Notes (Coming Soon)
  • Linear Regression Notes (Coming Soon)
  • Deep Learning Notes (Coming Soon)
  • Boosting Notes (Coming Soon)
  • Graphical Model Notes (Coming Soon)
  • Learning Bayesian Networks (Coming Soon)
  • Word Embedding Notes (Coming Soon)
  • Variational Auto Encoder Notes (Coming Soon)
  • Perceptron Notes (Coming Soon)
  • Perceptron Complexity Notes (Coming Soon)
  • Support Vector Machine Notes (Coming Soon)
  • Non-Linear Support Vector Machine Notes (Coming Soon)
  • What is the Gaussian Kernel Even Doing? (Coming Soon)
  • Worked SVM Example (Coming Soon)
  • Recurrent Neural Network Notes (Coming Soon)