# Machine Learning Series

Putting the pieces of Machine Learning (ML) together is not easy. It involves knowledge from multiple disciplines. Let’s try to approach the topics more systematically. Some people may avoid the math in ML, but without it you will not get far. Fortunately, much of it is high-school math. The real issue is whether it is introduced properly. So I will not shy away from it, and I will give you enough background for it.

# Fundamentals

ML algorithms draw on areas including information theory, probability, data distributions, Bayesian inference, matrix decomposition, and statistics. We will cover them first in this article on fundamentals.

The second article covers the basic ML algorithms, including linear regression, classification, kernels, and Bayesian linear regression. You may know some of these topics already, so we will be brief for those well-understood areas. But we will spend more time on Gaussian Processes and SVMs since their concepts are relatively complex. We will also detail many different approaches to clustering and decision trees. I will also cover basic ML concepts including the cost function, regularization, and model optimization (MLE, MAP). We will also cover some deep learning topics, including normalization, LSTM, and energy-based models like the RBM.

These two articles involve many proofs. In order not to overwhelm you with details, the proofs are separated into a different article.

# Linear algebra

You may be overwhelmed by the math in many research papers. Probability distributions, calculus, linear algebra, statistics, and optimization are the major areas of math that we need to learn for ML. The last two are huge topics, so I will cover them only when needed. If you had problems understanding the linear algebra in the previous articles, the article below can refresh your memory. Pay special attention to areas related to matrix factorization; it is used heavily in dimensionality reduction in ML.

Eigenvectors may look simple, but they are a powerful concept and play a major role in ML. Google ranks pages based on a similar concept. Here we will dedicate an article to it.
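To make the idea concrete, here is a minimal sketch of PageRank-style power iteration on a hypothetical 4-page web graph (the link structure and damping factor are made up for illustration): the rank vector converges to the principal eigenvector of the damped link matrix.

```python
import numpy as np

# A hypothetical 4-page link matrix: column j holds the out-link
# probabilities of page j (each column sums to 1).
M = np.array([
    [0.0, 0.5, 0.0, 0.0],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.5, 1.0, 0.0],
])

d = 0.85                                    # damping factor
n = M.shape[0]
G = d * M + (1 - d) / n * np.ones((n, n))   # "Google matrix"

# Power iteration: repeatedly applying G drives r toward the
# eigenvector of G with eigenvalue 1 (the principal eigenvector).
r = np.ones(n) / n
for _ in range(100):
    r = G @ r
```

The key point is that the ranking is nothing more than an eigenvector computation: after convergence, `G @ r` equals `r`.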

If you are irritated by hearing “the matrix is positive semidefinite and therefore …”, you may take a look here at some of the properties of the special matrices constantly used in ML.
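As a quick illustration of where such claims come from: any Gram matrix AᵀA is positive semidefinite, since xᵀAᵀAx = ‖Ax‖² ≥ 0. A small numerical sketch with randomly generated data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# The Gram matrix A^T A is symmetric positive semidefinite.
S = A.T @ A

# All eigenvalues of a PSD matrix are non-negative.
eigvals = np.linalg.eigvalsh(S)   # eigvalsh: for symmetric matrices

# A positive definite matrix also admits a Cholesky factorization
# S = L L^T (a tiny jitter guards against numerical rank deficiency).
L = np.linalg.cholesky(S + 1e-10 * np.eye(3))
```

Properties like these (non-negative eigenvalues, Cholesky factorization) are exactly what papers invoke after the phrase “… is positive semidefinite”.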

SVD and PCA are core topics in ML. Imagine that we can use linear algebra to untangle data and discover its structure. Instead of processing raw data, we start learning knowledge.
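A minimal sketch of that untangling on synthetic data: PCA via SVD recovers the single direction along which the data mostly varies.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 2-D data: 200 points that mostly vary along one hidden direction.
z = rng.standard_normal(200)
X = np.column_stack([z, 0.5 * z + 0.1 * rng.standard_normal(200)])

Xc = X - X.mean(axis=0)                       # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; s**2 measures the variance
# captured by each direction.
explained = s**2 / (s**2).sum()

# Projecting onto the first principal direction gives the 1-D structure.
scores = Xc @ Vt[0]
```

Here the first component captures almost all the variance: the 2-D raw data is really 1-D structure plus noise, which is the essence of dimensionality reduction.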

# Optimization

Many AI problems are rephrased as optimization problems. People may already be familiar with conventional methods like gradient descent. We will cover more advanced optimization methods here. The first one is the Lagrange multiplier, for optimizing problems under constraints.
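As a tiny worked example (a toy problem chosen purely for illustration), the Lagrange multiplier turns a constrained minimization into solving the stationarity conditions of the Lagrangian:

```python
import numpy as np

# Toy problem: minimize f(x, y) = x^2 + y^2 subject to g(x, y) = x + y - 1 = 0.
# Lagrangian: L(x, y, lam) = x^2 + y^2 - lam * (x + y - 1).
# Setting the partial derivatives to zero gives a linear system:
A = np.array([
    [2.0, 0.0, -1.0],   # dL/dx = 2x - lam = 0
    [0.0, 2.0, -1.0],   # dL/dy = 2y - lam = 0
    [1.0, 1.0,  0.0],   # the constraint: x + y = 1
])
b = np.array([0.0, 0.0, 1.0])
x, y, lam = np.linalg.solve(A, b)
# Solution: x = y = 0.5 with lam = 1, i.e. the gradient of f is
# parallel to the gradient of g at the constrained minimum.
```

The geometric picture — ∇f = λ∇g at the optimum — is exactly what the multiplier λ encodes.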

Many machine learning (ML) problems deal with a dilemma: if we know latent factor A, we can determine latent factor B, and if we know B, we can determine A. Either way solves the ML problem. Unfortunately, we know neither A nor B. In ML, this can be solved by one powerful algorithm called the Expectation-Maximization (EM) algorithm.
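To sketch the idea, here is a minimal EM loop for a two-component 1-D Gaussian mixture on synthetic data: the component assignments play the role of B and the component parameters the role of A, and EM alternates between guessing one given the other.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic data from two 1-D Gaussians; which component generated
# each point is the hidden (latent) factor.
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

mu = np.array([-1.0, 1.0])      # initial guesses for the means
sigma = np.array([1.0, 1.0])    # initial standard deviations
pi = np.array([0.5, 0.5])       # mixing weights

for _ in range(50):
    # E-step: given the parameters, infer soft component assignments
    # (the normalizing constant of the Gaussian cancels in the ratio).
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: given the assignments, re-estimate the parameters.
    Nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)
    pi = Nk / len(x)
# mu converges near the true means (-2, 3)
```

Neither the assignments nor the parameters were known at the start, yet alternating the two estimates converges to both.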

# Graphical model

The graphical model (GM) is a branch of ML which uses a graph to represent a domain problem. Many ML algorithms can be generalized and discussed as GMs. Studying them gives us a bird’s-eye view of many ML algorithms.

# Inference

Inference is about making predictions from the data that we know. Here, we perform exact inference when the GM is known.
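As a small illustration, exact inference on a tiny hypothetical Bayesian network (all probabilities are made up) just sums the joint distribution over the unobserved variables:

```python
# A hypothetical 3-node network: Rain -> Sprinkler, and
# {Rain, Sprinkler} -> WetGrass.
P_rain = {1: 0.2, 0: 0.8}
P_sprinkler = {1: {1: 0.01, 0: 0.99},          # keyed by Rain
               0: {1: 0.40, 0: 0.60}}
P_wet = {(1, 1): {1: 0.99, 0: 0.01},           # keyed by (Rain, Sprinkler)
         (1, 0): {1: 0.80, 0: 0.20},
         (0, 1): {1: 0.90, 0: 0.10},
         (0, 0): {1: 0.00, 0: 1.00}}

def joint(r, s, w):
    """The GM factorizes the joint into its conditional tables."""
    return P_rain[r] * P_sprinkler[r][s] * P_wet[(r, s)][w]

# Exact inference of P(Rain=1 | Wet=1): sum out the sprinkler variable.
num = sum(joint(1, s, 1) for s in (0, 1))
den = sum(joint(r, s, 1) for r in (0, 1) for s in (0, 1))
posterior = num / den
```

Enumeration like this is exact but scales exponentially with the number of hidden variables, which is why the approximate methods below matter.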

But finding an exact solution is usually hard. Here, we will approximate it with linear programming and dual decomposition.

Next, we switch gears to find the solution by reconstructing p(x). This can be done through sampling, with methods like Monte Carlo sampling, importance sampling, the Metropolis algorithm, and Gibbs sampling that are commonly used in ML.
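A minimal sketch of the Metropolis algorithm, targeting a hypothetical unnormalized Gaussian density (the key point is that we never need the normalizing constant):

```python
import numpy as np

rng = np.random.default_rng(3)

def p(x):
    """Unnormalized target density, proportional to N(2, 0.5^2)."""
    return np.exp(-0.5 * ((x - 2.0) / 0.5) ** 2)

samples = []
x = 0.0
for _ in range(20000):
    x_new = x + rng.normal(0.0, 1.0)       # symmetric random-walk proposal
    # Metropolis acceptance: always accept uphill moves; accept
    # downhill moves with probability p(x_new) / p(x).
    if rng.random() < min(1.0, p(x_new) / p(x)):
        x = x_new
    samples.append(x)

samples = np.array(samples[2000:])          # discard burn-in
```

The retained samples approximate draws from p(x): their mean and standard deviation approach 2.0 and 0.5.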

In ML, there are two major approximation approaches: sampling and variational inference. In the next article, we model the posterior with variational inference, using methods like mean-field variational inference.
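A minimal sketch of mean-field coordinate updates on a toy bivariate Gaussian target (the numbers are made up): we approximate p(x₁, x₂) by a factorized q(x₁)q(x₂), for which the optimal Gaussian factors have closed-form updates.

```python
import numpy as np

# Toy target: a bivariate Gaussian with mean mu and precision matrix Lam.
mu = np.array([1.0, -1.0])
Lam = np.array([[2.0, 0.8],
                [0.8, 1.5]])    # precision = inverse covariance

# Mean-field: q(x1, x2) = q1(x1) q2(x2). For a Gaussian target the optimal
# factors are Gaussians, and each factor's mean is updated holding the
# other fixed (coordinate ascent on the variational objective).
m = np.zeros(2)                 # variational means, initialized at 0
for _ in range(50):
    m[0] = mu[0] - (Lam[0, 1] / Lam[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (Lam[1, 0] / Lam[1, 1]) * (m[0] - mu[0])
# The means converge to the true mean, while the factorized variances
# 1/Lam[0,0] and 1/Lam[1,1] underestimate the true marginal variances --
# a well-known limitation of the mean-field approximation.
```

This captures the trade-off versus sampling: variational inference is fast and deterministic, but the factorization can miss correlations in the true posterior.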

To conclude our discussion of GMs, we will look at how to learn the model.

# ML topics

Finally, we will look into some advanced ML algorithms.