Statistical & Financial Consulting by Stanford PhD

Linear Discriminant Analysis (LDA) is a classification technique developed by Ronald Fisher. In a classification setting we need to solve the following problem. We observe N objects. For each object we know the values of variables X_{1}, ..., X_{p}. We also know that the objects are split into classes 1, 2, ..., K, and for each object we know its class membership. We need to develop a statistical method that allows us to identify the class membership of a new object for which only the values of X_{1}, ..., X_{p} are known.

Linear discriminant analysis assumes that, within each class i, the features X_{1}, ..., X_{p} have a joint normal distribution. The random vector (X_{1}, ..., X_{p}) has mean m_{i} = (m_{i1}, ..., m_{ip}) within class i, while the covariance matrix is assumed to be the same in all classes. For different classes i and j, the means m_{i} and m_{j} are different, which gives us the ability to distinguish between the classes.

The parameters of the joint normal distributions are estimated by maximum likelihood. Because of the normality assumption, maximum likelihood leads to very simple formulas: the class means are estimated by the within-class sample means, and the common covariance matrix by the pooled within-class sample covariance. Treating the parameter estimates as the true values, we can use Bayes' theorem to derive a formula for the probability that object A belongs to class i given that its features X_{1}, ..., X_{p} have values x_{1}, ..., x_{p} respectively. After that we classify A as a member of class j if j is the likeliest class.
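As an illustrative sketch (not part of the original exposition), the estimation and classification steps might look as follows in Python with NumPy, assuming the classic LDA model with a covariance matrix shared across classes; the function names and variable names are our own:

```python
import numpy as np

def fit_lda(X, y):
    """Maximum likelihood estimation of LDA parameters.

    X : (N, p) feature matrix; y : (N,) integer class labels.
    Returns class priors, class means, and the pooled covariance matrix.
    """
    classes = np.unique(y)
    # Class priors: the fraction of objects observed in each class.
    priors = np.array([np.mean(y == k) for k in classes])
    # Class means: the within-class sample means.
    means = np.array([X[y == k].mean(axis=0) for k in classes])
    # Pooled within-class covariance (MLE version, divides by N):
    # LDA assumes this matrix is shared by all classes.
    cov = sum((X[y == k] - means[i]).T @ (X[y == k] - means[i])
              for i, k in enumerate(classes)) / len(X)
    return priors, means, cov

def predict_lda(X, priors, means, cov):
    """Assign each row of X to the class with the largest discriminant score.

    The score of class k is x' C^{-1} m_k - 0.5 m_k' C^{-1} m_k + log(prior_k),
    which is linear in x.
    """
    Cinv = np.linalg.inv(cov)
    scores = (X @ Cinv @ means.T
              - 0.5 * np.einsum('kp,pq,kq->k', means, Cinv, means)
              + np.log(priors))
    return np.argmax(scores, axis=1)
```

For example, fitting on two well-separated Gaussian clouds and then calling `predict_lda` on points near each class mean recovers the corresponding class labels.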

It can be shown that this approach implies linear boundaries between the classes. In other words, in the p-dimensional Euclidean space there are hyperplanes that separate the "territory" of one class from the "territories" of the others. If object A has features (X_{1}, ..., X_{p}) which fall in the "territory" of class j, then the probability that A is a member of j is larger than the probability that A is a member of class i, for any other class i, so we classify A as a member of class j. The linearity of the boundaries is why the method is called "linear discriminant analysis".
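To sketch why the boundaries are linear (our notation: \(\Sigma\) is the covariance matrix shared by all classes and \(\pi_k\) is the prior probability of class k), note that because \(\Sigma\) is the same in every class, the quadratic term \(x^{\top}\Sigma^{-1}x\) in the Gaussian log-density cancels when two classes are compared, leaving the discriminant score

\[
\delta_k(x) = x^{\top}\Sigma^{-1} m_k - \tfrac{1}{2}\, m_k^{\top}\Sigma^{-1} m_k + \log \pi_k .
\]

The boundary between classes i and j is the set where \(\delta_i(x) = \delta_j(x)\), i.e.

\[
x^{\top}\Sigma^{-1}(m_i - m_j)
= \tfrac{1}{2}\left( m_i^{\top}\Sigma^{-1} m_i - m_j^{\top}\Sigma^{-1} m_j \right)
+ \log \frac{\pi_j}{\pi_i},
\]

a linear equation in x, so the boundary is a hyperplane.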

LDA is one of the simplest classification techniques, along with logistic regression and k-nearest neighbors. Despite their simplicity, these techniques compete successfully with more elaborate approaches in many situations.

**LINEAR DISCRIMINANT ANALYSIS REFERENCES**

Duda, R. O., Hart, P. E., & Stork, D. G. (2000). Pattern Classification (2nd ed.). New York: Wiley-Interscience.

Hilbe, J. M. (2009). Logistic Regression Models. Boca Raton, FL: Chapman & Hall / CRC Press.

McLachlan, G. J. (2004). Discriminant Analysis and Statistical Pattern Recognition. New York: Wiley Interscience.

Tatsuoka, M. M. (1971). Multivariate Analysis. New York: John Wiley & Sons.

Krzanowski, W. J. (1990). Principles of Multivariate Analysis. Oxford: Oxford University Press.

Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Neural Networks for Signal Processing IX. IEEE.

Venables, W. N., & Ripley, B. D. (2002). Modern Applied Statistics with S (4th ed.). New York: Springer-Verlag.
