Pattern Recognition Primer II


This Demonstration introduces quadratic classifiers. The concept is best understood by looking at two-dimensional examples in which patterns are represented as points and the classifier is represented as a conic section (hyperbola, parabola, or ellipse).

Contributed by: Shashi Sathyanarayana (March 2011)
Open content licensed under CC BY-NC-SA


Details

This Demonstration extends the basic pattern recognition concepts introduced in Pattern Recognition Primer. Quadratic classifiers are to linear classifiers what quadratic equations are to linear equations: a step up in complexity and in the range of phenomena they can model. This Demonstration illustrates a simple supervised pattern recognition problem requiring the separation of two classes represented by the red and blue dots, the positives and negatives, respectively. Training consists of using the given data to find the quadratic curve (i.e., the parabola, ellipse, or hyperbola) that best separates the data. The easy-to-read source code implements formulas to derive the classifiers and evaluate the results.
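As an illustrative sketch (not necessarily the method used in this Demonstration's source code), one way to obtain such a quadratic classifier is to map each point (x, y) to the quadratic feature vector {x^2, y^2, x y, x, y, 1} and fit weights by least squares against labels +1 and -1; the zero set of the resulting decision function is a conic section. The data, distributions, and names below (phi, w) are hypothetical, chosen only to make the sketch self-contained.

(* hypothetical example data: positives (red, label +1) and negatives (blue, label -1) *)
pos = RandomVariate[MultinormalDistribution[{1, 1}, 0.3 IdentityMatrix[2]], 50];
neg = RandomVariate[MultinormalDistribution[{-1, -1}, 0.3 IdentityMatrix[2]], 50];

(* map a 2D point to the quadratic feature vector {x^2, y^2, x y, x, y, 1} *)
phi[{x_, y_}] := {x^2, y^2, x y, x, y, 1};

(* least-squares weights separating labels +1 and -1 *)
features = phi /@ Join[pos, neg];
labels = Join[ConstantArray[1., Length[pos]], ConstantArray[-1., Length[neg]]];
w = LeastSquares[features, labels];

(* the decision boundary w.phi[{x, y}] == 0 is a conic section *)
ContourPlot[w . phi[{x, y}] == 0, {x, -3, 3}, {y, -3, 3},
  Epilog -> {Red, Point[pos], Blue, Point[neg]}]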

Use the buttons to choose different datasets and observe the performance of the quadratic classifier in separating the two classes. The graph on the left shows the quadratic classifier (thick quadratic curve) and the linear classifier (dashed line) overlaid on the positive (red) and negative (blue) data points. The graph on the right displays the ROC curves and areas under the curve (AUC) for both the linear and quadratic classifiers. The red dots shown on the ROC curves mark the values of 1-specificity and sensitivity associated with the particular classifiers shown. The better the classifiers are, the closer the red dots get to the top-left corner and the closer the AUC values get to 1. While the quadratic classifier is shown to outperform the linear classifier on several datasets, it is good practice to employ the simpler classifier, in this case the linear one, if it does a satisfactory job of separating the classes.
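The ROC curve and AUC can be computed directly from the classifier's scores on the positive and negative examples. The sketch below is again only illustrative and self-contained: the scores are generated synthetically here, whereas in the Demonstration they would come from evaluating the classifier's decision function on the data. The AUC is estimated as the probability that a randomly chosen positive scores higher than a randomly chosen negative.

(* hypothetical classifier scores; in the Demonstration these would be w.phi[pt] for each point *)
posScores = RandomVariate[NormalDistribution[1, 1], 200];
negScores = RandomVariate[NormalDistribution[-1, 1], 200];

(* 1 - specificity and sensitivity at a given score threshold t *)
roc[t_] := {N[Count[negScores, s_ /; s > t]/Length[negScores]],
            N[Count[posScores, s_ /; s > t]/Length[posScores]]};

(* sweep thresholds over the observed score range, from highest to lowest *)
thresholds = Sort[Join[posScores, negScores], Greater];
rocPoints = Join[{{0., 0.}}, roc /@ thresholds, {{1., 1.}}];

(* AUC as the probability that a random positive outscores a random negative *)
auc = N[Mean[Boole[#[[1]] > #[[2]]] & /@ Tuples[{posScores, negScores}]]];

ListLinePlot[rocPoints, AxesLabel -> {"1 - specificity", "sensitivity"},
  PlotLabel -> Row[{"AUC = ", auc}]]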

The use of quadratic classifiers has been investigated in several interesting applications, including harnessing willpower to press buttons, that is, recognizing intent from "brain waves" (EEG) and translating it into commands for a computer [1], and detecting the singing voice in recorded music for applications such as karaoke [2].

This Demonstration has introduced supervised pattern recognition. For further information, see the references below.

[1] Z. A. Keirn and J. I. Aunon, "Man-Machine Communications through Brain-Wave Processing," IEEE Engineering in Medicine and Biology Magazine, 9(1), 1990 pp. 55–57.

[2] M. A. Bartsch and G. H. Wakefield, "Singing Voice Identification Using Spectral Envelope Estimation," IEEE Transactions on Speech and Audio Processing, 12(2), 2004 pp. 100–109.

[3] A. K. Jain, R. P. W. Duin, and J. Mao, "Statistical Pattern Recognition: A Review," IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(1), 2000 pp. 4–37.

[4] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, New York: Wiley, 2000.


