CAS in Teaching Basics of Statistical Learning

Aleksandr Mylläri
University of Turku, Finland
and
Tatiana Mylläri
Åbo Akademi University, Turku/Åbo, Finland

Modern computer algebra systems not only make analytic and numeric calculations easy, but also offer good visualization facilities. Visual demonstrations provide a convenient way to show how algorithms work and help students understand them. We consider the problem of binary classification. Support Vector Machines (SVMs) are attractive from the educational point of view because they can be introduced gradually, from the simple perceptron to increasingly advanced classifiers. We start with Rosenblatt's perceptron, a simple binary classifier for linearly separable data, and generalize it to the maximum margin classifier; we then introduce the kernel trick, which extends the approach to nonlinearly separable data, and finally allow misclassifications at the training stage. We model the work of Rosenblatt's perceptron and of simple SVMs using Mathematica 6 and Maple 12. The constructed models are used in introductory courses on SVMs and Statistical Learning.
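
As a minimal illustration of the starting point of this progression, the sketch below implements Rosenblatt's perceptron update rule on a small linearly separable example. It is written in Python for self-containedness; the toy data, learning rate, and variable names are illustrative assumptions, not the Mathematica 6 / Maple 12 models referred to above.

```python
import numpy as np

# Rosenblatt's perceptron: a minimal sketch for linearly separable 2D data.
# The toy data set and learning rate below are illustrative choices.
X = np.array([[2.0, 3.0], [1.0, 2.0], [3.0, 3.5],         # class +1
              [-1.0, -2.0], [-2.0, -1.0], [-1.5, -2.5]])  # class -1
y = np.array([1, 1, 1, -1, -1, -1])

w = np.zeros(2)   # weight vector
b = 0.0           # bias
eta = 1.0         # learning rate

# Cycle through the training set; update w and b on every misclassified point.
# For linearly separable data this converges in a finite number of steps.
for epoch in range(100):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (np.dot(w, xi) + b) <= 0:   # wrong side of (or on) the boundary
            w += eta * yi * xi
            b += eta * yi
            errors += 1
    if errors == 0:                         # all points classified correctly
        break

print("separating hyperplane: w =", w, ", b =", b)
```

From here, the maximum margin classifier replaces this greedy update with an optimization that maximizes the distance of the separating hyperplane from the nearest training points, and the kernel trick replaces the inner products with kernel evaluations, which is the path toward the more advanced classifiers discussed above.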