
Time Series Classification by Class-Based Mahalanobis Distances

Abstract

To classify time series by nearest neighbors, we need to specify or learn one or several distances. We consider variations of the Mahalanobis distance, which relies on the inverse covariance matrix of the data. Unfortunately, for time series data, the covariance matrix often has low rank. To alleviate this problem we can either use a pseudoinverse, covariance shrinkage, or limit the matrix to its diagonal. We review these alternatives and benchmark them against competitive methods such as the related Large Margin Nearest Neighbor Classification (LMNN) and the Dynamic Time Warping (DTW) distance. As we expected, we find that DTW is superior, but the Mahalanobis distances are computationally inexpensive in comparison. To get the best results with Mahalanobis distances, we recommend learning one distance per class using either covariance shrinkage or the diagonal approach.
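The three remedies for a rank-deficient covariance matrix mentioned above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation; the toy data and the shrinkage intensity `lam` are assumptions.

```python
import numpy as np

def mahalanobis_dist(x, y, inv_cov):
    # Mahalanobis distance between vectors x and y given a (pseudo)inverse covariance
    d = x - y
    return float(np.sqrt(d @ inv_cov @ d))

# Toy per-class data: 10 series of length 5 (illustrative only); with few
# samples relative to the series length, the covariance matrix may be singular
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))
cov = np.cov(X, rowvar=False)

# Three ways to obtain a usable inverse when cov may be singular:
inv_pinv = np.linalg.pinv(cov)                    # 1. Moore-Penrose pseudoinverse
lam = 0.1                                         # shrinkage intensity (assumed value)
inv_shrunk = np.linalg.inv((1 - lam) * cov + lam * np.eye(5))  # 2. shrink toward identity
inv_diag = np.diag(1.0 / np.diag(cov))            # 3. keep only the diagonal

x, y = X[0], X[1]
for inv in (inv_pinv, inv_shrunk, inv_diag):
    print(mahalanobis_dist(x, y, inv))
```

In a class-based scheme, one such inverse covariance would be estimated per class, and a test series assigned to the class of its nearest neighbor under that class's distance.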
