
Human Activity Recognition Using Self-Supervised Representations of Wearable Data

Abstract

Automated and accurate human activity recognition (HAR) using body-worn sensors enables practical and cost-efficient remote monitoring of Activities of Daily Living (ADL), which have been shown to provide clinical insights across multiple therapeutic areas. Development of accurate HAR algorithms is hindered by the lack of large real-world labeled datasets. Furthermore, algorithms seldom work beyond the specific sensor on which they are prototyped, prompting debate about whether accelerometer-based HAR is even possible [Tong et al., 2020]. Here we develop a 6-class HAR model with strong performance when evaluated on real-world datasets not seen during training. Our model is based on a frozen self-supervised representation learned on a large unlabeled dataset, combined with a shallow multi-layer perceptron with temporal smoothing. The model obtains in-dataset state-of-the-art performance on the Capture24 dataset (κ = 0.86). Out-of-distribution (OOD) performance is κ = 0.7, with both the representation and the perceptron models being trained on data from a different sensor. This work represents a key step towards device-agnostic HAR models, which can contribute to increased standardization of model evaluation in the HAR field.
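The abstract describes the pipeline only at a high level. The sketch below illustrates one plausible reading of it: a frozen pretrained encoder producing per-window embeddings from accelerometer data, a shallow MLP head predicting the 6 activity classes, and a simple moving-average smoother over consecutive window predictions. All dimensions, the stand-in encoder, and the choice of smoother are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class FrozenEncoderHAR(nn.Module):
    """Hypothetical sketch: frozen self-supervised encoder + shallow MLP head."""
    def __init__(self, encoder: nn.Module, feat_dim: int = 1024, n_classes: int = 6):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():   # freeze the pretrained representation
            p.requires_grad = False
        self.head = nn.Sequential(            # shallow MLP classifier on top
            nn.Linear(feat_dim, 256),
            nn.ReLU(),
            nn.Linear(256, n_classes),
        )

    def forward(self, x):                     # x: (batch, channels, time) accelerometer windows
        with torch.no_grad():
            z = self.encoder(x)               # embedding from the frozen encoder
        return self.head(z)                   # per-window class logits


def temporal_smooth(logits: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Smooth per-window logits over time with a moving average (one illustrative choice)."""
    # logits: (n_windows, n_classes) for a contiguous recording
    n, c = logits.shape
    x = logits.T.unsqueeze(0)                 # (1, n_classes, n_windows)
    kernel = torch.ones(c, 1, k) / k
    smoothed = nn.functional.conv1d(x, kernel, padding=k // 2, groups=c)
    return smoothed.squeeze(0).T[:n]


# Example usage with a stand-in encoder (assumed names and sizes, not the paper's model):
dummy_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 300, 1024))  # e.g. 10 s @ 30 Hz, 3 axes
model = FrozenEncoderHAR(dummy_encoder, feat_dim=1024, n_classes=6)
windows = torch.randn(8, 3, 300)              # 8 consecutive windows of raw accelerometer data
smoothed_preds = temporal_smooth(model(windows)).argmax(dim=1)
```

Under this reading, only the MLP head is trained on labeled data, which is consistent with the abstract's claim that the representation can be reused across sensors without fine-tuning.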
