Over the last two decades, increasingly complex methods have been developed to identify human activities from various types of sensor data, e.g., motion capture, accelerometer, and gyroscope readings. To date, most research has focused on identifying simple human activities such as walking, eating, and running. However, many daily life activities are considerably more complex than these. To instigate research in complex activity recognition, the "Nurse Care Activity Recognition Challenge" [1] was initiated, in which six nurse activities are to be identified from location, air pressure, motion capture, and accelerometer data. Our team, "IITDU", investigates the use of simple methods for this purpose. We first extract features from the sensor data and then use one of the simplest classifiers, K-Nearest Neighbors (KNN). Experiments using an ensemble of KNN classifiers demonstrate that approximately 87% accuracy can be achieved on 10-fold cross-validation and 66% accuracy on leave-one-subject-out cross-validation.
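The following is a minimal sketch of the pipeline the abstract describes (feature extraction, a KNN-based ensemble, and the two evaluation protocols), not the authors' code. The data, feature vectors, and the bagging scheme for combining KNN classifiers are assumptions for illustration; the paper does not specify how the ensemble is formed.

```python
# Sketch: ensemble of KNN classifiers evaluated with 10-fold and
# leave-one-subject-out cross-validation. Placeholder data stands in for
# features extracted from the challenge's multimodal sensor streams.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score, KFold, LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 24))           # placeholder feature vectors per activity segment
y = rng.integers(0, 6, size=600)         # six nurse activity labels
subjects = rng.integers(0, 8, size=600)  # subject id for each segment

# One plausible way to build an ensemble of KNN base learners (assumption).
model = BaggingClassifier(KNeighborsClassifier(n_neighbors=5), n_estimators=10)

# 10-fold cross-validation over all segments.
kfold_acc = cross_val_score(
    model, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=0)
)
# Leave-one-subject-out: each subject's segments are held out in turn.
loso_acc = cross_val_score(
    model, X, y, groups=subjects, cv=LeaveOneGroupOut()
)

print(f"10-fold CV accuracy: {kfold_acc.mean():.3f}")
print(f"Leave-one-subject-out accuracy: {loso_acc.mean():.3f}")
```

With real features in place of the random placeholders, the two scores correspond to the 10-fold and leave-one-subject-out results reported in the abstract.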


Datasets


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Multimodal Activity Recognition | Nurse Care Activity Recognition Challenge | KNN | Accuracy | 80.2% | #1 |
