Can You See It?
- 14 September 2021
- journal article
- research article
- Published by Association for Computing Machinery (ACM) in GetMobile: Mobile Computing and Communications
- Vol. 25 (2), 38-42
- https://doi.org/10.1145/3486880.3486891
Abstract
Today's smartphones and wearable devices come equipped with an array of inertial sensors, along with IMU-based Human Activity Recognition (HAR) models to monitor everyday activities. However, such models rely on large amounts of annotated training data, which take considerable time and effort to collect: one has to recruit human subjects, define clear protocols for them to follow, and manually annotate the recorded data, on top of the administrative work of organizing such a recording.
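To make the "IMU-based HAR model" concrete, below is a hypothetical minimal sketch of the standard pipeline the abstract alludes to: segment a 3-axis accelerometer stream into fixed-length windows, extract simple statistical features per window, and classify with a nearest-centroid rule. All names, window sizes, and the synthetic data are illustrative assumptions, not taken from the article.

```python
import numpy as np

def extract_features(window):
    """Per-axis mean and standard deviation of one (size, 3) IMU window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def make_windows(signal, size=50, step=25):
    """Slide a fixed-size window over an (N, 3) accelerometer stream."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

rng = np.random.default_rng(0)
# Synthetic stand-in for the annotated data the abstract says is costly to
# collect: a low-variance "still" stream vs. a high-variance "walking" stream.
still = rng.normal(0.0, 0.05, (500, 3))
walking = rng.normal(0.0, 1.0, (500, 3))

X = np.array([extract_features(w) for w in make_windows(still) + make_windows(walking)])
y = np.array([0] * len(make_windows(still)) + [1] * len(make_windows(walking)))

# Nearest-centroid classifier: one feature centroid per activity label.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(window):
    """Assign a new window to the activity with the closest feature centroid."""
    f = extract_features(window)
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))
```

The labels in `y` are exactly the kind of annotation the paper is concerned with: every training window needs a ground-truth activity, which in real deployments comes from the costly human-subject recordings described above.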