GEAR: Gaze-enabled augmented reality for human activity recognition
Type
conference paper
Date Issued
2023-05-30
Author(s)
Hermann, Jonas
Jenss, Kay Erik
Soler, Marc Elias
Abstract
Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they allow novel interaction methods and provide insights into user attention, intentions, and activities. However, only a few studies have used gaze-enabled AR displays for human activity recognition (HAR). In an experimental study, we collected gaze data from 10 users on a HoloLens 2 (HL2) while they performed three activities (i.e., read, inspect, search). We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) on extracted features and achieved an activity recognition accuracy of up to 98.7%. On the HL2, we provided users with AR feedback relevant to their current activity. We present the components of our system (GEAR), including a novel solution to enable the controlled sharing of collected data. We provide the scripts and anonymized datasets, which can be used as teaching material in graduate courses or for reproducing our findings.
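As a rough illustration of the classification step described in the abstract, the sketch below trains one of the named model families (Extremely Randomized Trees) on a table of pre-extracted gaze features and reports cross-validated accuracy. The file name, column names, and evaluation setup are assumptions for illustration only, not the authors' released scripts or data layout.

# Hypothetical sketch: classify activities (read / inspect / search) from
# pre-extracted gaze features using Extremely Randomized Trees (scikit-learn).
# "gaze_features.csv" and the "activity" column are assumed, not from the paper.
import pandas as pd
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

# One row per time window: label in "activity", remaining columns are
# gaze-derived features (e.g., fixation and saccade statistics).
data = pd.read_csv("gaze_features.csv")
X = data.drop(columns=["activity"])
y = data["activity"]

# Extremely Randomized Trees, one of the three model families named in the abstract.
clf = ExtraTreesClassifier(n_estimators=100, random_state=42)

# Cross-validated accuracy; the paper reports up to 98.7% with its own
# features, models, and evaluation protocol.
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Mean CV accuracy: {scores.mean():.3f}")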
Language
English
Keywords
pervasive eye tracking
augmented reality
attention
context-awareness
human activity recognition
eye tracking
HSG Classification
contribution to scientific community
Publisher
ACM
Publisher place
New York, NY, USA
Pages
9
Event Title
2023 ACM Symposium on Eye Tracking Research & Applications (ETRA'23)
Event Location
Tübingen, Germany
Event Date
May 30 – June 02, 2023
Division(s)
Contact Email Address
kenan.bektas@unisg.ch
Eprints ID
269706
File(s)
open access
Name
PAPER_bektasetal_etra23.pdf
Type
Main Article
Size
2.84 MB
Format
Adobe PDF
Checksum (MD5)
ca300353e0ca95804d5b3e56907ae666
open access
Name
POSTER_bektasetal_etra23.pdf
Size
668.55 KB
Format
Adobe PDF
Checksum (MD5)
202148dec7de1f855adea7ba4eada643