Authors: Bektas, Kenan; Strecker, Jannis Rene; Mayer, Simon; Garcia, Kimberly; Hermann, Jonas; Jenss, Kay Erik; Antille, Yasmine Sheila; Soler, Marc Elias
Dates: 2023-04-13; 2023-04-13; 2023-05-30
URI: https://www.alexandria.unisg.ch/handle/20.500.14171/107618
DOI: https://doi.org/10.1145/3588015.3588402

Abstract: Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they allow novel interaction methods and provide insights into user attention, intentions, and activities. However, only a few studies have used gaze-enabled AR displays for human activity recognition (HAR). In an experimental study, we collected gaze data from 10 users on a HoloLens 2 (HL2) while they performed three activities (i.e., read, inspect, search). We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) with extracted features and achieved an activity-recognition accuracy of up to 98.7%. On the HL2, we provided users with AR feedback that is relevant to their current activity. We present the components of our system (GEAR), including a novel solution to enable the controlled sharing of collected data. We provide the scripts and anonymized datasets, which can be used as teaching material in graduate courses or for reproducing our findings.

Language: en
Keywords: pervasive eye tracking; augmented reality; attention; context-awareness; human activity recognition; eye tracking
Title: GEAR: Gaze-enabled augmented reality for human activity recognition
Type: conference paper
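The abstract describes training classifiers (SVM, Random Forest, Extremely Randomized Trees) on features extracted from gaze data to distinguish three activities (read, inspect, search). A minimal sketch of that pipeline using scikit-learn is shown below; the feature names and the synthetic data are invented placeholders, not the paper's actual features or dataset, and the reported 98.7% accuracy applies only to the authors' real data.

```python
# Illustrative sketch only: classify activity windows from gaze features
# with an Extremely Randomized Trees model, one of the three classifier
# families named in the abstract. Features and data here are hypothetical.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 300

# Hypothetical per-window gaze features (e.g., fixation duration, saccade
# amplitude, saccade velocity, blink rate) -- placeholders, not the paper's
# exact feature set.
X = rng.normal(size=(n_windows, 4))

# Activity labels: 0 = read, 1 = inspect, 2 = search.
y = rng.integers(0, 3, size=n_windows)

clf = ExtraTreesClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean 5-fold CV accuracy: {scores.mean():.3f}")
```

With real gaze features the same cross-validation loop would report per-fold accuracy; on the random placeholder data above the score hovers near chance (about 1/3 for three classes).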