GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work
Type
conference contribution
Date Issued
2024-05-11
Research Team
Interaction- and Communication-based Systems (https://interactions.unisg.ch)
Abstract
Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among collaborators, but there is limited research on using non-verbal cues such as gaze or head direction alongside the main communication channel. Our system – GlassBoARd – permits collaborators to see each other’s gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent, shared Augmented Reality interface situated between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar located behind the GlassBoARd, whose eye movements are contingent on the remote collaborator’s real-time eye movements. Over three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd provides an environment for conducting future user experiments on the effect of sharing eye gaze on communication bandwidth.
Language
English (United States)
Keywords
eye tracking
augmented reality
non-verbal cues
remote collaboration
CSCW
gaze
presence
HSG Classification
contribution to scientific community
Refereed
Yes
Book title
Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24)
Publisher
ACM
Publisher place
New York, NY, USA
Pages
8
Event Title
CHI Conference on Human Factors in Computing Systems
Event Location
Honolulu, HI, USA
Event Date
May 11 - 16, 2024
Contact Email Address
kenan.bektas@unisg.ch
File(s)
Open Access
Name
GlassBoARd_CHI2024_LBW-1.pdf
Size
4.77 MB
Format
Adobe PDF
Checksum (MD5)
2326dc4c848f352708d2bb3c57f33339