Authors: Kenan Bektas, Adrian Pandjaitan, Jannis Rene Strecker, Simon Mayer
Dates: 2024-03-26, 2024-03-26, 2024-05-11
Handle: https://www.alexandria.unisg.ch/handle/20.500.14171/119765
DOI: https://doi.org/10.1145/3613905.3650965
Abstract: Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among collaborators, whereas there is limited research on conveying non-verbal cues such as gaze or head direction alongside the main communication channel. Our system, GlassBoARd, permits collaborators to see each other's gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent, shared Augmented Reality interface situated between two users, enabling face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar located behind the GlassBoARd, whose eye movements reflect the remote collaborator's eye movements in real time. Over three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd provides a suitable environment for future user experiments studying the effect of sharing eye gaze on communication bandwidth.
Language: en-US
Keywords: eye tracking, augmented reality, non-verbal cues, remote collaboration, CSCW, gaze, presence
Title: GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work
Type: conference contribution