Now showing 1 - 5 of 5
  • Publication
    Everyday Life Challenges and Augmented Realities: Exploring Use Cases For, and User Perspectives on, an Augmented Everyday Life
    (ACM, 2024-04)
    Assistive technologies hold promise in empowering blind and low vision (BLV) people, assisting them in navigating the daily challenges they face. However, there remains limited understanding of the naturalistic everyday challenges experienced by BLV people, as well as how assistive technologies are specifically employed to effectively address these challenges. In this work, we use cultural probes to enhance our understanding of the challenges faced by BLV people and provide a corpus of insights into their everyday life. We asked 10 BLV people to video record their naturalistic everyday life challenges for the duration of one week. These video probes, along with a set of semi-structured interviews, offer important empirical insights into the daily challenges. Despite the availability of assistive technologies and human support, where a BLV person receives assistance from another person, we observed a constant level of uncertainty: the probes revealed that BLV people often question an object's current state, such as the cleanliness of their clothing or the mixing ratio of multiple ingredients when preparing a meal. Furthermore, participants reported experiencing significant delays in identifying critical observations, leading to situations where interventions may come too late to be effective (e.g., when a plant has a disease). By using video probes to provide insights into the daily challenges of BLV people, our corpus of naturalistic everyday life challenges offers a characterization of the daily challenges faced by BLV people and informs future research efforts to co-design, co-develop, and co-evaluate novel assistive technologies that meet BLV people's preferences and their individual daily experiences.
  • Publication
    MRTranslate: Bridging Language Barriers in the Physical World Using a Mixed Reality Point-and-Translate System
    Language barriers pose significant challenges in our increasingly globalized world, hindering effective communication and access to information. Existing translation tools often disrupt the current activity flow and fail to provide seamless user experiences. In this paper, we contribute the design, implementation, and evaluation of MRTranslate, an assistive Mixed Reality (MR) prototype that enables seamless translation of real-world text. We instructed 12 participants to translate items on a food menu using MRTranslate, which we compared to state-of-the-art translation apps, including Google Translate and Google Lens. Findings from our user study reveal that, when utilising a fully functional implementation of MRTranslate, participants succeed in up to 91.67% of their translations whilst also enjoying the visual translation of the unfamiliar text. Although the current translation apps were well perceived, participants particularly appreciated the convenience of not having to grab a smartphone and manually input the text for translation when using MRTranslate. We believe that MRTranslate, along with the empirical insights we have gained, presents a valuable step towards a future where MR transforms language translation and holds the potential to assist individuals in various day-to-day experiences.
  • Publication
    Designing Grocery Shopping Experiences for Virtual Reality
    Online grocery shopping offers time-saving efficiency and convenience, yet many people still prefer physical shopping because they trust the freshness of the food and value other sensory experiences. While online stores are evolving to offer new user experiences, such as supporting eco-friendly or ethical shopping, desktop and mobile platforms limit engagement with such experiences. Virtual Reality (VR) presents an opportunity to create immersive and rich grocery shopping experiences, closing the gap between the convenience of online shopping and the sensory experience of physical shopping. Yet, designing VR grocery stores remains relatively unexplored. In this paper, we investigate the long-term potential of VR grocery stores, focusing on meeting individual needs. Through a co-design workshop, participants brainstormed designs for product displays, in-shop navigation, shopping carts, and social shopping, among other aspects. Based on our findings, we provide design recommendations for future VR grocery shopping to develop meaningful alternatives to existing grocery shopping experiences.
  • Publication
    Exploring Mobile Devices as Haptic Interfaces for Mixed Reality
    (ACM, 2024)
    Carolin Stellmacher; Yannick Weiss; Meagan B. Loerakker; Nadine Wagener
    Dedicated handheld controllers facilitate haptic experiences of virtual objects in mixed reality (MR). However, as mobile MR becomes more prevalent, we observe the emergence of controller-free MR interactions. To retain immersive haptic experiences, we explore the use of mobile devices as a substitute for specialised MR controllers. In an exploratory gesture elicitation study (n = 18), we examined users' (1) intuitive hand gestures performed with prospective mobile devices and (2) preferences for real-time haptic feedback when exploring haptic object properties. Our results reveal three haptic exploration modes for the mobile device, namely as an object, as a hand substitute, or as an additional tool, and emphasise the benefits of incorporating the device's unique physical features into the object interaction. This work expands the design possibilities of using mobile devices for tangible object interaction, guiding the future design of mobile devices for haptic MR experiences.
  • Publication
    Gaze into Fintech: Assessing the Influence of Financial Literacy on Interaction Behaviour Using Eyetracking
    (ACM, 2024-06)
    Andrin Benz
    Financial technology (fintech) has become more accessible to diverse users, including finance practitioners and non-experts. We conducted a usability test with six participants, evaluating a fintech app with eye-tracking technology among finance practitioners and non-finance experts. Our analysis revealed that participants with limited financial knowledge had difficulty understanding financial information and navigating the software, as indicated by a higher revisitation rate on financial labels. We also observed greater cognitive load or uncertainty among non-practitioners, evidenced by discrepancies between where participants looked and where they clicked. These insights aim to inform the design of more accessible fintech applications.
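    As an illustrative aside (not part of the publication itself), a revisitation metric of the kind mentioned above can be approximated from an ordered fixation log by counting how often gaze re-enters an area of interest (AOI) after leaving it. The sketch below is a minimal example under stated assumptions; the (timestamp, AOI) log format and the definition of a "visit" are hypothetical and not the authors' analysis pipeline.

        # Minimal sketch: approximate per-AOI revisitation counts from an ordered
        # fixation log. The (timestamp, aoi_label) format and the notion of a
        # "visit" (consecutive fixations on the same AOI) are assumptions for
        # illustration, not the paper's method.
        from collections import defaultdict

        def revisitations(fixations):
            """fixations: ordered (timestamp, aoi_label) tuples for one participant.
            Returns revisits per AOI, i.e. number of separate visits minus one."""
            visits = defaultdict(int)
            previous = None
            for _, aoi in fixations:
                if aoi is not None and aoi != previous:
                    visits[aoi] += 1  # gaze (re-)enters this AOI: a new visit starts
                previous = aoi
            return {aoi: n - 1 for aoi, n in visits.items()}

        # Example with a hypothetical "interest_rate" label that gaze returns to twice.
        log = [(0.0, "interest_rate"), (0.4, "balance"), (0.9, "interest_rate"),
               (1.3, None), (1.8, "interest_rate")]
        print(revisitations(log))  # {'interest_rate': 2, 'balance': 0}

    Under these assumptions, higher values on financial labels would correspond to the repeated look-ups the study reports for participants with limited financial knowledge.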