Zoom: https://illinois.zoom.us/j/88199799900?pwd=I3wRViAILdEd8yj7O5AoFBhnpDXuPa.1
Abstract:
Mixed Reality (MR) systems aim to create seamless, immersive experiences. Achieving this requires understanding how users perceive the experience, how they navigate the environment, and how they interact with it. Unfortunately, current MR systems fail to achieve these fundamental goals, especially when user perception changes over time and users navigate complex environments. At first glance, understanding user perception (an experiential aspect) and tracking user navigation and interaction (a systemic task) appear to be distinct challenges. However, both stem from a shared limitation: today’s MR systems lack the ability to reason about the human-system loop as it unfolds in real time in dynamic environments. I design MR systems that interpret and respond to human perception in real time, aligning with user behavior while remaining robust to the unpredictability of physical spaces. In this talk, I will present two core capabilities essential to achieving this goal: interpreting user experience and inferring the effects of dynamic real-world unpredictability on perception and performance.
First, I address the challenge of sensing and interpreting user experience in real time. I will present a new system-level measurement framework that uses reaction time to assess presence, a central experiential construct in MR that reflects the feeling of “being there” in a virtual environment. My work identified how changes in reaction time, a continuous behavioral signal captured by the system, reflect both scene fidelity and the user’s cognitive state. Grounded in cognitive science theories and validated through human-subject studies, this work establishes a link between subjective user perception and system interactions, offering a method to detect subtle perceptual shifts as they occur, without relying on post-hoc or disruptive measures.
Second, I will present how external physical distractions, such as real-world sounds or objects that require interaction outside the virtual environment, disrupt immersion by imposing cognitive discontinuities. These disruptions fragment the user’s mental model, increase cognitive load, and make it harder to re-engage with the virtual task. I will present a cognitive framework that explains how physical-world disruptions break presence and introduce measurable delays in responsiveness. I will show how such distractions reshape cognitive resource allocation and how these effects can be captured systemically through reaction time. This signal not only tracks perceptual breaks but also reflects user recovery under real-world unpredictability, advancing MR systems toward real-time perceptual alignment and experiential continuity in complex, dynamic environments.
Together, these contributions lay the foundation for MR systems that can sense, interpret, and respond to human experience as it unfolds. I will conclude with my broader vision for the future of human-centered MR: advancing system design, perceptual alignment, and societal impact, and redefining how immersive systems are designed, evaluated, and deployed to center human experience at every stage.
Bio:
Yasra Chandio is a Ph.D. candidate in Electrical and Computer Engineering at the University of Massachusetts Amherst. She is a Human-Centered Computing (HCC) researcher focused on immersive environments. Yasra develops adaptive frameworks that identify, measure, and adjust key system parameters in MR environments, with a focus on creating immersive, responsive, and sustainable systems that promote user well-being and support broader societal goals. Her work spans HCI, ML, and systems, contributing foundational insights into how MR systems can better reflect and adapt to human perception and behavior. Her research achievements have earned her recognition as an MIT EECS Rising Star, a CPS Rising Star, an ML and Systems Rising Star, and a Heidelberg Laureate Forum young researcher. She has received the Best Paper Award at ACM BuildSys, an Honorable Mention for Best Paper at IEEE VR, and Best Presentation at the ACM SenSys PhD Forum, and was a finalist at UMass 3MT 2022 and AIxVR 2024. The BBC and ScienceDaily have featured her work, which has been published in top venues across virtual, augmented, and mixed reality (VR/AR/MR), systems, and ML, including IEEE VR, IEEE TVCG, IROS, NeurIPS, ACM FAccT, AIxVR, ISEMV, SafeAR, DATA, and BuildSys/e-Energy. Committed to mentorship, Yasra received the College of Engineering DEI Award and is a CRA-E Fellow and Grace Hopper Scholar.
Faculty Host: Sarita Adve
Meeting ID: 881 9979 9900 ; Password: csillinois