[Image: Final presentation — Human 2.0 / LimbX]

Research at: MIT Media Lab.

Role: Usability Studies (I joined after the prototype was developed, serving as a UX researcher; I designed the experiment and analyzed the results.)

Team of 4: two engineers, a consultant, and myself.

Project Overview: We developed an augmented reality (AR)-controlled supernumerary robotic limb (SRL) that uses eye gaze and voice commands for high-level control, aiming to reduce the cognitive load imposed by traditional low-level interfaces. In comparative usability studies of joystick-controlled and AR-controlled SRLs, technical challenges limited our ability to demonstrate improvements in accuracy or efficiency with AR control. We still see significant potential in AR-based systems and hope to refine the prototype and conduct future studies that better compare the two approaches.

Problem

Current supernumerary robotic limbs lack intuitive user control systems

Current control interfaces for supernumerary robotic limbs are often low-level, resulting in high cognitive load and discomfort for the user [1]. This limits the effectiveness and usability of these devices.
