Handheld Kinesthetic Devices and Reinforcement Learning for Haptic Guidance

Speaker

Julie Walker
Stanford University

Host

Alberto Rodriguez
Abstract: Screens, headphones, and now virtual and augmented reality headsets can provide people with instructions and guidance through visual and auditory feedback. Yet those senses are often overloaded, motivating the display of information through the sense of touch. Haptic devices, which display forces, vibrations, or other touch cues to a user's hands or body, can be private, intuitive, and leave the other senses free. In this talk, I will discuss several novel handheld haptic devices that provide clear, directional, kinesthetic cues while allowing the user to move through a large workspace. Using these devices, we study the anisotropies and variability in human touch perception and movement. With modeling and reinforcement learning techniques, these devices can adapt to a user's responses and provide effective guidance and intuitive touch interactions. They have applications in medical guidance and training, navigation, sports, and entertainment. Such holdable devices could enable haptic interfaces to become as prevalent and impactful in our daily lives as visual or audio interfaces.

Bio: Julie Walker is a Ph.D. candidate in Mechanical Engineering at Stanford University. She is a member of the Collaborative Haptics and Robotics in Medicine Lab, led by Professor Allison Okamura. She received a master's degree from Stanford University and a bachelor's degree in Mechanical Engineering from Rice University. She has worked in haptics and human-robot interaction research since 2012, studying haptic feedback for prosthetic hands, robotic surgery, and teleoperation. Her Ph.D. thesis work focuses on haptic guidance through novel handheld devices, particularly for medical applications. She has received an NSF Graduate Research Fellowship and a Chateaubriand Fellowship.