HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human-Computer Interaction (HCI). AI-generated from recent publications in the field, each episode offers an in-depth discussion of topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you’re a researcher, a practitioner, or simply curious about the intersection of technology and the human senses, this podcast delivers thought-provoking insights and ideas to keep you at the forefront of HCI.
Episodes

33 minutes ago
Electro-tactile interfaces—which deliver tactile sensations through electrical stimulation of skin nerves—offer unique advantages like fast response times, thin flexible form factors, and the ability to simulate textures, softness, and even coldness. But designing them has been notoriously difficult, requiring deep electronics expertise and custom hardware. eTactileKit changes that. This open-source toolkit provides end-to-end support: modular hardware that scales from 8 to 128+ electrodes, design tools for creating 2D and 3D electrode layouts, a Processing-based pattern creator with visual simulation, a GUI for real-time testing and calibration, and APIs for Python and Unity. A three-week study with both novice and experienced designers showed the toolkit significantly lowered the barrier to entry while improving design workflows—enabling rapid prototyping of applications from VR haptic buttons to 3D-printed interactive toys.
Praneeth Bimsara Perera, Ravindu Madhushan Pushpakumara, Hiroyuki Kajimoto, Arata Jingu, Jürgen Steimle, and Anusha Withana. 2025. eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25). Association for Computing Machinery, New York, NY, USA, 17 pages. https://doi.org/10.1145/3746059.3747796
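To make the pattern-driving idea concrete, here is a minimal sketch in plain Python. It deliberately does not reproduce eTactileKit's actual API (those names aren't in the summary); it only illustrates how a continuous 2D intensity pattern can be mapped onto an electrode grid, with each electrode's pulse width scaled by the pattern value at its position, a common way electro-tactile drivers encode perceived intensity. The grid size and pulse-width range below are assumptions for illustration.

```python
import math

# Assumed 64-electrode layout (the toolkit scales from 8 to 128+ electrodes)
GRID_W, GRID_H = 8, 8
# Assumed safe stimulation pulse-width range in microseconds (illustrative)
PULSE_US_MIN, PULSE_US_MAX = 20, 200

def pattern_to_pulse_widths(pattern):
    """pattern: GRID_H x GRID_W floats in [0, 1] -> per-electrode pulse widths (us)."""
    return [
        [int(PULSE_US_MIN + v * (PULSE_US_MAX - PULSE_US_MIN)) for v in row]
        for row in pattern
    ]

def gaussian_blob(cx, cy, sigma=1.2):
    """A single 'virtual contact point' that a driver could sweep to simulate motion."""
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
             for x in range(GRID_W)] for y in range(GRID_H)]

# Render one blob centered on the grid
pulse_map = pattern_to_pulse_widths(gaussian_blob(3.5, 3.5))
```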

3 days ago
What happens when you watch yourself perform in VR and then have to critique that performance? This study explored self-reflection in motor skill learning using a karate training task. Participants were embodied as a "trainer" avatar and asked to give verbal feedback on a "trainee" avatar that bore either their own 3D-scanned appearance or a stranger's, and that performed either their own recorded movements or an expert's. The results revealed a psychological "role conflict": participants felt split between being the evaluator and being evaluated. Seeing their own appearance triggered deeper, more emotional reflection, while recognizing their own movements created a bodily connection even in a stranger's avatar. The findings suggest VR embodiment isn't binary but multi-faceted, with implications for training and therapy.
Dennis Dietz, Samuel Benjamin Rogers, Julian Rasch, Sophia Sakel, Nadine Wagener, Andreas Martin Butz, and Matthias Hoppe. 2025. The 2×2 of Being Me and You: How the Combination of Self and Other Avatars and Movements Alters How We Reflect on Ourselves in VR. In Proceedings of the 31st ACM Symposium on Virtual Reality Software and Technology (VRST '25). Association for Computing Machinery, New York, NY, USA, 11 pages. https://doi.org/10.1145/3756884.3765986

6 days ago
Can fake heartbeat sounds trick your body into relaxing? This system generates pseudo-heartbeat audio at rates slightly slower than your actual heart rate to induce calm and support meditation. Using a contactless radar sensor to detect real heartbeats, it creates slower auditory feedback (10-30% below actual BPM). Tested with 120 participants at SIGGRAPH Asia 2024, results showed that hearing slower heartbeats made people feel their heart rate was decreasing—even when they knew the sounds were manipulated. The findings suggest potential for non-pharmacological treatment of insomnia through enhanced interoceptive awareness.
Akari Shimabukuro, Seioh Ezaki, and Keiichi Zempo. 2025. Meditation Support System Utilizing Pseudo-Heartbeat Auditory Feedback to Enhance Cardiac Interoceptive Awareness. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 4 pages. https://doi.org/10.1145/3745900.3746096
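A minimal sketch of the rate mapping described above, assuming a simple fixed slowdown within the paper's 10-30% range; the radar-sensing pipeline is stubbed out:

```python
def pseudo_heartbeat_interval(measured_bpm: float, slowdown: float = 0.2) -> float:
    """Seconds between pseudo-heartbeat sounds, given the measured heart rate.
    slowdown of 0.10-0.30 corresponds to the 10-30% range in the summary."""
    target_bpm = measured_bpm * (1.0 - slowdown)
    return 60.0 / target_bpm

# A resting rate of 70 BPM with a 20% slowdown plays back at 56 BPM,
# i.e. one heartbeat sound roughly every 1.07 seconds.
print(pseudo_heartbeat_interval(70))  # ~1.071
```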

Tuesday Dec 30, 2025
Learning to identify musical pitch intervals usually requires tedious rote practice. Purrfect Pitch offers a new approach: a wearable haptic vest that translates sound into touch. When users hear two musical notes, they simultaneously feel vibrations at corresponding vertical positions on their back—leveraging our natural "high/low" pitch metaphor. In a study with 18 participants, those using the audio-haptic system identified intervals 20% more accurately and 1.67 seconds faster than audio-only learners. However, the performance boost didn't persist after removing the haptic feedback, suggesting the vest enhances task performance but doesn't accelerate long-term skill acquisition.
Sam Chin, Cathy Mengying Fang, Nikhil Singh, Ibrahim Ibrahim, Joe Paradiso, and Pattie Maes. 2025. Purrfect Pitch: Exploring Pitch Interval Learning through an Audio-Haptic Interface. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 12 pages. https://doi.org/10.1145/3745900.3746079
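A rough sketch of the "higher pitch, higher on the back" mapping follows. The row count, pitch range, and linear-in-semitones mapping are assumptions, since the summary doesn't specify the vest's exact layout:

```python
N_ROWS = 8                    # assumed number of vertical actuator rows on the back
LOW_MIDI, HIGH_MIDI = 48, 84  # assumed pitch range: C3..C6

def note_to_row(midi_note: int) -> int:
    """Map a MIDI note to an actuator row: 0 = bottom of back, N_ROWS-1 = top."""
    frac = (midi_note - LOW_MIDI) / (HIGH_MIDI - LOW_MIDI)
    return max(0, min(N_ROWS - 1, round(frac * (N_ROWS - 1))))

# An interval is felt as two simultaneous vibration rows:
root, third = 60, 64  # C4 and E4, a major third
print(note_to_row(root), note_to_row(third))  # rows 2 and 3 with these constants
```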

Saturday Dec 27, 2025
Processing high-resolution video with AI requires massive computational resources. GazeLLM offers an elegant solution inspired by human vision: use eye-tracking to focus only on what matters. By cropping first-person video to a small region around the user's gaze point, the system reduces pixel input to just one-tenth while achieving task comprehension equal to or better than full-resolution video. User evaluations across six real-world activities, including cooking, bike repair, first aid, and sports, showed that gaze-focused video produces higher-quality task descriptions than both full videos and center-cropped alternatives.
Jun Rekimoto. 2025. GazeLLM: Multimodal LLMs incorporating Human Visual Attention. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 10 pages. https://doi.org/10.1145/3745900.3746075
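The one-tenth pixel budget pins down the crop geometry: keeping 10% of the pixels means each side of the crop window is sqrt(0.1), about 32%, of the corresponding frame dimension. A minimal sketch, with clamping so the window stays inside the frame:

```python
import math

def gaze_crop_box(frame_w, frame_h, gaze_x, gaze_y, pixel_fraction=0.1):
    """Return (left, top, right, bottom) of a crop centered on the gaze point."""
    scale = math.sqrt(pixel_fraction)  # per-axis side-length fraction
    crop_w, crop_h = int(frame_w * scale), int(frame_h * scale)
    left = min(max(gaze_x - crop_w // 2, 0), frame_w - crop_w)
    top = min(max(gaze_y - crop_h // 2, 0), frame_h - crop_h)
    return left, top, left + crop_w, top + crop_h

# A 1920x1080 frame shrinks to ~607x341 (~10% of the pixels) around the gaze:
print(gaze_crop_box(1920, 1080, gaze_x=900, gaze_y=400))
```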

Saturday Dec 13, 2025
Traditional quadrotor drones pose safety concerns with their spinning blades. Cuddle-Fish takes a different approach: a helium-filled soft robot with bio-inspired flapping wings that's safe enough to touch, hug, and interact with physically. In testing with 24 participants, people spontaneously engaged in affective behaviors like patting, stroking, and even hugging the robot. Users reported positive emotional responses and felt safe during interactions, with some participants touching the robot to their cheeks, demonstrating trust and comfort.
Mingyang Xu, Jiayi Shao, Yulan Ju, Ximing Shen, Qingyuan Gao, Weijen Chen, Qing Zhang, Yun Suen Pai, Giulia Barbareschi, Matthias Hoppe, Kouta Minamizawa, and Kai Kunze. 2025. Cuddle-Fish: Exploring a Soft Floating Robot with Flapping Wings for Physical Interactions. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 14 pages. https://doi.org/10.1145/3745900.3746080

Wednesday Oct 01, 2025
Our bodies experience a wide variety of kinesthetic forces as we go about our daily lives, including the weight of held objects, contact with surfaces, gravitational loads, and acceleration and centripetal forces while driving, to name just a few. These forces are crucial to realism, yet simply cannot be rendered with today’s consumer haptic suits, which primarily rely on arrays of vibration actuators built into vests. Rigid exoskeletons have more kinesthetic capability to apply forces directly to users’ joints, but are generally cumbersome to wear and cost many thousands of dollars. In this work, we present Kinethreads: a new full-body haptic exosuit design built around string-based motor-pulley mechanisms, which keeps our suit lightweight (<5kg), soft and flexible, quick-to-wear (<30 seconds), comparatively low-cost (~$400), and yet capable of rendering expressive, distributed, and forceful (up to 120N) effects. We detail our system design, implementation, and results from a multi-part performance evaluation and user study.
Vivian Shen and Chris Harrison. 2025. Kinethreads: Soft Full-Body Haptic Exosuit using Low-Cost Motor-Pulley Mechanisms. In Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25). Association for Computing Machinery, New York, NY, USA, Article 1, 1–16. https://doi.org/10.1145/3746059.3747755
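For intuition about why motor-pulley mechanisms can be both cheap and forceful, here is a back-of-the-envelope sketch relating string tension to motor current: the motor torque is Kt * I, and the tension a string exerts over a pulley is that torque divided by the pulley radius. The constants below are illustrative assumptions, not values from the paper:

```python
PULLEY_RADIUS_M = 0.01  # assumed 1 cm pulley
KT_NM_PER_A = 0.05      # assumed motor torque constant (N*m per amp)
I_MAX_A = 24.0          # assumed motor current limit

def current_for_force(force_n: float) -> float:
    """Motor current (A) needed to render a given string tension (N)."""
    torque = force_n * PULLEY_RADIUS_M  # N*m required at the pulley
    return min(torque / KT_NM_PER_A, I_MAX_A)

# Rendering the suit's peak 120 N with these assumed constants:
print(current_for_force(120.0))  # 24.0 A, right at the assumed current limit
```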

Wednesday Aug 20, 2025
A core use case for Virtual Reality applications is recreating real-life scenarios for training or entertainment. Promoting physiological responses in VR users that match those of real-life spectators can maximize engagement and contribute to greater co-presence. Current research focuses on visualizations and measurements of physiological data to ensure experience accuracy. However, placebo effects are known to influence performance and self-perception in HCI studies, creating a need to investigate the effect of visualizing different types of data (real, unmatched, and fake) on user perception during event recreation in VR. We investigate these conditions through a balanced between-groups study (n=44) of uninformed and informed participants. The informed group was told that the data visualizations represented previously recorded human physiological data. Our findings reveal a placebo effect: the informed group demonstrated enhanced engagement and co-presence. Additionally, the fake data condition in the informed group evoked a positive emotional response.
Xiaru Meng, Yulan Ju, Christopher Changmok Kim, Yan He, Giulia Barbareschi, Kouta Minamizawa, Kai Kunze, and Matthias Hoppe. 2025. A Placebo Concert: The Placebo Effect for Visualization of Physiological Audience Data during Experience Recreation in Virtual Reality. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25). Association for Computing Machinery, New York, NY, USA, Article 807, 1–16. https://doi.org/10.1145/3706598.3713594

Friday Aug 08, 2025
Perceiving and altering the sensation of internal physiological states, such as heartbeats, is key for biofeedback and interoception. Yet, wearable devices used for this purpose can feel intrusive and typically fail to deliver stimuli aligned with the heart’s location in the chest. To address this, we introduce Heartbeat Resonance, which uses low-frequency sound waves to create non-contact haptic sensations in the chest cavity, mimicking heartbeats. We conduct two experiments to evaluate the system’s effectiveness. The first experiment shows that the system created realistic heartbeat sensations in the chest, with 78.05 Hz being the most effective frequency. In the second experiment, we evaluate the effects of entrainment by simulating faster and slower heart rates. Participants perceived the intended changes and reported high confidence in their perceptions for +15% and -30% heart rates. This system offers a non-intrusive solution for biofeedback while creating new possibilities for immersive VR environments.
Waseem Hassan, Liyue Da, Sonia Elizondo, and Kasper Hornbæk. 2025. Heartbeat Resonance: Inducing Non-contact Heartbeat Sensations in the Chest. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25). Association for Computing Machinery, New York, NY, USA, Article 913, 1–22. https://doi.org/10.1145/3706598.3713959
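A minimal sketch of the stimulus as described: short bursts of a 78.05 Hz sine wave repeated at a shifted heart rate (here the paper's -30% condition). The burst length and envelope are assumptions, not values from the paper:

```python
import numpy as np

FS = 48_000         # audio sample rate (Hz)
CARRIER_HZ = 78.05  # most effective frequency found in the first experiment

def heartbeat_signal(measured_bpm, rate_shift=-0.30, burst_s=0.15, total_s=10.0):
    """Low-frequency sine bursts repeated at the shifted heart rate."""
    bpm = measured_bpm * (1.0 + rate_shift)
    period = 60.0 / bpm  # seconds between bursts
    t = np.arange(int(FS * burst_s)) / FS
    # Hann-windowed burst to avoid clicks at burst edges (assumed envelope)
    burst = np.sin(2 * np.pi * CARRIER_HZ * t) * np.hanning(t.size)
    out = np.zeros(int(FS * total_s))
    for start in np.arange(0, total_s - burst_s, period):
        i = int(start * FS)
        out[i:i + burst.size] += burst
    return out

signal = heartbeat_signal(measured_bpm=70)  # 70 BPM slowed to 49 BPM
```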

Friday Aug 01, 2025
To enhance focused eating and dining socialization, previous Human-Food Interaction research has indicated that external devices can support these dining objectives and immersion. However, methods that focus on the food and the diners themselves have remained underdeveloped. In this study, we integrated biofeedback with food, using diners’ heart rates to drive the food’s appearance and thereby promote focused eating and dining socialization. Using LED lights, we dynamically displayed diners’ real-time physiological signals through the transparency of the food. Results revealed significant effects on various aspects of dining immersion, such as awareness perceptions, attractiveness, attentiveness to each bite, and emotional bonds with the food. Furthermore, to promote dining socialization, we established a “Sharing Bio-Sync Food” dining system to strengthen emotional connections between diners. Based on these findings, we developed tableware that integrates biofeedback into the culinary experience.
Weijen Chen, Qingyuan Gao, Zheng Hu, Kouta Minamizawa, and Yun Suen Pai. 2025. Living Bento: Heartbeat-Driven Noodles for Enriched Dining Dynamics. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25). Association for Computing Machinery, New York, NY, USA, Article 353, 1–18. https://doi.org/10.1145/3706598.3713108
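As a sketch of the biofeedback loop implied above, the snippet below pulses a light in time with a measured heart rate; read_bpm and set_led are hypothetical stand-ins for the heart-rate sensor and the LED driver under the food, which the paper's summary does not specify at this level:

```python
import math
import time

def led_brightness(phase: float) -> float:
    """Smooth 0..1 pulse, peaking once per heartbeat cycle."""
    return 0.5 * (1 + math.cos(2 * math.pi * phase))

def run_biofeedback(read_bpm, set_led, duration_s=60.0, fps=50):
    """Advance the pulse phase by the live heart rate and update the LED."""
    start = time.time()
    phase = 0.0
    while time.time() - start < duration_s:
        bpm = read_bpm()                   # e.g. from a wearable HR sensor
        phase = (phase + bpm / 60.0 / fps) % 1.0
        set_led(led_brightness(phase))     # drive the LED under the translucent food
        time.sleep(1.0 / fps)

# Example with stand-ins for the sensor and LED driver:
run_biofeedback(read_bpm=lambda: 72.0, set_led=lambda b: None, duration_s=1.0)
```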