
 

The In-the-Wild Human-Computer Interaction (HCI) Lab explores how humans interact with artificial intelligence (AI) in everyday contexts, designing and deploying systems that are not only intelligent but also transparent, responsive, and trustworthy. Our research is grounded in the field of Human-AI Interaction (HAI), where we focus on creating partnerships between people and machines rather than simply building tools. We combine an interdisciplinary approach (drawing on physiology, psychology, and the social sciences) with an intra-disciplinary approach (integrating distributed systems, on-device AI, machine learning, and signal processing) into an end-to-end methodology. This allows us to develop novel AI-driven technologies and to understand their impact thoroughly through in-situ evaluation in realistic, unstructured environments.

Our research spans a broad spectrum of foundational themes in HCI, united by a commitment to real-world deployment and human-centered design:

  • Human-AI Interaction and Explainability: This includes developing transparent on-device intelligence for safety-critical and clinical applications, such as real-time activity recognition for first responders and rehabilitation tasks for hospital patients, where interpretability is essential for adoption and decision-making.
  • VR/AR/MR and Spatial Computing: Our work in asymmetric collaboration and embodied virtual agents transforms VR/AR/MR from passive visualization tools into dynamic and responsive spaces for training, education, and rehabilitation.
  • Social Computing and Collaborative Interactions: This research explores how people collaborate in technology-mediated environments, from multi-user virtual reality to distributed wearable sensor networks.
  • Embodied Agents, Virtual Humans, Avatars, and Multimodal Interaction: This research explores how the design of virtual representations influences human perception, motivation, and learning.
  • Interactive, Ubiquitous, and Wearable Computing; IoT and Smart Environments: These studies develop end-to-end wearable systems that bring intelligence directly to the edge.
  • Affective and Cognitive Computing; Behaviour Modelling: This research designs mechanisms that sense, interpret, and adapt to human affective and cognitive states.
  • Accessibility, Inclusive Design, and Assistive Technologies: This research explores the design of wearable and immersive technologies that aim to make advanced AI accessible to different populations, including elderly users, patients, and individuals with disabilities.
  • Design Methods, Prototyping, Evaluation and Reproducible HCI: This research emphasises thorough, reproducible evaluation methodologies that bridge laboratory precision with real-world complexity. Our approach combines controlled experiments with in-the-wild deployments, ensuring that our innovations are scientifically validated and practically deployable.
  • Education, Healthcare, Sustainability, Transportation, and Other Application Domains: Our research is driven by real-world challenges, including partnerships with fire services, hospitals, and educational institutions to deploy AI-driven solutions in diverse environments.

 

Selected Publications

  • X. Chai, B. G. Lee, C. Hu, M. Pike, et al., “IoT-FAR: A Multi-Sensor Fusion Approach for IoT-Based Firefighting Activity Recognition,” Information Fusion, vol. 113, p. 102650, 2025, DOI: 10.1016/j.inffus.2024.102650.
  • L. Nkenyereye, B. G. Lee, W. Y. Chung, “Deep Q-Learning with Feature Extraction and Prioritized Experience Replay for Edge Node Overload in Edge Computing,” Engineering Applications of Artificial Intelligence, vol. 162, p. 112124, 2025, DOI: 10.1016/j.engappai.2025.112124.
  • Y. Li, D. Chieng, B. G. Lee, C. F. Kwong, K. M. Lim and S. Li, "Wi-ViTAL: Domain Generalization of Wireless Human Activity Recognition Using Linear Attention Vision Transformer With Adversarial Learning," in IEEE Transactions on Mobile Computing, vol. 25, no. 4, pp. 5792-5806, April 2026, DOI: 10.1109/TMC.2025.3632752.
  • Y. Yu, C. Xu, S. Yang, Y. Cao, Y. Wang and B. G. Lee, "VRtalk: Real-Time Interactive Intelligent Anime Avatars in Virtual Reality," 2025 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Daejeon, Republic of Korea, 2025, pp. 1191-1201, DOI: 10.1109/ISMAR67309.2025.00125.
  • L. Zheng, G. Cheng, S. Ke, J. Yuan, B. G. Lee, M. Pike, “Understanding Asymmetric Collaboration in Augmented and Virtual Reality Immersive Environment,” UIST Adjunct '25: Adjunct Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology, Busan, Republic of Korea, 2025, pp. 1-3, DOI: 10.1145/3746058.3758399.
  • R. Han, B. G. Lee, M. Pike, et al., “Sproutfit: An Immersive Seed-Planting Virtual Reality Game to Enhance Patient Motivation for Performing Exercises for the Prevention of Venous Thromboembolism through Loss and Avoidance Gamification,” Virtual Reality, vol. 29, p. 137, 2025, DOI: 10.1007/s10055-025-01220-2.
  • L. Sun, B. G. Lee, M. Pike, S. Yang and D. Chieng, "From Robots to Creatures: The Influence of Pedagogical Agent Design on Student Motivation and Learning in IVR Education," 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Saint Malo, France, 2025, pp. 1260-1261, DOI: 10.1109/VRW66409.2025.00276.
  • S. Ke, L. Zheng and B. G. Lee, "Exploring the Influence of Interpersonal Relationships on Gamification Preferences in Collaborative IVR Environments," 2025 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Saint Malo, France, 2025, pp. 104-114, DOI: 10.1109/VR59515.2025.00035.
  • L. Sun, B. G. Lee and W. Y. Chung, “Enhancing Fire Safety Education Through Immersive Virtual Reality Training with Serious Gaming and Haptic Feedback,” International Journal of Human–Computer Interaction, pp. 1–16, 2025, DOI: 10.1080/10447318.2024.2364979.
  • L. Zhu, R. Wu, B. G. Lee, L. Nkenyereye, W. Y. Chung and G. Xu, "FEGAN: A Feature-Oriented Enhanced GAN for Enhancing Thermal Image Super-Resolution," in IEEE Signal Processing Letters, vol. 31, pp. 541–545, 2024, DOI: 10.1109/LSP.2024.3356751.
  • Y. Wang, B. G. Lee, H. Pei, “A Wearable Lower Extremity Virtual Rehabilitation System for Patients’ Venous Thromboembolism Prophylaxis,” UbiComp '24: Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Melbourne, Australia, pp. 61-65, 2024, DOI: 10.1145/3675094.3677585.
  • R. Wu, B. G. Lee, M. Pike, L. Zhu, X. Chai and Y. Wang, "Enhancing DF-INS for Accurate Zero-Velocity Detection in ILBS: A Dual Foot Synergistic Method," 2023 IEEE SENSORS, Vienna, Austria, 2023, pp. 1–4, DOI: 10.1109/SENSORS56945.2023.10325168.
  • W. Du, G. Yi, O. M. Omisore, W. Duan, T. O. Akinyemi, X. Chen, L. Wang, B. G. Lee and J. Liu, "Guidewire Endpoint Detection Based on Pixel Adjacent Relation in Robot-assisted Cardiovascular Interventions," 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Sydney, Australia, 2023, pp. 1-5, DOI: 10.1109/EMBC40787.2023.10340841.
  • X. Chai, B. G. Lee, M. Pike, R. Wu, D. Chieng and W. Y. Chung, "Pre-Impact Firefighter Fall Detection Using Machine Learning on the Edge," in IEEE Sensors Journal, vol. 23, no. 13, pp. 14997–15009, July 2023, DOI: 10.1109/JSEN.2023.3279858. (JCR Q1)
  • R. Wu, M. Pike, X. Chai, B. G. Lee, W. Y. Chung and L. Nkenyereye, "GA-PDR: Using Gait Analysis for Heading Estimation in PDR Based Indoor Localization System," IECON 2023 – 49th Annual Conference of the IEEE Industrial Electronics Society, Singapore, Singapore, 2023, pp. 1–6, DOI: 10.1109/IECON51785.2023.10312643.
  • R. Wu, B. G. Lee, M. Pike, L. Zhu, X. Chai, L. Huang and X. Wu, “IOAM: A Novel Sensor Fusion-Based Wearable for Localization and Mapping,” Remote Sensing, vol. 14, no. 23, p. 6081, 2022, DOI: 10.3390/rs14236081.
  • S. Benford, R. Ramchurn, J. Marshall, M. L. Wilson, M. Pike, S. Martindale, A. Hazzard, C. Greenhalgh, M. Kallionpaa, P. Tennent and B. Walker, “Contesting Control: Journeys through Surrender, Self-Awareness and Looseness of Control in Embodied Interaction,” Human–Computer Interaction, vol. 36, no. 5, pp. 361–389, 2020, DOI: 10.1080/07370024.2020.1754214.
  • M. Pike, K. Shen and D. Towey, "Supporting Computer Science Student Reading through Multimodal Engagement Interfaces," 2019 IEEE International Conference on Engineering, Technology and Education (TALE), Yogyakarta, Indonesia, 2019, pp. 1-6, DOI: 10.1109/TALE48000.2019.9225970.
  • B. G. Lee and S. M. Lee, "Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion," in IEEE Sensors Journal, vol. 18, no. 3, pp. 1224–1232, Feb. 2018, DOI: 10.1109/JSEN.2017.2779466.
  • D. S. Lee, T. W. Chong and B. G. Lee, "Stress Events Detection of Driver by Wearable Glove System," in IEEE Sensors Journal, vol. 17, no. 1, pp. 194–204, Jan. 2017, DOI: 10.1109/JSEN.2016.2625323.
  • M. Pike and E. Ch'ng, “Evaluating Virtual Reality Experience and Performance: A Brain-based Approach,” in Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, Volume 1, pp. 469–474, 2016, DOI: 10.1145/3013971.3014012.
  • M. Pike, R. Ramchurn, S. Benford and M. L. Wilson, “#Scanners: Exploring the Control of Adaptive Films using Brain-Computer Interaction,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 5385–5396, 2016, DOI: 10.1145/2858036.2858276.

 

Selected Patents

  • Method, Apparatus and System for Motion Tracking, 2026, US12523474B2.
  • Ultrasonic Wave-Based Indoor Inertial Navigation Mapping Method and System, 2025, US12467767B2.
  • Infrared Image Generation Method, Apparatus, Device, and Storage Medium (一种红外图像生成方法、装置、设备以及存储介质), 2026, ZL202310496326.7.
  • Firefighting Information Display Method, Apparatus, Device, and Storage Medium (一种消防信息显示方法、装置、设备及存储介质), 2025, 202210763849.
  • Motion Trajectory Determination Method, Apparatus, and System (一种运动轨迹确定方法、装置及系统), 2024, 202210112970.5.
  • Ultrasonic-Based Indoor Inertial Navigation Mapping Method and System (一种基于超声波的室内惯导建图方法和系统), 2023, 202210220368.3.

 

Awards

  • Invention and Entrepreneurship Award (Innovation), China Association of Inventions, National Second Prize, 2025.
  • Gold Medal in the 2022 8th China International College Students' “Internet+” Innovation and Entrepreneurship Competition.
  • Third Prize in the 2023 “Maker China” (Agricultural Bank of China Cup) SME Innovation & Entrepreneurship Competition, Advanced Manufacturing Special Race.
  • First Prize in the 2023 7th Ningbo National Selection Competition for Young College Student Entrepreneurship (Wuhan Division).
  • 2023 Ningbo Computer Innovation and Entrepreneurship Prize for College Students.
  • Best Paper Award at the 13th (2021) and 15th (2023) International Conferences on Intelligent Human Computer Interaction.
  • Best Session Paper Award at the 10th International Conference on Open and Innovative Education (ICOIE 2023), Jul. 2023.
  • 2019 IEEE Sensors Journal Best Paper Award, “Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion,” IEEE Sensors Council. 

 

PhD Opportunities and Research Topics

Topic A: Research on Virtual Reality/Augmented Reality/Mixed Reality (VR/AR/MR), Generative AI, and Multimodal Interaction


Topic B: Research on Explainable Intelligence and Autonomous AI Agents for Human Behaviour Understanding


Topic C: Brain-Computer Interaction for Adaptive Learning and Human-Centred Design