Human-Computer Interaction

The HCI Lab creates interactive technologies to enhance everyday life. Our research is grounded in the field of Human-Computer Interaction. We combine an interdisciplinary approach (linking to the Social Sciences and Humanities) with an intra-disciplinary approach (drawing on areas such as Distributed Systems, AI and Machine Learning, Vision and Formal Methods) to enable an end-to-end methodology in which we both develop novel digital technologies and study them in realistic environments.

Our focus on everyday life extends beyond the workplace to encompass technologies for the home and for culture and entertainment. Research activity has had direct impact on society through:

  • Public engagement – staging a series of public events and activities centred on the lab.
  • The development of new interfaces for entertainment.
  • Research collaborations with art collectives and the creative industries.
  • Users and industrial beneficiaries of human-computer interaction research worldwide, who draw on it to advance their products.
  • The development of Mixed Reality (MR) technologies combined with AI.
  • Virtual training platforms for education, serving industry, the public sector, and institutions.

Funded Projects

  • Research on Smart AI for Firefighter Safety Assessment (Zhejiang Commonweal General Programme, funded by the Zhejiang Natural Science Foundation (ZJNSF)), 2021–2023. Chinese project title: Research on the Design and Application of Firefighter Safety Assessment Based on Intelligent Multi-Sensor Fusion (Zhejiang Provincial Commonweal Science and Technology Project).
  • Research on Wearable Sensor Technology for Firefighter Safety Evaluation (Ningbo Commonweal General Programme, funded by the Ningbo Science and Technology Bureau (NBSTB)), 2022–2023. Chinese project title: Research and Application of Firefighter Safety Risk Assessment and Decision-Making Based on Wearable Embedded Multi-Sensors (Ningbo Commonweal Science and Technology Project).
  • Research on Fire Scene Dispatching Command and Remote Investigation Technology (Ningbo Major S&T Programme CM2025, funded by NBSTB), 2022–2024. Chinese project title: Research on Real-Scene Dispatching Command and Remote Investigation Technology for Fire Scenes (Ningbo 2025 Major Science and Technology Project).


PhD Opportunities and Research Topics

  • Research on Smart Firefighting Operational System (SFOS)
  • Research on New Methods in Automated Financial Fraud Detection


Selected Publications

  • Chai, X., Wu, R., Pike, M., Jin, H., Chung, W. Y., Lee, B. G. (2021). Smart Wearable with Sensor Fusion for Fall Detection in Firefighting. Sensors. Accepted.
  • Wu, R., Pike, M., Lee, B. G. (2021). DT-SLAM: Dynamic Thresholding Based Corner Point Extraction in SLAM System. IEEE Access, 9(1), 91723-91729. doi: 10.1109/ACCESS.2021.3092000.
  • Lee, B. G., Chong, T. W., Chung, W. Y. (2020). Sensor fusion of motion-based sign language interpretation with deep learning. Sensors, 20(21), 6256-6272.
  • Lee, B. G., Chung, W. Y., Chong, T. W. (2019). Methods to detect and reduce driver stress: a review. International Journal of Automotive Technology, 20(5), 1051-1063.
  • Benford, S., Ramchurn, R., Marshall, J., Wilson, M. L., Pike, M., Martindale, S., ... & Walker, B. (2020). Contesting control: journeys through surrender, self-awareness and looseness of control in embodied interaction. Human–Computer Interaction, 1-29.
  • Pike, M., Shen, K., & Towey, D. (2019, December). Supporting Computer Science Student Reading through Multimodal Engagement Interfaces. In 2019 IEEE International Conference on Engineering, Technology and Education (TALE) (pp. 1-6). IEEE.
  • Pike, M., & Roadknight, C. (2019, July). Promoting Digital Wellbeing through Real-Time State classification of Psychophysiological Sensor Networks. In The Second Neuroadaptive Technology Conference (p. 60).
  • Machine learning/AI for IoT, M2M and Computer Communications (2019). 30(9), Wiley.
  • Chong, T. W., Lee, B. G. (2018). American sign language recognition using Leap Motion controller with machine learning approach. Sensors, 18(10), 3554.
  • Lee, B. G., Lee, S. M. (2017). Smart wearable hand device for sign language interpretation system with sensors fusion. IEEE Sensors Journal, 18(3), 1224-1232.
  • Lee, D. S., Chong, T. W., Lee, B. G. (2016). Stress events detection of driver by wearable glove system. IEEE Sensors Journal, 17(1), 194-204.
  • Pike, M., & Ch'ng, E. Evaluating virtual reality experience and performance: a brain-based approach. In Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry-Volume 1 (pp. 469-474).
  • Pike, M., Ramchurn, R., Benford, S., & Wilson, M. L. (2016, May). #Scanners: Exploring the control of adaptive films using brain-computer interaction. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 5385-5396).


Achievements

  • Lee, B. G. received the 2019 IEEE Sensors Journal Best Paper Award for "Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion," IEEE Sensors Council.

Contact Us

Dr. Matthew Pike
Matthew.Pike@nottingham.edu.cn

Dr. Boon Giin Lee
boon-giin.lee@nottingham.edu.cn