Best Paper Award - Runner Up, in C4SR+ workshop @ IROS 2025
Best Paper Award (3rd place), in EIMR workshop @ IROS 2025
Current orthopedic robotic systems largely focus on navigation, aiding surgeons in positioning a guiding tube but still requiring manual drilling and screw placement. Automating this task not only demands high precision and safety due to the intricate physical interactions between the surgical tool and bone, but also poses significant risks when executed without adequate human oversight. As the task involves continuous physical interaction, the robot should collaborate with the surgeon, understand the human intent, and always keep the surgeon in the loop. To this end, this paper proposes a new cognitive human–robot collaboration framework, including an intuitive AR-haptic human–robot interface, a visual-attention-based surgeon model, and a shared interaction control scheme for the robot. User studies on a robotic platform for orthopedic surgery are presented to illustrate the performance of the proposed method. The results demonstrate that the proposed human–robot collaboration framework outperforms both full robot control and full human control in terms of safety and ergonomics.
@inproceedings{chen2024visual,
  address   = {Abu Dhabi, United Arab Emirates},
  author    = {Chen, Chen and Zou, Qikai and Song, Yuhang and Yu, Mingrui and Zhu, Senqiang and Song, Shiji and Li, Xiang},
  booktitle = {2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  doi       = {10.1109/IROS58592.2024.10801930},
  isbn      = {979-8-3503-7770-5},
  month     = oct,
  pages     = {7078--7084},
  publisher = {IEEE},
  title     = {Visual Attention Based Cognitive Human--Robot Collaboration for Pedicle Screw Placement in Robot-Assisted Orthopedic Surgery},
  year      = {2024}
}
Independence in the Home: A Wearable Interface for a Person with Quadriplegia to Teleoperate a Mobile Manipulator
Akhil Padmanabha, Janavi Gupta, Chen Chen, and 5 more authors
In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, Mar 2024
Teleoperation of mobile manipulators within a home environment can significantly enhance the independence of individuals with severe motor impairments, allowing them to regain the ability to perform self-care and household tasks. There is a critical need for novel teleoperation interfaces that offer effective alternatives for individuals who may encounter challenges in using existing interfaces due to physical limitations. In this work, we iterate on one such interface, HAT (Head-Worn Assistive Teleoperation), an inertial-based wearable that can be integrated into any head-worn garment. We evaluate HAT through a 7-day in-home study with Henry Evans, a non-speaking individual with quadriplegia who has participated extensively in assistive robotics studies. We additionally evaluate HAT with a proposed shared control method for mobile manipulators termed Driver Assistance and demonstrate how the interface generalizes to other physical devices and contexts. Our results show that HAT is a strong teleoperation interface across key metrics including efficiency, errors, learning curve, and workload. Code and videos are located on our project website.
@inproceedings{padmanabha24independence,
  address   = {Boulder, CO, USA},
  author    = {Padmanabha, Akhil and Gupta, Janavi and Chen, Chen and Yang, Jehan and Nguyen, Vy and Weber, Douglas J and Majidi, Carmel and Erickson, Zackory},
  booktitle = {Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction},
  doi       = {10.1145/3610977.3634964},
  isbn      = {9798400703225},
  month     = mar,
  pages     = {542--551},
  publisher = {ACM},
  title     = {Independence in the Home: A Wearable Interface for a Person with Quadriplegia to Teleoperate a Mobile Manipulator},
  urldate   = {2024-03-12},
  year      = {2024}
}