By 2028, seven advanced gas-cooled reactors and eleven Magnox reactors—critical components of the UK's £132 billion nuclear decommissioning programme—will be at various stages of decommissioning. Robotic teleoperation is central to the Nuclear Decommissioning Authority's 2024-2027 strategy, yet current systems face limitations, relying on either single-operator-single-robot (SOSR) teleoperation or fully autonomous systems. SOSR methods proposed in EU-H2020 projects such as RoMaNS and CENTAURO impose high cognitive and physical strain on operators due to prolonged use of tactile interfaces (e.g., exoskeletons or virtual reality devices), while autonomous systems lack the adaptability required for complex multi-contact tasks, such as disassembling submerged components in fuel pools or sorting radioactive waste. Although the EPSRC RAIN Hub and NCNR have tested deployments with two physical interfaces, this approach inherently limits the degrees of freedom (DoFs) available for remote handling and demands greater skill in coordinated manipulation. A critical gap exists in enabling intuitive, scalable control of multiple robots by a single operator without compromising dexterity or efficiency.
Objective: ACTION will, for the first time, develop a virtual interactive interface that decodes human intent from gaze and hand gestures and delivers the associated multi-DoF control commands (>14 DoFs performed by two hands). To seamlessly integrate gesture- and gaze-based commands with the motion of remote robots, we will introduce a novel control framework that coordinates three or more robots while providing latency-free visual and haptic sensory feedback. Two pilot scenarios will be conducted in collaboration with industry partners (e.g., TG0, Quanser, RAICo) to systematically validate the proposed system: the disassembly of components within a simulated nuclear fuel pool, and the classification of radioactive waste (including the key robotic grasping, separation, and loading processes).
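For illustration only, the sketch below shows one simple way in which two-handed gesture input and a gaze cue could be fused into a >14-DoF command vector (7 DoFs per hand plus a 2-DoF gaze cue). The tracker readings, scaling factors, and function names are hypothetical placeholders, not the project's actual interface.

```python
"""Minimal sketch (assumptions, not the project's pipeline): fusing two
tracked hand poses and a gaze cue into a single multi-DoF command vector."""

import numpy as np

def hand_to_command(wrist_pos, wrist_rpy, pinch_dist, scale=1.5):
    """Map one tracked hand to a 7-DoF end-effector command.

    wrist_pos  : (3,) wrist position in metres, tracker frame
    wrist_rpy  : (3,) roll/pitch/yaw in radians
    pinch_dist : thumb-index distance in metres, mapped to gripper opening
    """
    ee_pos = scale * np.asarray(wrist_pos)        # workspace scaling (assumed)
    ee_rpy = np.asarray(wrist_rpy)                # direct orientation mapping
    grip = np.clip(pinch_dist / 0.08, 0.0, 1.0)   # 8 cm pinch = fully open (assumed)
    return np.concatenate([ee_pos, ee_rpy, [grip]])

def fuse_commands(left_hand, right_hand, gaze_xy):
    """Stack both 7-DoF hand commands with a 2-DoF gaze cue (16 DoFs total)."""
    return np.concatenate([left_hand, right_hand, gaze_xy])

# Synthetic readings standing in for a real hand/eye tracker.
left = hand_to_command([0.2, 0.1, 0.3], [0.0, 0.1, -0.2], 0.05)
right = hand_to_command([-0.2, 0.1, 0.3], [0.0, -0.1, 0.2], 0.02)
command = fuse_commands(left, right, gaze_xy=[0.4, -0.1])
print(command.shape)  # (16,)
```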
Methodology: Three main work packages (WPs) have been identified; the first two aim to improve sensorimotor capabilities, as shown in Figure A. WP1 will develop a novel human-machine interface that enables intuitive remote control of multiple robots using real-time dynamic hand gestures; this platform will visually represent the kinematic mapping between hand gestures and robotic motion. WP2 will develop a gaze-based control system that captures additional human input from eye activity beyond hand movements, adapting swiftly to tasks that require rapid commands. WP3 focuses on developing stabilised, safe control protocols for remote handling that meet rich-interaction demands (see Figure B); it will address the targeted nuclear decommissioning (ND) scenarios by integrating the virtual interactive interface developed in WP1-2 with multimodal sensory feedback and a distributed multi-agent controller, as sketched below.
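As a purely illustrative sketch of distributed multi-robot coordination (assuming simplified single-integrator robot models and a named-for-illustration consensus law, not WP3's actual controller), the example below shows three remote robots tracking a shared operator reference while staying synchronised with their communication neighbours.

```python
"""Illustrative sketch under simplifying assumptions: a leader-follower
consensus update in which several robots track one operator command while
synchronising with neighbours over a communication graph."""

import numpy as np

def consensus_step(states, reference, adjacency, k_ref=1.0, k_sync=0.5, dt=0.01):
    """One distributed tracking-plus-synchronisation update.

    states    : (n_robots, dofs) current robot configurations
    reference : (dofs,) operator command broadcast to all robots
    adjacency : (n_robots, n_robots) 0/1 communication graph
    """
    new_states = states.copy()
    for i in range(len(states)):
        u = k_ref * (reference - states[i])            # pull toward operator command
        for j in range(len(states)):
            if adjacency[i, j]:
                u += k_sync * (states[j] - states[i])  # pull toward neighbours
        new_states[i] = states[i] + dt * u             # single-integrator model (assumed)
    return new_states

# Three robots with 7 DoFs each and a fully connected communication graph.
rng = np.random.default_rng(0)
states = rng.normal(size=(3, 7))
reference = np.zeros(7)
A = np.ones((3, 3)) - np.eye(3)

for _ in range(2000):
    states = consensus_step(states, reference, A)

print(np.abs(states - reference).max())  # tracking error is near zero after convergence
```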
General eligibility criteria: Applicants would normally be expected to hold a minimum of a UK Honours degree at 2:1 level or equivalent in a relevant degree course.
Project specific criteria: The ideal candidate should have a strong background in robotics, control theory, and human-robot interaction; rich experience in programming languages such as MATLAB and Python; excellent oral and written communication skills, with the ability to prepare presentations, reports, and journal papers to the highest level of quality; and excellent interpersonal skills to work effectively in a team of PhD students and postdoctoral researchers. Non-UK students are welcome to apply. Overseas applicants should submit IELTS results (minimum 6.5) if applicable.
Informal enquiries and how to apply
For informal enquiries, please contact Dr Ziwei Wang (z.wang82@lancaster.ac.uk). Candidates interested in applying should send a copy of their CV together with a personal statement/covering letter addressing their background and suitability for this project to Dr Ziwei Wang by the closing date of 18th July 2025.