Robotic interventions in hazardous environments often rely on teleoperation to avoid exposing humans to danger. Human-robot interfaces must be designed for reliability and safety so that operators can perform remote inspections, repairs, and maintenance. Head-Mounted Devices (HMDs) with Mixed Reality (MR) technologies provide stereoscopic immersion and interaction but had not previously been demonstrated in such scenarios. The presented work investigated and applied novel MR techniques, adaptive communication congestion protocols, and monitoring of the operator's vital physiological parameters. The framework was successfully applied to mobile telerobotic systems operating in the CERN underground particle accelerator environment over a shared 4G network. The MR interface, built on an Augmented Reality (AR) HMD, enabled efficient, precise interaction between the user and the robot through hand and eye tracking, user motion tracking, voice recognition, and video, 3D point cloud, and audio feedback from the robot. It also allowed multiple experts to collaborate locally and remotely in the AR workspace, sharing or monitoring the robot's control. While the system has achieved high operational readiness through successful single- and multi-user missions, areas for improvement remain, such as optimising the network architecture for multi-user scenarios and applying machine learning techniques to detect non-standard situations in the monitoring of the operator's vital parameters.