Augmented Reality-based Robotic System for In-space Servicing
Abstract
Space assets such as communication satellites, space telescopes, and microgravity research stations are essential to the advancement of human knowledge. Nevertheless, they are subject to failure, damage, or obsolescence over their life cycle. These conditions are typically addressed with teleoperated space robotic systems that diagnose, repair, or upgrade the asset. During these activities, the region of interest or the components to be serviced may be visually occluded by lighting conditions or a physical obstacle, complicating the teleoperated maneuver and compromising the mission's success and safety, since the blind spots created by such occlusions can lead to collisions. This paper presents an Augmented Reality (AR)-based robotic framework that allows the robot to perform an on-orbit servicing task despite visually occluded areas. It allows the user to dynamically obtain the best view of a 3D (three-dimensional) model. In addition, so that the robot tool accurately reproduces the natural motion of the human operator's hand, a direct hand-presence device is used to map the avatar's hand motion directly to the robot's end-effector (EE) motion. The system is validated in an AR environment combining virtual and physical entities to repair a spacecraft's solar panel in an area visually occluded from the robot. The experimental results demonstrate that, because the virtual environment can be manipulated in real time to present the best perspective to the human operator, the repair trajectory was generated without compromising safety or operations, even though the robot's EE and the camera-in-hand could not observe the area of interest directly.
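To make the hand-to-EE mapping mentioned above concrete, the following is a minimal sketch, assuming a hand-tracking device that streams the operator's (avatar) hand pose as a 3D position and a rotation matrix in the AR world frame. The class name `HandToEEMapper`, the clutch-style `engage`/`map` interface, and the motion-scaling parameter are illustrative assumptions, not the paper's actual implementation or API.

```python
import numpy as np

class HandToEEMapper:
    """Sketch of a direct hand-to-end-effector mapping (hypothetical names).

    Assumes the tracker provides hand position (3-vector) and orientation
    (3x3 rotation matrix) expressed in the same frame as the robot EE.
    """

    def __init__(self, scale=1.0):
        self.scale = scale      # motion scaling between hand and EE displacement
        self.hand_ref = None    # hand position latched when the clutch engages
        self.ee_ref = None      # EE position latched when the clutch engages

    def engage(self, hand_pos, ee_pos):
        """Latch reference poses so subsequent hand motion is mapped relatively."""
        self.hand_ref = np.asarray(hand_pos, dtype=float)
        self.ee_ref = np.asarray(ee_pos, dtype=float)

    def map(self, hand_pos, hand_rot):
        """Return an EE target: scaled relative hand translation plus pass-through orientation."""
        delta = (np.asarray(hand_pos, dtype=float) - self.hand_ref) * self.scale
        target_pos = self.ee_ref + delta
        target_rot = np.asarray(hand_rot, dtype=float)
        return target_pos, target_rot


# Example: one update step with made-up tracker readings.
mapper = HandToEEMapper(scale=0.5)
mapper.engage(hand_pos=[0.0, 0.0, 0.0], ee_pos=[0.4, 0.0, 0.3])
pos, rot = mapper.map(hand_pos=[0.02, 0.0, 0.01], hand_rot=np.eye(3))
print(pos)  # [0.41, 0.0, 0.305] -> EE target handed to the robot controller
```

In a real teleoperation loop, the target pose would be sent to the robot's inverse-kinematics or Cartesian controller at each tracker update; the clutch-and-scale structure shown here is a common pattern for relative motion mapping, not a claim about the system described in the abstract.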