This project deals with satellite autonomous relative navigation based on computer vision. It fundamentally consists of studying solutions to the key problems associated with autonomous relative navigation, with the aim of servicing satellites in orbit.
Year after year, the number of satellites and space vehicles launched into space increases. Most of them carry out the tasks for which they were designed, while others fail to fulfil their mission due to malfunctions or operational anomalies, generating large economic losses and becoming a potential risk to other spacecraft. It is clear, then, that the inspection and servicing of incapacitated satellites is a critical area in the space research and business sectors. The inspection and servicing of an incapacitated spacecraft must be performed by another spacecraft, which must be able to identify it and perform proximity and fly-around manoeuvres for visual inspection. The service operation finishes with a repair procedure carried out by one or more robotic arms aboard the inspection vehicle.

This research project focuses on the first phases of the mission, that is, on the identification of the motion and geometric structure of the incapacitated spacecraft and on the relative navigation around the non-cooperative vehicle for its inspection. Camera-based applications and novel artificial vision algorithms form the basis of the technology to be used, in close relation with adequate architectures for guidance, navigation and control of space vehicles. In order to obtain reliable results, an experimental testbed based on an industrial robot is constructed, in which different satellite geometries and configurations will be moved according to space dynamics laws by means of robot visual servoing.
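The source does not name the relative-motion model the testbed reproduces; a common assumption for proximity operations around a target in a circular orbit is the Clohessy-Wiltshire (Hill) equations. The sketch below propagates the in-plane relative state with their closed-form solution; the function name, the mean motion value and the initial condition are all illustrative choices, not taken from the project.

```python
import numpy as np

def cw_propagate(state0, t, n):
    """Propagate in-plane relative motion (x radial, y along-track)
    with the closed-form Clohessy-Wiltshire solution.
    state0 = [x, y, vx, vy] relative to a target in a circular orbit
    with mean motion n [rad/s]."""
    x0, y0, vx0, vy0 = state0
    s, c = np.sin(n * t), np.cos(n * t)
    x  = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y  = (6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0
          + ((4 * s - 3 * n * t) / n) * vy0)
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = 6 * n * (c - 1) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    return np.array([x, y, vx, vy])

# Drift-free relative orbit around the target: choose vy0 = -2*n*x0,
# which cancels the secular along-track drift term.
n = 0.0011                                   # rad/s, roughly LEO (illustrative)
state0 = np.array([100.0, 0.0, 0.0, -2 * n * 100.0])  # metres, m/s
T = 2 * np.pi / n                            # one orbital period
stateT = cw_propagate(state0, T, n)          # returns to the initial state
```

Trajectories like this one are what the robot end-effector would replay in front of the camera, so that the vision algorithms are tested against physically consistent relative motion.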
An adequate vision system will also be implemented for the study of the identification and relative navigation problems. The project seeks to develop the identification, inspection and navigation technology for the next generation of satellites, which will be equipped with a high degree of autonomy thanks to their capacity to navigate using the information provided by a camera.
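The source does not specify how the target's motion is identified from camera data; one standard building block is estimating the frame-to-frame rigid motion of tracked 3D feature points with the SVD-based Kabsch algorithm. The sketch below is a minimal illustration under that assumption, with synthetic, noiseless points; all names and values are hypothetical.

```python
import numpy as np

def estimate_rigid_motion(P, Q):
    """Estimate rotation R and translation t such that Q_i ~= R @ P_i + t,
    from matched 3D points, using the Kabsch (SVD) algorithm."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t

# Synthetic check: features on the target seen in two consecutive frames,
# related by a small tumbling step (rotation about z) plus a translation.
rng = np.random.default_rng(0)
P = rng.normal(size=(20, 3))                  # feature points, frame k
angle = 0.1                                   # tumble step [rad], illustrative
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 0.1])
Q = P @ Rz.T + t_true                         # same points, frame k+1
R_est, t_est = estimate_rigid_motion(P, Q)    # recovers Rz and t_true
```

Accumulating such incremental estimates over time gives the angular motion of the non-cooperative target, which the navigation and inspection planning stages can then exploit.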
Part of the results is closely connected to, and will be useful for, robot visual servoing in industrial applications.