The project has primarily focused on processing image data from radars and cameras in order to detect and track objects around the entire ship (360 degrees). Machine learning algorithms have been used for detection and classification of objects. The extracted detections are then used to track objects on the sea surface, thereby contributing to better situational awareness.
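The detection-to-tracking step described above can be illustrated with a minimal sketch. All names here (`Track`, `associate`, the gating distance) are illustrative assumptions, not the project's actual implementation; a greedy nearest-neighbour association stands in for whatever data-association method the project used.

```python
import math

class Track:
    """A tracked object on the sea surface (hypothetical minimal state)."""
    def __init__(self, track_id, position, label):
        self.track_id = track_id
        self.position = position  # (x, y) in metres, ship-relative
        self.label = label        # class from the detector, e.g. "vessel"

def associate(tracks, detections, gate=50.0):
    """Greedily match each detection to the nearest track within `gate` metres.

    `detections` is a list of (position, label) pairs; unmatched
    detections spawn new tracks. Returns the updated track list.
    """
    next_id = max((t.track_id for t in tracks), default=-1) + 1
    unmatched = list(tracks)
    for pos, label in detections:
        best, best_dist = None, gate
        for t in unmatched:
            d = math.dist(t.position, pos)
            if d < best_dist:
                best, best_dist = t, d
        if best is not None:
            best.position = pos          # update the matched track
            unmatched.remove(best)
        else:
            tracks.append(Track(next_id, pos, label))  # new object
            next_id += 1
    return tracks
```

A detection far from every existing track (beyond the gate) starts a new track; a nearby one updates the closest existing track.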
To improve the robustness of the sensor platform, new methods have been developed for sensor calibration, improved sensor models, and robust handling of clutter and of objects that are temporarily out of sight. During the project, field studies were also carried out on both ferries and larger ships to investigate what information the user relies on as a decision basis for navigation. The field studies concluded that the navigator/captain uses direct vision, cameras, and information from various navigation aids, depending on the type of operation and whether it takes place in daylight or darkness. Based on interviews with captains and navigators, concepts for good user interaction were developed that can serve as a basis for further development of remotely controlled ships.