BLUEASSIST

Fraunhofer joint project: Concept development of an innovative overall UAS system for reliable and cost-effective data acquisition in the maritime sector.

As part of the "BLUEASSIST" project, a concept for an autonomously flying, hybrid-electrically powered glider carrying various sensors as its payload was developed in collaboration with the industrial partners Lange Research Aircraft and M4COM.

Thanks to powerful fuel cells and the resulting flight duration of up to 40 hours, such a remote sensing system would be suitable, among other things, for monitoring large maritime areas. This includes monitoring shipping traffic and detecting oil spills, but also other, mostly sovereign tasks such as monitoring major emergencies (e.g. forest fires and floods) or protecting borders. The extremely long flight duration in remote regions places special requirements on data acquisition, processing, evaluation and communication with the UAV. To make optimum use of the remote sensing system and to avoid overloading the data link, automated onboard processing and analysis of the data was designed. A complete and detailed evaluation of the data would then be possible in the associated ground station using more powerful and computationally intensive algorithms.
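The division of labour between onboard processing and the ground station can be illustrated with a minimal, purely hypothetical sketch: a lightweight detector runs onboard, and only compact detection records rather than raw video are queued for the limited data link, while detailed evaluation with heavier algorithms remains a task for the ground station. All names, fields and thresholds below are illustrative assumptions, not part of the project design.

```python
# Hypothetical illustration of the onboard/ground split: keep full imagery onboard,
# downlink only compact, confident detection records over the narrow data link.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    frame_id: int
    label: str
    score: float
    bbox: Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixels


def select_for_downlink(detections: List[Detection], min_score: float = 0.6) -> List[Detection]:
    """Onboard step: keep only confident detections worth sending to the ground station."""
    return [d for d in detections if d.score >= min_score]


# Example: of two candidates in frame 42, only the confident one is queued for downlink;
# the full-resolution imagery stays onboard for later, detailed analysis on the ground.
candidates = [
    Detection(42, "ship", 0.91, (120, 80, 260, 150)),
    Detection(42, "ship", 0.35, (400, 300, 430, 320)),
]
print(select_for_downlink(candidates))
```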

The main activities and objectives of the joint project were:

  • In-depth exchange with potential users
  • Conception of an overall system consisting of the UAV, a coordinated payload for data acquisition and transmission, and the associated ground station
  • Development of a hybrid-electric Antares E2 powered by fuel cells
  • Concept for the integration of a multi-sensor system, a data transmission unit and a ground control station

Participation of Fraunhofer IOSB - Department SZA

Multi-sensor data processing and evaluation (onboard + ground station)

Widely used camera systems for aerial surveillance are offered by companies such as Trakka Systems, for example with the TC-300 model. Its swivelling, stabilized design with integrated thermal and HDTV cameras and continuous zoom provides a sufficiently high degree of flexibility. To test the performance of this camera system for ship detection from the air, Fraunhofer IOSB applied artificial intelligence (Mask R-CNN networks pre-trained on MS COCO) to a video sequence from the camera. Mask R-CNN is an instance segmentation method in which incoming images are processed by several sub-networks (FPN, RPN) and objects are segmented and classified. Thanks to its modular, multi-stage structure, Mask R-CNN is comparatively easy to retrain for new scenarios.

Video: AI applied to Trakka,
© Fraunhofer IOSB 
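As a rough illustration of this off-the-shelf approach, the following sketch runs torchvision's Mask R-CNN implementation, pre-trained on MS COCO, on a single frame and keeps only confident detections of the COCO 'boat' class. The file name, score threshold and the use of torchvision are assumptions for illustration, not details of the project setup.

```python
# Minimal sketch: COCO-pre-trained Mask R-CNN applied to one video frame,
# filtered to ship/boat detections. Paths and thresholds are illustrative.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

BOAT_CLASS_ID = 9      # 'boat' in the COCO label map used by torchvision
SCORE_THRESHOLD = 0.5  # assumed confidence cut-off

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = to_tensor(Image.open("frame_0001.jpg").convert("RGB"))  # hypothetical frame

with torch.no_grad():
    output = model([frame])[0]  # dict with boxes, labels, scores, masks

# Keep only confident boat/ship detections
keep = (output["labels"] == BOAT_CLASS_ID) & (output["scores"] > SCORE_THRESHOLD)
for box, score in zip(output["boxes"][keep], output["scores"][keep]):
    x1, y1, x2, y2 = box.tolist()
    print(f"ship candidate at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), score {score:.2f}")
```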

A comparable scenario is the automatic detection of people. For this purpose, the AI framework described above was trained on people instead of ships. The change of perspective (now terrestrial rather than aerial) posed no problem.

Video: People detection, 
© Fraunhofer IOSB
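Continuing the sketch above (and reusing its output and SCORE_THRESHOLD variables), retargeting the same pre-trained model to people amounts to nothing more than changing the class filter; the project itself describes training on person data, so this is only meant to illustrate how little of the pipeline changes.

```python
# Same pre-trained Mask R-CNN as in the sketch above, retargeted to people:
# only the class filter changes ('person' is class 1 in the torchvision COCO label map).
PERSON_CLASS_ID = 1
keep = (output["labels"] == PERSON_CLASS_ID) & (output["scores"] > SCORE_THRESHOLD)
person_boxes = output["boxes"][keep]
```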

AI in the form of convolutional neural networks (CNNs) was also used to detect vehicles from the air while keeping pace with the video stream. The network consists of two branches: thanks to its four pooling layers, the classification branch achieves high classification performance on the one hand, but strongly reduces the spatial resolution on the other. The segmentation branch contains only one pooling layer and compensates for this reduction with its correspondingly high spatial resolution. Despite GPU acceleration, the enormous computational effort means that the algorithm can only process every fifth frame in step with the stream. Due to the large overlap between the individual image areas, the method can nevertheless be described as real-time capable.

Video: Real-time vehicle detection,
© Fraunhofer IOSB
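The two-branch idea can be sketched roughly as follows. This is an illustrative PyTorch toy model, not the project's actual network: all layer widths, the number of classes and the fusion step are assumptions, and it only mirrors the described structure of four pooling stages in the classification branch versus a single pooling stage in the segmentation branch, with the coarse classification features upsampled and fused with the high-resolution segmentation features.

```python
# Illustrative two-branch CNN for per-pixel vehicle detection (assumed sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True))


class TwoBranchVehicleNet(nn.Module):
    def __init__(self, num_classes=2):  # vehicle vs. background (assumption)
        super().__init__()
        # Classification branch: four 2x2 poolings -> 1/16 resolution, strong semantics
        self.cls_branch = nn.Sequential(
            conv_block(3, 32), nn.MaxPool2d(2),
            conv_block(32, 64), nn.MaxPool2d(2),
            conv_block(64, 128), nn.MaxPool2d(2),
            conv_block(128, 256), nn.MaxPool2d(2),
        )
        # Segmentation branch: a single pooling -> 1/2 resolution, preserves detail
        self.seg_branch = nn.Sequential(
            conv_block(3, 32), nn.MaxPool2d(2),
            conv_block(32, 64),
        )
        self.fuse = nn.Conv2d(256 + 64, num_classes, kernel_size=1)

    def forward(self, x):
        cls_feat = self.cls_branch(x)   # coarse but semantically strong
        seg_feat = self.seg_branch(x)   # fine, spatially precise
        cls_up = F.interpolate(cls_feat, size=seg_feat.shape[-2:],
                               mode="bilinear", align_corners=False)
        out = self.fuse(torch.cat([cls_up, seg_feat], dim=1))
        # Upsample to the input resolution for a per-pixel vehicle map
        return F.interpolate(out, size=x.shape[-2:], mode="bilinear", align_corners=False)


# Usage: a per-pixel vehicle score map for one aerial frame
scores = TwoBranchVehicleNet()(torch.randn(1, 3, 512, 512))
print(scores.shape)  # torch.Size([1, 2, 512, 512])
```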


 

This project was funded by the research initiative mFUND (Modernity Fund) of the Federal Ministry of Transport and Digital Infrastructure (BMVI) within its call for ideas and funding "UAS and air taxis".
