BLUEASSIST

Fraunhofer joint project: Concept development of an innovative UAS overall system for reliable and cost-effective data acquisition in the maritime sector.

Surveillance of inaccessible areas by long-range UAV

As part of the "BLUEASSIST" project, an autonomously flying, hybrid-electric glider is to be developed in cooperation with the industrial partners Lange Research Aircraft and M4COM and equipped with a wide range of sensors as payload.

Thanks to its powerful fuel cells and the resulting flight duration of up to 40 hours, this remote sensing system is suitable, among other things, for monitoring large maritime operational areas. This includes monitoring shipping and detecting oil spills, but also other, mostly sovereign tasks such as monitoring large-scale damage situations like forest fires and floods, or border protection. The extremely long flight duration in remote regions places special requirements on data acquisition, processing, evaluation and communication with the UAV. To make optimal use of the remote sensing system without overloading the data transfer, the data are, among other things, to be processed and analysed automatically onboard. A complete and detailed evaluation of the data is then possible in the attached ground station, using more powerful and computationally intensive algorithms.
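
To make the idea of relieving the data link concrete, the following minimal Python sketch transmits only compressed image crops around onboard detections instead of full frames. This is an illustration, not the project's actual implementation: the detector, the (x, y, w, h) box format and the JPEG settings are assumptions.

    import cv2  # pip install opencv-python

    def onboard_filter(frame, detections, jpeg_quality=70, margin=16):
        """Compress only the regions around detections for downlink.

        frame: HxWx3 uint8 image (one video frame).
        detections: list of (x, y, w, h) boxes from any onboard detector
        (the project's actual detector is not specified in this article).
        """
        h_img, w_img = frame.shape[:2]
        packets = []
        for (x, y, w, h) in detections:
            # Expand each box slightly so some context around the object survives.
            x0, y0 = max(0, x - margin), max(0, y - margin)
            x1, y1 = min(w_img, x + w + margin), min(h_img, y + h + margin)
            ok, buf = cv2.imencode(".jpg", frame[y0:y1, x0:x1],
                                   [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
            if ok:
                # Only these few kilobytes go over the data link, not the full frame.
                packets.append({"box": (x0, y0, x1, y1), "jpeg": buf.tobytes()})
        return packets

A frame without detections then costs no downlink bandwidth at all, while the raw data can still be stored onboard for the detailed evaluation at the ground station.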

The main activities and objectives of the collaborative project are:

  • Intensive exchange with potential users
  • Conceptual design of an overall system consisting of the UAV, a matched payload for data acquisition and transmission, and an attached ground station
  • Development of a fuel cell-powered, hybrid-electric Antares E2
  • Concept for the integration of a multi-sensor system, a data transmission unit and a ground control station

In this project, Fraunhofer IOSB and its SZA department contribute the following:

  • Investigation of methods for automated onboard data (pre-)evaluation, filtering and compression with regard to efficient data transmission (see the sketch after this list)
  • Development of application-specific evaluation methods and design of the necessary hardware components for the ground station
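
As a sketch of the first point, a simple way to keep the transmission efficient is to prioritise detection packets by confidence under a per-cycle byte budget. Packet fields and names here are hypothetical, since the article does not specify the actual downlink protocol.

    from dataclasses import dataclass

    @dataclass
    class DetectionPacket:
        timestamp: float   # acquisition time of the frame
        confidence: float  # detector score of the detection
        payload: bytes     # e.g. a JPEG crop as in the sketch above

    def fill_downlink_budget(packets, budget_bytes):
        """Greedy selection: highest-confidence detections first, until the
        per-cycle downlink budget is exhausted; everything else stays onboard
        for later retrieval or explicit ground-station requests."""
        selected, used = [], 0
        for p in sorted(packets, key=lambda p: p.confidence, reverse=True):
            if used + len(p.payload) <= budget_bytes:
                selected.append(p)
                used += len(p.payload)
        return selected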

Involvement of the Scene Analysis Department SZA:

Multi-sensor data processing and analysis (onboard + ground station)

Widely used aerial surveillance camera systems are offered by companies such as Trakka Systems with the TC-300 model. Its swivelling, stabilised design with integrated thermal and HDTV cameras and continuous zoom ensures sufficient flexibility. To test the performance of this camera system for aerial ship detection, Fraunhofer IOSB applied artificial intelligence (Mask R-CNN networks pre-trained on MS COCO) to a video clip from this camera.
Video Source: Trakka, modified by Fraunhofer IOSB
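
The article does not name the software stack used; a comparable off-the-shelf setup is the COCO-pretrained Mask R-CNN from torchvision, sketched below. Note that MS COCO contains no "ship" class, so "boat" (class index 9) serves as the closest label.

    import torch
    from torchvision.models.detection import (
        maskrcnn_resnet50_fpn, MaskRCNN_ResNet50_FPN_Weights)

    model = maskrcnn_resnet50_fpn(
        weights=MaskRCNN_ResNet50_FPN_Weights.COCO_V1).eval()

    COCO_BOAT = 9  # closest MS COCO class to "ship"

    @torch.no_grad()
    def detect_ships(frame_rgb, score_thresh=0.5):
        """frame_rgb: HxWx3 uint8 numpy array (one video frame)."""
        x = torch.from_numpy(frame_rgb).permute(2, 0, 1).float() / 255.0
        out = model([x])[0]  # dict with boxes, labels, scores, masks
        keep = (out["labels"] == COCO_BOAT) & (out["scores"] > score_thresh)
        return out["boxes"][keep], out["masks"][keep]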

Mask R-CNN is a state-of-the-art instance segmentation method in which incoming images are processed by several sub-networks (a feature pyramid network, FPN, and a region proposal network, RPN) and the detected objects are segmented and classified. Thanks to its modularity and multi-stage structure, Mask R-CNN is comparatively easy to retrain for new scenarios.
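
This modularity is what keeps retraining cheap: in the torchvision implementation assumed above, only the box and mask heads have to be replaced for a new set of classes, while the COCO-pretrained backbone, FPN and RPN are kept and merely fine-tuned.

    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
    from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

    def adapt_to_new_classes(model, num_classes):
        """Swap only the prediction heads; the pretrained feature extractor
        (backbone + FPN) and the RPN remain unchanged."""
        in_feat = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_feat, num_classes)
        in_ch = model.roi_heads.mask_predictor.conv5_mask.in_channels
        model.roi_heads.mask_predictor = MaskRCNNPredictor(in_ch, 256, num_classes)
        return model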

A comparable scenario is the automatic detection of persons. For this, the AI framework described above was trained not on ships but on people. The change of perspective (now terrestrial instead of aerial) poses no problem.
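
If the COCO-pretrained model from the sketch above is reused rather than retrained, switching from ships to persons is only a change of the class filter ("person" is COCO class 1); retraining, as described here, becomes necessary only when the pretrained classes or viewpoints do not cover the application.

    COCO_PERSON = 1  # MS COCO class index for "person"

    @torch.no_grad()
    def detect_persons(frame_rgb, score_thresh=0.5):
        # Identical pipeline to detect_ships above; only the class filter differs.
        x = torch.from_numpy(frame_rgb).permute(2, 0, 1).float() / 255.0
        out = model([x])[0]
        keep = (out["labels"] == COCO_PERSON) & (out["scores"] > score_thresh)
        return out["boxes"][keep], out["masks"][keep]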

Similarly, AI in the form of convolutional neural networks (CNNs) is used for the step-by-step detection of vehicles from the air. Here the network consists of two branches: the classification branch, with its four pooling layers, achieves high classification performance but strongly reduces the spatial resolution; the segmentation branch contains only one pooling layer and, with its correspondingly high-precision resolution, compensates for the reduction caused by the classification branch. Despite the use of GPU programming, the enormous computational effort means the algorithm can only keep up by processing every fifth image. Because of the large overlap between the individual image areas, the method can nevertheless be described as real-time.

Video Source: SELSAS
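
The text specifies only the pooling-layer counts of the two branches; the following PyTorch sketch makes the resulting resolution trade-off concrete. Channel widths, the number of convolutions per stage and the fusion step are illustrative assumptions.

    import torch
    import torch.nn as nn

    class TwoBranchNet(nn.Module):
        def __init__(self, num_classes=2):
            super().__init__()
            def stage(cin, cout, pool):
                layers = [nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True)]
                if pool:
                    layers.append(nn.MaxPool2d(2))
                return nn.Sequential(*layers)
            # Classification branch: four pooling layers -> strong semantic
            # features, but only 1/16 of the input resolution.
            self.cls_branch = nn.Sequential(
                stage(3, 32, True), stage(32, 64, True),
                stage(64, 128, True), stage(128, 128, True))
            # Segmentation branch: a single pooling layer -> 1/2 resolution,
            # spatially precise, compensating the classification branch.
            self.seg_branch = nn.Sequential(stage(3, 32, True), stage(32, 32, False))
            self.fuse = nn.Conv2d(128 + 32, num_classes, 1)

        def forward(self, x):  # x: (N, 3, H, W), H and W divisible by 16
            c = self.cls_branch(x)  # (N, 128, H/16, W/16)
            c = nn.functional.interpolate(c, scale_factor=8, mode="bilinear",
                                          align_corners=False)  # up to H/2
            s = self.seg_branch(x)  # (N, 32, H/2, W/2)
            return self.fuse(torch.cat([c, s], dim=1))  # per-pixel class scores

The heavy semantic computation runs at 1/16 of the input resolution, which keeps the GPU load bounded; processing only every fifth frame, as described above, then still yields effectively real-time coverage thanks to the large image overlap.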


This project is funded by the mFUND (Modernity Fund) research initiative of the German Federal Ministry of Transport and Digital Infrastructure (BMVI) under its call for ideas and funding "UAS and air taxis".

Scene Analysis Department

Would you like to learn more about our products in the area of scene analysis?