AI-based control of heterogeneous robot groups

Deploying several heterogeneous mobile robots (flying, driving, swimming) effectively and efficiently in a mission requires solving many complex problems. First of all, the task to be solved has to be "understood", i.e. related to internal models and processes. The task must then usually be analyzed in order to select the most suitable resources (sensors, actuators and their mobile carriers, etc.) for solving it while taking given quality criteria into account. The next step is to plan the mission itself, accounting for factors typical of dynamic scenes such as partial knowledge, uncertainty and time constraints.
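As a minimal illustration of the resource-selection step, the following Python sketch greedily picks robots from a fleet until all capabilities required by the task are covered, using a single cost value as a stand-in for the quality criteria. All names, capability tags and cost values are purely illustrative assumptions, not part of an actual IOSB system.

```python
from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    capabilities: set        # e.g. {"camera", "aerial"}; illustrative tags
    cost: float = 1.0        # stand-in for a quality criterion (energy, risk, ...)

def select_resources(task_caps: set, fleet: list) -> list:
    """Greedily pick robots until every capability required by the task is covered."""
    selected, covered = [], set()
    for robot in sorted(fleet, key=lambda r: r.cost):
        gain = robot.capabilities & (task_caps - covered)
        if gain:
            selected.append(robot)
            covered |= gain
        if covered >= task_caps:
            return selected
    raise ValueError(f"no robots provide: {task_caps - covered}")

fleet = [
    Robot("uav1", {"camera", "aerial"}, cost=2.0),
    Robot("ugv1", {"lidar", "manipulator"}, cost=1.5),
    Robot("usv1", {"sonar", "surface"}, cost=3.0),
]
print([r.name for r in select_resources({"camera", "lidar"}, fleet)])  # ['ugv1', 'uav1']
```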

During execution of the (more or less precisely) planned mission, the actual situation is continuously compared with the plan and any deviations are analyzed, for example by means of a simulation running in parallel with the execution. Small deviations can often be compensated locally (e.g. rule-based or behavior-based), whereas larger deviations often require replanning of the remaining mission. Especially in complex situations, humans have to be integrated into the workflow, which calls for dedicated algorithms and interfaces.
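The comparison between plan and reality can be pictured as a simple monitoring loop. The sketch below is illustrative only: the thresholds and callback names are assumptions. It ignores tiny deviations, hands small ones to a local rule- or behavior-based controller, and triggers replanning of the remaining mission for large ones.

```python
import math

LOCAL_THRESHOLD = 0.5    # below this deviation (m) the robot counts as on plan (assumed value)
REPLAN_THRESHOLD = 5.0   # above this deviation the remaining mission is replanned (assumed value)

def monitor_step(planned_pose, observed_pose, local_controller, replanner):
    """Compare the pose predicted by the plan/parallel simulation with the observed pose."""
    deviation = math.dist(planned_pose, observed_pose)
    if deviation < LOCAL_THRESHOLD:
        return "on_plan"
    if deviation < REPLAN_THRESHOLD:
        # small deviation: compensate locally, e.g. by a rule- or behavior-based controller
        local_controller(observed_pose, planned_pose)
        return "local_compensation"
    # large deviation: replan the remaining mission starting from the observed state
    replanner(observed_pose)
    return "replanned"
```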

AI-based systems that can efficiently plan complex missions for groups of robots have so far received little research attention. One of the most difficult challenges for learning AI systems that plan solutions for dynamic tasks is the very limited amount of available learning data.

Mission Planning and Mission Control

Mission planning for heterogeneous groups of robots comprises both the selection of the resources required to solve the task and the actual planning of the mission; both steps usually have to be repeated iteratively several times. Planning requires hierarchical simulation systems that can be used offline and online at various levels of detail. A mission can first be planned roughly, and only when needed (possibly also during execution) are the plans, together with the environmental models, iteratively refined to the required degree. The necessary level of detail depends on the intelligence and capabilities of the executing robots and on the quality criteria; however, not enough is known about the exact dependencies.
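The iterative refinement of a coarse plan can be sketched as a tree of plan steps that are only expanded to the level of detail actually needed, using the current environment model. The fragment below is a schematic illustration under that assumption; the class and function names are hypothetical.

```python
class PlanStep:
    """One step of a hierarchical mission plan. Coarse steps carry a refinement
    function that expands them, together with the environment model, into more
    detailed sub-steps, offline or, if needed, during execution."""
    def __init__(self, action, refine_fn=None):
        self.action = action
        self.refine_fn = refine_fn   # callable(env_model) -> list[PlanStep], or None
        self.children = []

    def refine(self, env_model):
        if self.refine_fn is not None and not self.children:
            self.children = self.refine_fn(env_model)
        return self.children

def expand(step, env_model, needs_detail):
    """Expand the plan only as far as the executing robot's capabilities require."""
    if not needs_detail(step) or not step.refine(env_model):
        return [step]
    result = []
    for child in step.children:
        result.extend(expand(child, env_model, needs_detail))
    return result
```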

During mission execution, the actual environment and the robots' own states are continuously captured, and the environmental models are adapted based on these data. Furthermore, the current situation is continuously compared with the plan and deviations are analyzed, which is usually realized by means of a simulation running in parallel with the execution and using the adaptable environment model described above.
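A very simplified picture of such an adaptable environment model is a grid map whose cells are blended with incoming occupancy observations; the parallel simulation then always works on the current map. The sketch below is illustrative only; the parameters and the update rule are assumptions, not a description of a concrete IOSB component.

```python
import numpy as np

class AdaptiveEnvironmentModel:
    """Occupancy grid that is continuously updated from incoming observations and
    shared with the simulation running in parallel with mission execution."""
    def __init__(self, shape=(100, 100), blend=0.3):
        self.occupancy = np.full(shape, 0.5)   # 0.5 = unknown
        self.blend = blend                     # how strongly new data overrides the old map

    def update(self, cells, measurements):
        """Blend occupancy measurements (0 = free, 1 = occupied) into the map."""
        for (i, j), z in zip(cells, measurements):
            self.occupancy[i, j] += self.blend * (z - self.occupancy[i, j])

model = AdaptiveEnvironmentModel()
model.update([(10, 12), (10, 13)], [1.0, 0.0])   # obstacle seen at (10, 12), free space at (10, 13)
```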

Real-time fusion of heterogeneous data and information from multiple sources

If several mobile robots are used simultaneously, there are usually many heterogeneous data sources whose data should be fused online. On the one hand, real-time fusion of all sources is needed in order to obtain as accurate and up-to-date a picture of the group's overall situation as possible. On the other hand, the available communication capacity and computing power are often not sufficient to transfer and fuse all relevant data in time, and individual robots often need only a certain part of the overall situation picture. In addition, data and information can also come from other sources, including humans if suitable interfaces are available, and have varying degrees of precision and reliability.
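Two of the aspects mentioned above, weighting sources by their reliability and sending each robot only the part of the picture it needs, can be illustrated with a short sketch. Inverse-variance weighting and a simple distance-based relevance filter are used here as stand-ins for more sophisticated fusion and dissemination strategies; all field names are assumptions.

```python
import numpy as np

def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of position estimates of the same object,
    coming from heterogeneous sources (robots, fixed sensors, human reports)."""
    positions = np.array([e["position"] for e in estimates], dtype=float)
    weights = 1.0 / np.array([e["variance"] for e in estimates], dtype=float)
    fused_position = (weights[:, None] * positions).sum(axis=0) / weights.sum()
    fused_variance = 1.0 / weights.sum()
    return fused_position, fused_variance

def relevant_subset(situation_picture, robot_position, radius):
    """Send a robot only the objects near its own position to save bandwidth."""
    robot_position = np.asarray(robot_position, dtype=float)
    return [obj for obj in situation_picture
            if np.linalg.norm(np.asarray(obj["position"]) - robot_position) <= radius]

track, track_var = fuse_estimates([
    {"position": (10.0, 4.0), "variance": 1.0},   # lidar on a ground robot
    {"position": (11.0, 5.0), "variance": 4.0},   # camera on a drone, less precise
])
```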

Conversely, humans are also interested in the current status of the mission, and the situation description intended for humans can differ greatly from the situation descriptions intended for robots. Fraunhofer IOSB researches novel approaches and concepts for this purpose.

Learning-based AI for mission management

In complex dynamic situations, which include the use of groups of mobile robots, learning systems usually have an advantage. The main difficulty in creating such systems is the small amount of suitable learning data. One of the few sources of such data is the "manual control" of missions with heterogeneous groups of mobile robots, although even experts usually cannot control complex missions with several mobile systems optimally. A further source can be rule- or simulation-based planning and control of such missions; however, the performance of such systems is still relatively rudimentary, and only "good" missions should be selected for learning. Selecting them requires new (also interactive) algorithms. Since both methods can generate only a limited number of suitable learning examples, not least because of the complexity of the task, algorithms for learning complex procedures from few examples must also be developed.
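The idea of selecting only "good" missions as learning examples can be illustrated as follows: recorded missions are ranked by a quality score, and an optional interactive reviewer (e.g. a human expert) can veto individual candidates. The score weights and field names below are purely illustrative assumptions.

```python
SCORE_WEIGHTS = {"coverage": 2.0, "duration": -1.0, "incidents": -5.0}   # illustrative criteria

def mission_score(mission):
    """Combine several quality criteria of a recorded mission into one score."""
    return sum(w * mission[key] for key, w in SCORE_WEIGHTS.items())

def select_training_missions(missions, keep_fraction=0.2, reviewer=None):
    """Keep only the best-scoring missions as learning examples; an optional
    interactive reviewer (e.g. a human expert) can veto individual candidates."""
    ranked = sorted(missions, key=mission_score, reverse=True)
    candidates = ranked[:max(1, int(len(ranked) * keep_fraction))]
    if reviewer is not None:
        candidates = [m for m in candidates if reviewer(m)]
    return candidates
```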

Hierarchical networked AI systems

Heterogeneous groups of mobile robots can basically be organized in two ways:

  • with a central instance that "directs" the group, holds most of the intelligence and takes over complex tasks, or
  • without a central instance, in which case the group members communicate with each other to solve more complex tasks.

Larger groups can be organized hierarchically and consist of several (also dynamic) subgroups, which can themselves be organized differently. Research questions include the distribution of capabilities (including the necessary computing power), dynamic roles and responsibilities, rules of behavior in different situations, and interaction with other robot groups and with humans.
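The two organization modes and their hierarchical combination can be captured in a simple data structure: a subgroup with a leader models the centralized case, while a subgroup without one models the decentralized case in which tasks are broadcast to all members for negotiation. The sketch below is purely illustrative; names and fields are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RobotGroup:
    """A (sub)group of robots; leader=None models a decentralized subgroup whose
    members coordinate among themselves."""
    name: str
    members: List[str] = field(default_factory=list)
    leader: Optional[str] = None                            # central instance, if any
    subgroups: List["RobotGroup"] = field(default_factory=list)

    def route_task(self, task):
        """Hand a task to the central instance if one exists, otherwise
        broadcast it to all members for negotiation."""
        if self.leader is not None:
            return {self.leader: task}
        return {member: task for member in self.members}

mission_group = RobotGroup(
    "mission", leader="gcs",
    subgroups=[
        RobotGroup("aerial", members=["uav1", "uav2"], leader="uav1"),   # centralized subgroup
        RobotGroup("ground", members=["ugv1", "ugv2"]),                  # decentralized subgroup
    ],
)
```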

Fields of application

In use cases such as convoy escort, rescue operations, urban reconnaissance or indoor surveillance, our mobile generic ground control station is used for reconnaissance with mobile and stationary sensors in a combined system. The control station can fuse data from a sensor network of air, land and water vehicles.

In the event of a disaster, SENEKA offers emergency and rescue forces a mobile robot and sensor network to support disaster management. Dynamically networkable sensors and robots shorten, for example, the time needed to search for victims and sources of danger.

Deep-sea mapping by a swarm of autonomous vehicles: this is the idea of the ARGGONAUTS team, with which Fraunhofer IOSB competed in the Shell Ocean Discovery XPRIZE technology competition and finished among the top five. Autonomous catamarans tow lightweight diving drones into the operational area, where the drones map the seabed on their own at depths of up to 4,000 meters.

Logic-based information fusion: a fundamental task of modern information systems is the correct filtering and fusion of data into information in order to provide decision makers, depending on their role and task, with a correct view of the increasingly complex information space.

In the collaborative project OCEAN2020, a networked maritime surveillance and reconnaissance mission of the future is being established. Especially in multinational defense operations at EU or NATO level, the exchange of surveillance and reconnaissance data is essential to be able to act quickly.