Research

Data Fusion lab: Example topics

Due to recent progress in artificial intelligence and information technology, automated and autonomous systems will soon become ubiquitous in our daily lives. In this context, our mission is to advance the state of the art in sensor data processing and fusion methods for intelligent autonomous multi-sensor systems. To deal with sensor uncertainties systematically, we investigate and develop modern probabilistic state estimation and machine learning methods. While we have extensive experience with model-based methods (e.g., for embedded real-time systems), we also consider simulation-based and data-driven techniques that learn from data.

The current focus of the group is on navigation and perception problems. Various types of sensors such as radar devices, laser scanners, RGB-D sensors, sonars, RGB cameras, and inertial sensors are considered. Research results are applied to various modern application topics, often in cooperation with partners from industry and academia.


Example Topics

Localization and Mapping

Two key tasks of any mobile autonomous system are the determination of its own position with respect to a global coordinate system and the creation of a map of a possibly unknown environment. In this context, we work on odometry, simultaneous localization and mapping (SLAM), and (indoor) positioning. For example, the BMFTR-sponsored project OKULAr considers the localization of vehicles in GNSS-denied environments.
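As an illustration of the odometry part of this task, the following minimal sketch integrates a 2D unicycle motion model (velocity and turn rate) into a pose estimate; function and parameter names are illustrative, not taken from any project mentioned above, and real systems additionally track the growing uncertainty of such dead reckoning.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Propagate a 2D pose (x, y, theta) with speed v [m/s] and turn rate omega [rad/s]."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive straight at 1 m/s for one second, integrated in 0.1 s steps.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = integrate_odometry(pose, 1.0, 0.0, 0.1)
print(pose)  # -> (1.0, 0.0, 0.0) up to floating-point rounding
```

Because integration errors accumulate without bound, odometry is typically fused with absolute position information, which is exactly what SLAM and (indoor) positioning provide.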


(Multi-Object) Tracking

In order to safely move through a dynamic world, an autonomous system needs to be aware of all moving objects in its surroundings. The determination of the number and properties (e.g., position and velocity) of multiple moving objects is called (multi-object) tracking. In this context, we work on data association problems and extended object tracking, i.e., the simultaneous estimation of shape and motion. For example, we develop methods for recursively estimating elliptical shape approximations of spatially extended objects based on point cloud measurements.
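To give an idea of what an elliptical shape approximation looks like, the sketch below summarizes a 2D point cloud by its sample mean and covariance and reads off the ellipse axes from the covariance eigendecomposition; recursive estimators in the literature (e.g., random-matrix approaches) maintain such statistics over time, and the function name and the scale factor here are illustrative assumptions.

```python
import numpy as np

def ellipse_from_points(points, scale=3.0):
    """Summarize a 2D point cloud as an ellipse (center, half-axes, orientation).

    The half-axis lengths are scaled square roots of the covariance
    eigenvalues; 'scale' is a chosen confidence factor, not a fixed rule.
    """
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    half_axes = scale * np.sqrt(eigvals)
    angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis direction
    return center, half_axes, angle

# Synthetic lidar-like point cloud: elongated along x, centered at (2, 1).
rng = np.random.default_rng(0)
cloud = rng.normal([2.0, 1.0], [1.0, 0.2], size=(500, 2))
center, half_axes, angle = ellipse_from_points(cloud)
```

A single batch fit like this is the building block; the recursive setting updates center and covariance with each new scan while the object moves.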


Collective Perception

A network of spatially distributed sensors extends the field of view of the environment. Furthermore, (heterogeneous) data can be collected from different perspectives. For example, we work on track-to-track association for collective perception while taking communication constraints into account. An article (in German) about recent results within the project Zukunftslabor Mobilität can be found here.
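The core of track-to-track association is deciding which locally estimated tracks and remotely received tracks refer to the same physical object. The following greedy nearest-neighbour sketch with a distance gate is a simplified stand-in: the function name and gate value are illustrative, and practical systems use statistical distances (e.g., Mahalanobis) and optimal assignment rather than greedy matching.

```python
import math

def associate_tracks(local_tracks, remote_tracks, gate=2.0):
    """Pair each local track with the nearest unused remote track within the gate.

    Tracks are given as 2D positions; unmatched tracks remain separate.
    """
    pairs = []
    used = set()
    for i, lt in enumerate(local_tracks):
        best, best_dist = None, gate
        for j, rt in enumerate(remote_tracks):
            if j in used:
                continue
            d = math.dist(lt, rt)
            if d < best_dist:
                best, best_dist = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

local = [(0.0, 0.0), (10.0, 0.0)]
remote = [(9.5, 0.2), (0.3, -0.1), (50.0, 50.0)]
print(associate_tracks(local, remote))  # -> [(0, 1), (1, 0)]
```

The far-away remote track (50, 50) stays unassociated, which is the desired behaviour: the gate prevents fusing tracks that cannot plausibly belong to the same object.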


Indoor Drones

Indoor sensing with drones is challenging as propeller-based systems are noisy and potentially dangerous. For this reason, we develop balloon-based robots that move directly under the ceiling without any propellers. Further details are available at the web page RoboBalloon.

Further Links