Topics
- Cooperative fusion for tracking of pedestrians from a moving vehicle
- Radar-Video sensor fusion
- LIDAR-Radar-Video sensor fusion
- Radar pedestrian detection using deep learning
- Multimodal vision
- 2D multi-modal video fusion for wide-angle environment perception
- Visible-thermal video enhancement for detection of road users
- Automotive High Dynamic Range (HDR) imaging
- Classic multi-exposure HDR reconstruction
- Intelligent HDR tone mapping for traffic applications
- Learning-based HDR video reconstruction and tone mapping
- Efficient multi-sensor data annotation tool
- Point-cloud processing
- Fast low-level point-cloud processing
- Point-cloud based object detection and tracking
- Environment mapping and odometry
- Liborg - Lidar based mapping
- LIDAR based odometry
- Monocular visual odometry
- Automotive occupancy mapping
- Obstacle detection based on 3D imaging
- Real-time sensor data processing for autonomous vehicles using Quasar - demo video
Cooperative fusion for tracking of pedestrians from a moving vehicle
Autonomous vehicles need to be able to detect other road users and roadside hazards at all times and in all conditions.
Topics
- Networked sensors
- Sensor networks and methods for wellness monitoring of the elderly
- Collaborative Tracking in Smart Camera Networks
- Distributed Camera Networks
- Multi Camera Networks
- 3D reconstruction using multiple cameras
- Real-time video mosaicking
- Scene and human behavior analysis
- Foreground/background segmentation for dynamic camera viewpoints
- Foreground/background segmentation
- Automatic analysis of the worker's behaviour
- Gesture Recognition
- Behaviour analysis
- Immersive communication by means of computer vision (iCocoon)
- Material Analysis using Image Processing
Sensor networks and methods for wellness monitoring of the elderly
Addressing the challenges of a rapidly ageing population has become a priority for many Western countries. Our aim is to relieve the pressure on nursing homes' limited capacity by developing an affordable, round-the-clock monitoring solution that can be used in assisted living facilities. This intelligent solution empowers older people to live (semi-)autonomously for longer by alerting their caregivers when assistance is required.
Topics
- Real-time monitoring in Additive Manufacturing
Real-time monitoring in Additive Manufacturing
In additive manufacturing, items are 3D printed layer by layer using materials such as plastics, polymers, and metals. Unfortunately, instabilities in the printing process can produce defects such as cracks, warping, and pores/voids within the printed item. Our goal at IPI is to develop computer vision systems that identify the creation of these defects in real time, then provide the 3D printer with sufficient information to intervene in a way that corrects, or avoids, the defect. Since these defects can occur over a very short time span (< 1 ms), our monitoring systems need to operate at very high speeds while also providing accurate results.
Melt pool monitoring: GPU-based real-time detection of pore defects using dynamic features and machine learning
Using high-speed cameras and photodiodes (sampling rates > 20 kHz), we are exploring AI models that can highlight defect creation.
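As a rough illustration of this kind of pipeline (a minimal sketch only, not IPI's actual system: the sampling rate, window length, feature set and classifier are all assumptions), the fragment below extracts simple per-window "dynamic" features from a photodiode stream and feeds them to a lightweight classifier that scores each window for possible pore formation:

```python
# Minimal, illustrative sketch (not IPI's pipeline): sliding-window "dynamic"
# features from a photodiode stream, scored by a lightweight classifier.
# All numeric values and the model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 20_000   # assumed photodiode sampling rate [Hz]
WIN = 16      # samples per analysis window (0.8 ms at 20 kHz, below the < 1 ms defect time scale)

def dynamic_features(signal: np.ndarray) -> np.ndarray:
    """Per-window statistics describing fast intensity fluctuations."""
    n = len(signal) // WIN
    w = signal[:n * WIN].reshape(n, WIN)
    d = np.diff(w, axis=1)
    return np.column_stack([
        w.mean(axis=1),                 # mean emission level
        w.std(axis=1),                  # intensity spread
        np.abs(d).mean(axis=1),         # mean absolute sample-to-sample change
        w.max(axis=1) - w.min(axis=1),  # peak-to-peak range
    ])

# Toy training data standing in for labelled "stable" and "defective" segments.
rng = np.random.default_rng(0)
stable = rng.normal(1.0, 0.05, FS)                                  # one second of stable melt pool signal
spiky = rng.normal(1.0, 0.05, FS) + 0.5 * (rng.random(FS) < 0.01)   # occasional short intensity spikes

X = np.vstack([dynamic_features(stable), dynamic_features(spiky)])
y = np.concatenate([np.zeros(FS // WIN), np.ones(FS // WIN)])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print("defect probability per window:", clf.predict_proba(dynamic_features(spiky))[:5, 1])
```

In a real-time setting the feature extraction and inference would run on the GPU under a much tighter latency budget; the point here is only the structure: windowed dynamic features followed by a per-window defect score.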
The SafeNav maritime safety project promises a path towards safer and more secure navigation, first for the navigator on the bridge today and later for remote-operated and autonomous shipping. One key aspect of boosting maritime safety is the accurate and efficient detection and tracking of vessels, floating objects and marine mammals, in order to avoid navigational hazards such as collisions and the resulting damage to ships, crews and the marine environment.
In SafeNav, IPI will use innovative sensor setups, including cameras, to address the main challenge: improving detection performance in difficult conditions such as distant or semi-submerged marine animals or containers, wave crests and sun glitter, and poor weather.
EU-HORIZON, 9/2022 - 8/2025
The SeaDetect project, part of Europe's LIFE initiative, aims to halt the biodiversity loss caused by collisions between ships and cetaceans by developing and implementing new technologies. To considerably reduce this collision risk and protect marine life and biodiversity, the project is developing two innovative, complementary systems. The first is a ship-borne detection system composed of multiple highly sensitive sensors, whose data will be fused and processed with artificial intelligence in order to detect cetaceans up to 1 km away. The second is a network of Passive Acoustic Monitoring (PAM) buoys that will detect and triangulate the position of cetaceans in real time, in order to prevent collisions for all vessels along common shipping routes.
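Purely as an illustration of the principle behind the second system (a hedged sketch with made-up buoy positions, sound speed and noise levels; not the SeaDetect implementation), the following shows how a source position could be triangulated from time-difference-of-arrival (TDOA) measurements at several PAM buoys:

```python
# Illustrative TDOA triangulation sketch (assumption, not the SeaDetect system):
# estimate an acoustic source position from arrival-time differences at buoys.
import numpy as np
from scipy.optimize import least_squares

C = 1500.0  # approximate speed of sound in seawater [m/s]

buoys = np.array([[0.0, 0.0], [4000.0, 0.0], [0.0, 4000.0], [4000.0, 4000.0]])  # assumed buoy layout [m]
source = np.array([2600.0, 1200.0])                                             # true (unknown) source position

# Simulated TDOAs relative to buoy 0, with a little timing noise.
rng = np.random.default_rng(1)
dists = np.linalg.norm(buoys - source, axis=1)
tdoa = (dists[1:] - dists[0]) / C + rng.normal(0.0, 1e-4, len(buoys) - 1)

def residuals(p):
    """Difference between predicted and measured TDOAs for a candidate position p."""
    d = np.linalg.norm(buoys - p, axis=1)
    return (d[1:] - d[0]) / C - tdoa

est = least_squares(residuals, x0=np.array([2000.0, 2000.0])).x
print("estimated source position [m]:", est)
```

A real buoy network would also have to handle clock synchronisation, refraction and multipath propagation, which this toy example ignores.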
IPI's role in the project is to develop novel detection algorithms based on raw data fusion to improve the detection capabilities of the on-board systems.
EU-LIFE, 9/2022 - 8/2026
The goal of VISION2REUSE is to demonstrate the potential of smart cameras for automatic quality monitoring of reusable packaging in the food and packaging industry. Using these camera technologies and state-of-the-art machine learning, the system will determine quickly and accurately whether a given piece of packaging is still suitable for another reuse cycle or should be sent to a dedicated end-of-life stream (e.g. recycling).
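To make the decision step concrete (an illustrative sketch only, with an assumed backbone, class labels and threshold; not the VISION2REUSE system), a reuse-versus-end-of-life decision on a single inspection image could look like this:

```python
# Minimal sketch (assumption, not the VISION2REUSE system): binary
# reuse / end-of-life decision on one packaging image with a pretrained backbone.
import torch
from torchvision import models, transforms
from PIL import Image

# Pretrained backbone with a 2-class head (reuse vs. end-of-life).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.eval()  # in practice the head would first be fine-tuned on labelled packaging images

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def assess_packaging(image_path: str) -> str:
    """Return 'reuse' or 'end-of-life' for one inspection image (hypothetical helper)."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1)[0]
    return "reuse" if probs[0] > 0.5 else "end-of-life"
```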
REACT-EU EFRO, 1/2022 - 12/2023