POWER OPTIMIZED CO-REGISTRATION OF MULTISPECTRAL AND THERMAL CAMERAS FOR REAL-TIME URBAN PLUVIAL FLOOD MONITORING
To address this challenge, our team is developing a low-power distributed sensor network with a camera-based flood mapping platform, the Urban Flood Observation Network (UFO-Net), which provides real-time monitoring and automatic high-water-mark mapping. The platform carries two cameras: a multispectral camera capturing red, green, blue, and near-infrared (NIR) bands, and a long-wave infrared (LWIR) thermal camera. The multispectral camera differentiates land cover types and water bodies, while the thermal camera detects the temperature contrasts associated with surface water. This research focuses on a critical component of the system: a rapid, replicable workflow for co-registering the optical and thermal camera fields of view in a power-constrained environment. Accurate co-registration ensures precise data fusion and improves the system's performance, reliability, and utility for flood monitoring and management.
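As a rough illustration of what such a multimodal co-registration step might look like in code (using the SimpleITK toolkit described in the next paragraph), the sketch below aligns an offset thermal frame to a multispectral reference frame with a mutual-information metric, which tolerates the very different intensity statistics of optical and thermal imagery. The synthetic frames, transform choice, and parameter values are illustrative assumptions, not the deployed configuration.

```python
import numpy as np
import SimpleITK as sitk

# Synthetic stand-ins for a multispectral reference frame (fixed) and a
# misaligned LWIR frame (moving); in the deployed system these would come
# from the two cameras on the UFO-Net platform.
yy, xx = np.mgrid[0:240, 0:320].astype(np.float32)
scene = np.exp(-(((xx - 160) ** 2 + (yy - 120) ** 2) / (2 * 40.0 ** 2)))
fixed = sitk.GetImageFromArray(scene)                                    # e.g., NIR band as reference
moving = sitk.GetImageFromArray(np.roll(scene, (8, -12), axis=(0, 1)))   # thermal frame, offset

# Multimodal registration via iterative optimization of a rigid 2D transform.
# All parameter values below are placeholders for illustration only.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.2)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=100)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler2DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY),
    inPlace=False)

transform = reg.Execute(fixed, moving)

# Resample the thermal frame onto the multispectral pixel grid for data fusion.
registered_lwir = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
print(transform)
```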
To achieve these goals, we employ SimpleITK, an open-source toolkit that estimates image transformations through iterative optimization. Our end goal is to generate five-band (red, green, blue, NIR, LWIR) images directly at the edge on a Raspberry Pi 4 and classify inundated versus non-inundated areas, producing a binary bitmap for flood mapping. The registration algorithm must also adapt to changing sensor conditions and camera movement while remaining computationally and power efficient. This optimized registration workflow supports the development of a robust, real-time flood detection system tailored to urban environments.
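Once the LWIR frame has been resampled onto the multispectral pixel grid, the bands can be stacked and classified per pixel. The sketch below stands in for that step with a hard-coded NDWI-plus-thermal rule; the band order, thresholds, and the assumption that surface water appears cooler than the scene mean are illustrative placeholders rather than the classifier under development.

```python
import numpy as np

def flood_bitmap(stack: np.ndarray, ndwi_thresh: float = 0.2) -> np.ndarray:
    """Classify inundated pixels from a five-band (R, G, B, NIR, LWIR) stack.

    `stack` has shape (5, H, W); the rule and threshold below are
    illustrative placeholders, not the project's trained classifier.
    """
    green, nir, lwir = stack[1], stack[3], stack[4]
    # NDWI: water reflects strongly in green and weakly in NIR.
    ndwi = (green - nir) / (green + nir + 1e-6)
    # Combine the spectral cue with a simple thermal cue (pixels cooler than
    # the scene mean), yielding a binary flood-extent bitmap.
    return ((ndwi > ndwi_thresh) & (lwir < lwir.mean())).astype(np.uint8)

if __name__ == "__main__":
    # Synthetic data standing in for registered camera frames.
    rng = np.random.default_rng(0)
    demo_stack = rng.random((5, 120, 160), dtype=np.float32)
    bitmap = flood_bitmap(demo_stack)
    print(bitmap.shape, int(bitmap.sum()), "pixels flagged as water")
```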