UNREALFIRE: A SYNTHETIC DATASET CREATION PIPELINE FOR ANNOTATED FIRE IMAGERY IN UNREAL ENGINE
Authors/Creators
Description
High-quality training data are essential for Deep Neural Network (DNN) training. In Natural Disaster Management (NDM) scenarios, annotated training data are needed to train DNN models, e.g., for wildfire detection/segmentation. However, image annotation in such scenarios is prone to annotation errors, mostly due to the unpredictable visual structure of fire and smoke. To this end, photorealistic simulators hold substantial promise, since they allow the creation of synthetic wildfire images. Yet, existing assets depicting fire in simulator engines are typically inserted as particle objects. As a result, such assets lack a fixed 3D mesh and therefore have no well-defined 2D projection, i.e., it is not trivial to generate fire segmentation annotation maps from them. This paper presents a free, open-access pipeline for creating diverse synthetic annotated wildfire image datasets. More specifically, we developed a novel particle segmentation camera for the AirSim plugin, which enables the generation of segmentation maps of objects made of particles. We also integrate Procedural Content Generation (PCG) tools to gather unlimited amounts of diverse, high-quality annotated training data. To evaluate our framework, we generated a sample fire dataset called AUTH-Unreal-Wildfire (AUW) for wildfire segmentation. In our experiments we use a state-of-the-art segmentation DNN, namely PIDNet, compare our synthetic wildfire images to different real image datasets, and evaluate their potential to augment real wildfire datasets.
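The particle segmentation camera produces per-pixel segmentation maps from which binary fire annotation masks can be derived. A minimal sketch of that last step, assuming fire particles are rendered in a single known label color (the color value and function name below are illustrative, not taken from the released pipeline):

```python
import numpy as np

# Hypothetical label color assigned to fire particles by the segmentation camera.
FIRE_COLOR = (255, 0, 0)

def fire_mask(seg_image: np.ndarray, color=FIRE_COLOR) -> np.ndarray:
    """Return a boolean HxW mask marking pixels whose RGB value matches `color`."""
    return np.all(seg_image == np.array(color, dtype=seg_image.dtype), axis=-1)

# Toy 2x2 segmentation map: only the top-left pixel belongs to the fire class.
seg = np.zeros((2, 2, 3), dtype=np.uint8)
seg[0, 0] = FIRE_COLOR
mask = fire_mask(seg)
```

Such a mask can then be saved alongside the rendered RGB frame as the ground-truth annotation for a segmentation DNN.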
Files
SPATHARIS_ICIP_2025 (1).pdf (5.6 MB)
md5:6408801c453d61c5d9e274a0940d505d