Multimodal Fusion of Satellite and Sensor Data for Enhanced Extreme Weather Prediction

  • Tech Stack: TensorFlow, Keras, PyTorch, Hugging Face Transformers, NLTK, BERT
  • GitHub URL: Project Link

We propose a multimodal fusion approach that integrates satellite observations with high-resolution meteorological data to enhance extreme weather event prediction. Our methodology involves curating a comprehensive dataset by combining NOAA storm event records, Open-Meteo sensor data, and NASA GIBS satellite imagery.
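As a minimal sketch of the sensor-data side of this curation step, the snippet below pulls hourly observations from the Open-Meteo historical archive API. The endpoint and variable names follow Open-Meteo's public documentation, but the coordinates, date range, and chosen variables are illustrative placeholders, not the project's exact configuration.

```python
import requests
import pandas as pd

def fetch_open_meteo(lat, lon, start, end):
    """Fetch hourly weather variables for one location as a DataFrame."""
    resp = requests.get(
        "https://archive-api.open-meteo.com/v1/archive",
        params={
            "latitude": lat,
            "longitude": lon,
            "start_date": start,  # e.g. "2020-01-01"
            "end_date": end,      # e.g. "2020-12-31"
            "hourly": "temperature_2m,precipitation,wind_speed_10m,surface_pressure",
        },
        timeout=30,
    )
    resp.raise_for_status()
    hourly = resp.json()["hourly"]
    df = pd.DataFrame(hourly)
    df["time"] = pd.to_datetime(df["time"])
    return df

# Example: one year of hourly observations near Oklahoma City.
sensors = fetch_open_meteo(35.47, -97.52, "2020-01-01", "2020-12-31")
```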

Extensive preprocessing steps, including temporal alignment, spatial gridding, and feature extraction, were applied to build a unified dataset.
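The sketch below illustrates one way the temporal alignment and spatial gridding steps could work: storm events are snapped to the hourly sensor timeline, and both sources are keyed to a coarse latitude/longitude grid before joining. The grid resolution and column names (`lat`, `lon`, `begin_time`, `time`) are assumptions for illustration.

```python
import pandas as pd

GRID_DEG = 0.5  # assumed grid resolution in degrees

def to_grid(df, lat_col="lat", lon_col="lon"):
    """Snap coordinates to a coarse grid so both sources share a spatial key."""
    df = df.copy()
    df["grid_lat"] = (df[lat_col] / GRID_DEG).round() * GRID_DEG
    df["grid_lon"] = (df[lon_col] / GRID_DEG).round() * GRID_DEG
    return df

def align(events, sensors):
    """Left-join hourly sensor readings onto storm events by grid cell and hour."""
    events = to_grid(events)
    sensors = to_grid(sensors)
    # Temporal alignment: floor both timestamps to the shared hourly resolution.
    events["hour"] = events["begin_time"].dt.floor("h")
    sensors["hour"] = sensors["time"].dt.floor("h")
    return events.merge(sensors, on=["grid_lat", "grid_lon", "hour"], how="left")
```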

We then trained machine learning models (Random Forest, Gradient Boosting, and LSTM) leveraging both tabular and image-derived features. Experimental results demonstrate that multimodal fusion significantly outperforms single-source models across multiple evaluation metrics. This study highlights the critical role of data fusion in improving the robustness and reliability of extreme weather predictions, and it paves the way for future work on explainable multimodal deep learning frameworks.
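To make the fusion idea concrete, here is an illustrative late-fusion architecture in Keras: a small CNN branch embeds a satellite image patch, a dense branch encodes the aligned tabular features, and the two embeddings are concatenated before the classification head. Layer sizes, the patch shape, the feature count, and the number of event classes are placeholders, not the exact configuration used in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_fusion_model(img_shape=(64, 64, 3), n_tabular=16, n_classes=4):
    # CNN branch: embeds a satellite image patch into a feature vector.
    img_in = layers.Input(shape=img_shape, name="satellite_patch")
    x = layers.Conv2D(32, 3, activation="relu")(img_in)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)

    # Tabular branch: dense encoding of the aligned sensor/event features.
    tab_in = layers.Input(shape=(n_tabular,), name="sensor_features")
    t = layers.Dense(64, activation="relu")(tab_in)

    # Fusion: concatenate both embeddings before the classification head.
    fused = layers.Concatenate()([x, t])
    fused = layers.Dense(64, activation="relu")(fused)
    out = layers.Dense(n_classes, activation="softmax", name="event_type")(fused)

    model = Model(inputs=[img_in, tab_in], outputs=out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Late fusion of this kind keeps each modality's encoder simple and makes it easy to ablate a branch, which is one common way to quantify how much the multimodal model gains over single-source baselines.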