SL-MOE: A TWO-STAGE MIXTURE-OF-EXPERTS SEQUENCE LEARNING FRAMEWORK FOR FORECASTING RAPID INTENSIFICATION OF TROPICAL CYCLONE

Jian Xu (Beijing University of Posts and Telecommunications); Yang Lei (Beijing University of Posts and Telecommunications); Guangqi Zhu (Beijing University of Posts and Telecommunications); Yunling Feng (Beijing University of Posts and Telecommunications); Bo Xiao (Beijing University of Posts and Telecommunications); Qifeng Qian (National Meteorological Center of China); Yajing Xu (Beijing University of Posts and Telecommunications)

06 Jun 2023

Forecasting rapid intensification (RI) of tropical cyclones (TC) is an important and challenging task. However, existing RI forecast methods, whether dynamical-statistical models or machine learning methods, pay little attention to the imbalanced distribution of RI events. RI prediction is inherently a class-imbalanced problem: because RI is contingent, the class distribution between positive (RI) and negative (non-RI) samples fluctuates greatly from year to year. To address these issues, we propose a novel two-stage mixture-of-experts sequence learning framework (SL-MoE) that handles the imbalanced class distribution with two decoupled learning stages, thereby boosting RI forecasting: (1) in the representation learning stage, a shared sequence learning backbone is trained to extract general features from class-imbalanced data, and a TC life flag is included to lessen the impact of the TC dying period; (2) in the mixture-of-experts learning stage, we train diverse experts with different losses and obtain their weights via self-supervised weight learning to handle varying test class distributions arising from a single long-tailed training distribution. In addition, we introduce TCHF, a high-resolution multi-satellite image dataset. Extensive experiments show that our method achieves new state-of-the-art performance on the TCHF and TCIR datasets.
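The two-stage idea described above — a shared backbone feeding several experts whose outputs are combined with learned mixture weights — can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: all function names, shapes, and the choice of a linear backbone with sigmoid expert heads are assumptions.

```python
import numpy as np

def shared_backbone(x, w):
    """Stage 1 (sketch): a shared feature extractor, here one linear layer + tanh."""
    return np.tanh(x @ w)

def expert(features, w):
    """One expert head producing an RI probability via a sigmoid."""
    logits = features @ w
    return 1.0 / (1.0 + np.exp(-logits))

def moe_predict(x, backbone_w, expert_ws, mix_weights):
    """Stage 2 (sketch): combine expert outputs with softmax mixture weights."""
    feats = shared_backbone(x, backbone_w)
    preds = np.stack([expert(feats, w) for w in expert_ws])  # (n_experts, n_samples)
    mix = np.exp(mix_weights) / np.exp(mix_weights).sum()    # softmax over experts
    return (mix[:, None] * preds).sum(axis=0)                # weighted average

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                          # 4 samples, 8 input features
backbone_w = rng.normal(size=(8, 16))                # shared backbone weights
expert_ws = [rng.normal(size=16) for _ in range(3)]  # 3 experts with distinct heads
mix_weights = np.zeros(3)                            # uniform mixture before learning
p = moe_predict(x, backbone_w, expert_ws, mix_weights)
print(p.shape)  # one RI probability per sample
```

In the paper the mixture weights are obtained by self-supervised weight learning at test time rather than fixed, so that a single long-tailed training distribution can serve differently balanced test distributions.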
