Evaluate Transformer model and Self-Attention mechanism in the Yangtze River basin runoff prediction

Xikun Wei, Guojie Wang, Britta Schmalz, Daniel Fiifi Tawia Hagan, Zheng Duan

Research output: Contribution to journal › Article › peer-review

Abstract

Study region: The Yangtze River basin, China.

Study focus: We applied a recently popular deep learning (DL) algorithm, the Transformer (TSF), and two widely used DL methods, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), to evaluate the performance of TSF in predicting runoff in the Yangtze River basin. We also added the core component of TSF, the Self-Attention (SA) mechanism, to the LSTM and GRU models (denoted LSTM-SA and GRU-SA) to investigate whether including SA improves prediction capability. Seven climatic observations (mean temperature, maximum temperature, precipitation, etc.) serve as the input data in our study. The whole dataset was divided into training, validation and test sets. In addition, we investigated the relationship between model performance and the input time step.

New hydrological insights for the region: Our experimental results show that GRU performs best with the fewest parameters, while TSF performs worst owing to insufficient data. The GRU and LSTM models outperform TSF for runoff prediction when training samples are limited (e.g., when the number of model parameters is ten times the number of samples). Furthermore, the SA mechanism improves prediction accuracy when added to the LSTM and GRU structures. Different input time steps (5 d, 10 d, 15 d, 20 d, 25 d and 30 d) were used to train the DL models with different prediction lengths; the results show that an appropriate input time step can significantly improve model performance.
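The SA mechanism the abstract describes adding to the recurrent models can be sketched as scaled dot-product self-attention applied to the hidden-state sequence an LSTM or GRU produces over the input window. This is a minimal illustrative sketch, not the authors' code: the projection matrices, window length and hidden size are assumptions chosen only to show the shapes involved.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(h, w_q, w_k, w_v):
    """h: (T, d) hidden states from an LSTM/GRU over T input time steps.

    Returns a (T, d) sequence in which each step is a weighted mix of all
    steps, with weights learned via the query/key projections.
    """
    q, k, v = h @ w_q, h @ w_k, h @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (T, T) pairwise relevance
    return softmax(scores) @ v                # (T, d) re-weighted states

# Hypothetical sizes: a 15-day input window, 32 recurrent hidden units.
T, d = 15, 32
h = rng.standard_normal((T, d))               # stand-in for LSTM/GRU output
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(h, w_q, w_k, w_v)
print(out.shape)  # (15, 32)
```

In an LSTM-SA/GRU-SA setup of this kind, the attended sequence (or a pooled version of it) would feed the final prediction layer, letting the model re-weight the days in the input window rather than relying only on the last hidden state.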

Original language: English
Article number: 101438
Journal: Journal of Hydrology: Regional Studies
Volume: 47
DOIs
Publication status: Published - 2023

Subject classification (UKÄ)

  • Bioinformatics (Computational Biology)
  • Oceanography, Hydrology, Water Resources

Free keywords

  • GRU
  • LSTM
  • Runoff prediction
  • Self-Attention
  • Transformer
