Comparison between rainfall simulator erosion and observed reservoir sedimentation in an erosion-sensitive semiarid catchment
Research output: Contribution to journal › Article
Estimating catchment scale soil loss based on rainfall simulators is often hampered by the difficulty of scaling up simulator results. Our objective was to develop and test a method for estimating catchment scale soil loss based on observed rainfall, using a variable intensity rainfall simulator in an erosion-sensitive catchment in semiarid Tunisia. A 7-year period, 1992-1999, with observed sedimentation amounts in a downstream reservoir, was chosen to test the methodology. The methodology was based on (1) an energy adjustment for the simulator used, to account for the difference in kinetic energy between simulated and natural rainfall at equal intensities, and (2) upscaling of simulated erosion, in which rill erosion was estimated by adjusting for the difference in slope length between the plots and the catchment after the onset of runoff. The comparison between soil loss calculated from the rainfall simulator experiments and observed sedimentation in the downstream reservoir displayed good overall results. Calculated soil loss amounted to about 96%, 36%, and 80% of observed sedimentation for the three subperiods, respectively. The low value for the second period was probably due to an exceptionally intense rainfall event during this period, which appears to have led to gully erosion, soil slides, and riverbank collapse. During this event, siltation in the reservoir may therefore have been caused essentially by unaccounted erosion processes such as gully erosion. Overall, however, it appears that plot-scale variable intensity rainfall simulators can rather successfully estimate catchment scale soil losses. © 2002 Elsevier Science B.V. All rights reserved.
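The two adjustment steps described in the abstract can be sketched numerically. The sketch below is a hypothetical illustration, not the authors' actual procedure: it assumes the Brown and Foster (1987) kinetic energy-intensity relation for natural rainfall, a simple linear energy scaling, and a USLE-style slope-length factor; the simulator energy value and slope lengths are placeholder inputs.

```python
import math

def natural_rain_energy(intensity_mm_h):
    """Unit kinetic energy of natural rainfall (MJ ha^-1 mm^-1),
    using the Brown & Foster (1987) relation as an illustrative choice."""
    return 0.29 * (1.0 - 0.72 * math.exp(-0.05 * intensity_mm_h))

def energy_adjusted_loss(plot_loss, simulator_energy, intensity_mm_h):
    """Step (1): scale plot soil loss by the ratio of natural to simulated
    unit kinetic energy at the same intensity (hypothetical linear scaling)."""
    return plot_loss * natural_rain_energy(intensity_mm_h) / simulator_energy

def slope_length_factor(length_m, m=0.5):
    """USLE-style L-factor relative to the 22.13 m standard plot length;
    the exponent m = 0.5 is an assumed value."""
    return (length_m / 22.13) ** m

def upscaled_loss(plot_loss, plot_length_m, hillslope_length_m):
    """Step (2): adjust for the longer slope lengths in the catchment,
    which allow rill erosion to develop after the onset of runoff."""
    return (plot_loss * slope_length_factor(hillslope_length_m)
            / slope_length_factor(plot_length_m))

# Example with placeholder values: 1.0 kg m^-2 measured on a 2 m plot,
# simulator delivering 0.20 MJ ha^-1 mm^-1 at 50 mm h^-1 intensity,
# representative hillslope length of 60 m.
loss = energy_adjusted_loss(1.0, 0.20, 50.0)
loss = upscaled_loss(loss, 2.0, 60.0)
```

The energy ratio corrects each simulated intensity class before the slope-length upscaling is applied, so the two factors multiply; the catchment total would then be obtained by area-weighting such estimates over the catchment.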
Publication status: Published - 2002