Winter Wheat Harvest Prediction Using Primarily Satellite Radar Data from Sentinel-1

University essay from Lunds universitet/Matematik LTH

Abstract: Aiding farmers in their tremendous task of sustainably and cost-efficiently feeding the world is of utmost importance. Information technology plays a crucial role in supporting farmers and supplying them with accurate information about their crops. This information gives farmers operational benefits, such as optimized timing of fertilization and plant protection. Satellite optical imagery from the Sentinel-2 satellites has been used to predict harvest but is severely hampered by cloud cover, which is not the case for Synthetic Aperture Radar (SAR) backscatter from the Sentinel-1 satellites. In our thesis, we primarily used Sentinel-1 satellite data, complemented by weather, topography, and elevation data, to predict the winter wheat harvest on selected fields in Skåne, a region in southern Sweden. We compared the performance of two machine learning models: Light Gradient-Boosting Machines (LGBM) and Feedforward Neural Networks. Our results show that Sentinel-1 data contains valuable information for winter wheat harvest prediction, achieving a best RMSE of 0.74 tonnes per hectare using all data and LGBM. We also tested different resolutions of the harvest data grid; to our surprise, the lower-resolution grid outperformed the higher-resolution grid. Furthermore, we tested whether transfer learning between years was possible. In general, we could not achieve transfer learning, as the harvest data distribution varied greatly between seasons. Further work is needed to investigate why the lower-resolution grid outperformed the higher-resolution grid and to model the varying harvest distribution. Moreover, combining Sentinel-1 and Sentinel-2 data might lead to better results, since Sentinel-1 backscatter may contain information that cannot be derived from Sentinel-2 optical imagery and vice versa.
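To illustrate the kind of modelling the abstract describes, the sketch below fits an LGBM regressor on a per-grid-cell feature table and reports RMSE in tonnes per hectare. It is a minimal illustration, not the thesis code: the file name, column names, and hyperparameters are assumptions, standing in for the Sentinel-1 backscatter, weather, and elevation features and the yield targets described above.

```python
# Hypothetical sketch: LGBM regression of winter wheat yield from per-grid-cell
# features (e.g. Sentinel-1 VV/VH backscatter statistics, weather, elevation).
# File and column names are illustrative assumptions, not the thesis pipeline.
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

df = pd.read_csv("field_grid_features.csv")  # hypothetical feature table
feature_cols = [c for c in df.columns if c != "yield_t_ha"]

X_train, X_test, y_train, y_test = train_test_split(
    df[feature_cols], df["yield_t_ha"], test_size=0.2, random_state=42
)

model = LGBMRegressor(n_estimators=500, learning_rate=0.05)
model.fit(X_train, y_train)

# Evaluate in the same unit as the thesis (tonnes per hectare);
# the thesis reports a best RMSE of 0.74 t/ha with LGBM on all data.
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE: {rmse:.2f} t/ha")
```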
