Deep Learning Based High-Resolution Crop Mapping
This study uses Sentinel-2 time series data to extract vegetation phenology metrics, which are then used to classify crops.
In this presentation, we cover high-resolution crop mapping using Sentinel-2 time series and vegetation phenology. We begin with the Motivation behind pursuing such technology, emphasizing the pressing need for accurate crop monitoring and its potential impact on agricultural productivity and sustainability. Next, we address the Challenges faced in the process, including data availability, data quality, and computational complexity. The Data Preparation phase is then discussed, highlighting the steps involved in curating and preprocessing the Sentinel-2 satellite imagery for analysis. We then present the Methodology, which applies deep neural networks to crop classification. We explore how Crop Phenology plays a vital role in this context and the significance of tracking vegetation growth stages over time. The extraction of Phenology Metrics is then explained, shedding light on the key indicators used to differentiate crop types. Finally, the presentation concludes by showcasing the potential of this approach for precision agriculture and its implications for sustainable food production and resource management.
The motivation behind crop classification stems from the critical importance of agriculture as the backbone of our global food supply. With a growing population and increasing food demand, ensuring food security has become a paramount challenge. Accurate crop classification plays a pivotal role in effective agricultural management, allowing policymakers, farmers, and researchers to make informed decisions regarding crop planning, resource allocation, and risk assessment. Additionally, precision crop classification aids in optimizing the use of fertilizers, water, and pesticides, leading to reduced environmental impact and promoting sustainable farming practices. Leveraging advanced technologies like deep learning and high-resolution satellite imagery, crop classification endeavors to enhance our understanding of agricultural dynamics, foster more efficient practices, and ultimately contribute to the goal of achieving a resilient and sustainable food production system for the future.
Crop classification encounters various challenges that must be effectively addressed to ensure its success and reliability. Firstly, obtaining high-quality and consistent remote sensing data, such as Sentinel-2 imagery, proves challenging in regions with limited satellite coverage or frequent cloud cover. The availability of ground truth data for accurate training and validation can also be scarce or outdated for certain regions and crop types. Moreover, preprocessing the satellite imagery to remove noise, correct atmospheric effects, and normalize data from different sources is complex and time-consuming. The spectral variability among different crops, coupled with their similar spectral signatures, poses difficulties in accurately distinguishing them based on reflectance properties alone. Seasonal and temporal variations in crop appearance and phenology further contribute to classification uncertainties. Class imbalance in datasets, computational resource requirements, and challenges in model transferability across regions and years also demand special attention. Additionally, the complexity of agricultural systems on a global or large regional scale, coupled with the need for interpretable models, presents further obstacles to crop classification endeavors. Addressing these challenges is crucial for unlocking the full potential of crop classification and its applications in precision agriculture and food security.
This study uses Sentinel-2 time series to extract the EVI and NDVI vegetation indices. Each pixel of the Sentinel-2 imagery is labeled with its crop type using CropScape. Phenology metrics are then extracted from the Sentinel-2 vegetation index time series.
The crop classification process involves three key steps. In the first step, vegetation indices such as NDVI (Normalized Difference Vegetation Index) or EVI (Enhanced Vegetation Index) are extracted from the satellite imagery and assembled into a year-long time series for each pixel. In the second step, a random forest regression model predicts phenology metrics from these time series, capturing critical information about vegetation growth stages and patterns. Finally, in the third step, the extracted phenology metrics are used as input to a deep learning model, enabling crop classification based on the learned patterns and characteristics of different crop types. This approach combines remote sensing data with machine learning algorithms to achieve precise and reliable crop mapping.
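To make the second step concrete, here is a minimal sketch of predicting phenology metrics from a yearly vegetation-index series with a random forest regressor. The file names, array shapes, and hyperparameters below are illustrative assumptions, not the exact setup used in this study.

```python
# Step 2 sketch: random forest regression from yearly NDVI series to phenology metrics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# X: one row per pixel, one column per composite date (e.g., ~23 NDVI values per year).
# y: reference phenology metrics per pixel (e.g., start/peak/end of season as day-of-year).
# Both files are hypothetical placeholders for the prepared training data.
X = np.load("ndvi_time_series.npy")       # shape (n_pixels, n_dates)
y = np.load("phenology_metrics.npy")      # shape (n_pixels, n_metrics)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
rf.fit(X_train, y_train)

print("R^2 on held-out pixels:", rf.score(X_test, y_test))
```

A random forest handles the multi-output regression of several phenology metrics at once, which keeps this stage of the pipeline simple.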
Vegetation phenology is the study of the timing of plant life cycle events and their relationship with environmental factors.
Vegetation phenology refers to the study of recurring patterns and temporal changes in the life cycle of plants, particularly in response to seasonal variations and climate influences. It involves monitoring the timing of key events in a plant's life, such as bud burst, leaf development, flowering, and senescence (leaf shedding). These phenological events are closely linked to environmental factors like temperature, precipitation, and photoperiod, which affect plant growth and development. Vegetation phenology plays a crucial role in various ecological processes, agricultural practices, and climate change studies. Remote sensing technologies, such as satellite imagery and vegetation indices like NDVI (Normalized Difference Vegetation Index) or EVI (Enhanced Vegetation Index), enable monitoring of these temporal changes at regional and global scales, providing valuable insights into ecosystem dynamics and agricultural productivity.
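As a minimal sketch, NDVI and EVI can be computed from Sentinel-2 surface-reflectance bands as shown below. The band roles (B02 blue, B04 red, B08 near infrared) follow the Sentinel-2 convention, while the 1/10000 reflectance scaling and the input file names are assumptions about how the data are stored.

```python
# Computing NDVI and EVI for one acquisition date from Sentinel-2 reflectance arrays.
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-10)

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard coefficients."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Hypothetical reflectance arrays, scaled from integer Level-2A values to 0-1.
b2 = np.load("B02.npy") / 10000.0   # blue
b4 = np.load("B04.npy") / 10000.0   # red
b8 = np.load("B08.npy") / 10000.0   # near infrared

print(ndvi(b8, b4).shape, evi(b8, b4, b2).shape)
```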
NDVI variation over a one-year period
Vegetation phenology of four major crops - Corn, Cotton, Rice, and Soybean
In the phenology extraction pipeline, the start and end of the growing season are determined by identifying the intersection points between the moving average and the reference curve. The rolling average is computed using forward and backward lags, with the length of the non-growing season defining the moving-average window. To estimate the average length of the non-growing season for each input entity, the growing season length is first estimated from two consecutive minima of the NDVI signal and the signal segment representing seasonal crop growth. Assuming that seasonal crop growth follows a normal distribution, the Seasonal Growing Length (SLE) is defined by two standard deviations around the barycenter of the area, encompassing 68.2% of the statistical population. The specific lag that defines the moving-average window is then computed from this estimate using a formula not reproduced here. This approach enables accurate identification of the growing season and vegetation phenology, which is crucial for agricultural and ecological applications. Some of the strategies are adapted from https://github.com/antonkout/Phenology-Extraction/tree/main
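The sketch below illustrates the crossing-point idea under simplified assumptions: a centered moving average serves as the reference curve, the window length is fixed rather than derived from the non-growing-season estimate, and the NDVI series is synthetic.

```python
# Locating start and end of season (SOS/EOS) as crossings between NDVI and its moving average.
import numpy as np

def moving_average(signal, window):
    """Centered rolling mean using forward and backward lags (edges padded)."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")[: len(signal)]

def season_bounds(ndvi, window):
    """Return (SOS index, EOS index) where the NDVI curve crosses its moving average."""
    reference = moving_average(ndvi, window)
    above = ndvi > reference
    crossings = np.where(np.diff(above.astype(int)) != 0)[0]
    if len(crossings) < 2:
        return None, None
    return crossings[0], crossings[-1]

# Toy example: one synthetic season sampled roughly every 16 days.
days = np.arange(0, 365, 16)
ndvi = 0.2 + 0.6 * np.exp(-((days - 200) ** 2) / (2 * 40.0 ** 2))
sos, eos = season_bounds(ndvi, window=7)
print("SOS day:", days[sos], "EOS day:", days[eos])
```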
Vegetation phenology is extracted over a small region
This figure illustrates how the phenology metrics are distributed over a region.
The code snippet (reconstructed below) defines a neural network model for classification, built with PyTorch's nn.Module class. It comprises five fully connected layers, with batch normalization applied after each hidden layer to improve training stability and accelerate convergence. The model takes input data with 13 features and classifies it into one of four output classes. The hidden layers contain 256, 128, 64, and 32 neurons, respectively, with ReLU activations between layers. The final output layer, fc5, has four neurons, one per class. The forward method passes the input through the layers sequentially, applying batch normalization and ReLU where appropriate. The model is trained with the Adam optimizer and the CrossEntropyLoss function, which is well suited for multi-class classification, over 500 epochs. The model is moved to a GPU (if available) via .cuda() to speed up training.
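A sketch reconstructing the described architecture is given below. Only the layer sizes, optimizer, loss, and epoch count are stated above; the learning rate, data loader, and batch handling are assumptions.

```python
# Reconstruction sketch of the described fully connected classifier (13 features, 4 classes).
import torch
import torch.nn as nn

class CropClassifier(nn.Module):
    def __init__(self, n_features=13, n_classes=4):
        super().__init__()
        self.fc1 = nn.Linear(n_features, 256)
        self.bn1 = nn.BatchNorm1d(256)
        self.fc2 = nn.Linear(256, 128)
        self.bn2 = nn.BatchNorm1d(128)
        self.fc3 = nn.Linear(128, 64)
        self.bn3 = nn.BatchNorm1d(64)
        self.fc4 = nn.Linear(64, 32)
        self.bn4 = nn.BatchNorm1d(32)
        self.fc5 = nn.Linear(32, n_classes)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.relu(self.bn1(self.fc1(x)))
        x = self.relu(self.bn2(self.fc2(x)))
        x = self.relu(self.bn3(self.fc3(x)))
        x = self.relu(self.bn4(self.fc4(x)))
        return self.fc5(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = CropClassifier().to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # learning rate is an assumption
criterion = nn.CrossEntropyLoss()

# Training loop sketch over 500 epochs; `loader` would yield (features, labels) batches.
# for epoch in range(500):
#     for features, labels in loader:
#         optimizer.zero_grad()
#         loss = criterion(model(features.to(device)), labels.to(device))
#         loss.backward()
#         optimizer.step()
```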
Overall accuracy is higher for the random forest than for the deep learning model. Detecting crops, particularly rice fields, can be challenging because of their water content: waterlogged rice fields can produce spectral signatures similar to other water bodies, making them difficult to distinguish from vegetation indices alone. Clouds in the satellite imagery also have a significant impact on vegetation index values, introducing noise and reducing classification performance. Cloud cover obstructs the acquisition of clear and consistent data, leading to incomplete or inaccurate inputs for crop classification models. Addressing these challenges requires advanced image processing techniques, data fusion approaches, and the integration of multiple data sources to improve the accuracy and reliability of crop detection and classification in regions with water-intensive agriculture such as rice fields.
Student: Ahmed Manavi Alam, Ph.D. Student, Electrical and Computer Engineering, Mississippi State University. Email: aa2863@msstate.edu
Advisor: Vitor S. Martins, Assistant Professor, Agricultural and Biological Engineering, Mississippi State University. Email: vmartins@abe.msstate.edu