Tuesday, 23rd April 2024
guardian.ng
News  

Solving real-life agricultural problems with satellite remote sensing


EOS Data Analytics popularizes remote sensing technology and satellite imagery analytics among agricultural users

Agriculture is one of the mainstays of Nigeria’s economy. According to 2022 Statista data, the sector accounts for about 24% of the country’s GDP. Nigeria is the world’s fifth-largest producer of palm oil and cocoa beans and ranks second in sorghum production. Oil, nuts, seeds, and fruits are among Nigeria’s ten best-performing export categories.

Given that the sector employed almost 35% of Nigerians (as of 2019, based on World Bank statistics) and 70% of local households practice crop farming, residents’ food security and well-being depend on access to tools and products that allow them to harvest sufficient crop yields. In this regard, revamping agriculture is not an option but a necessity. Modern farming practices include using machinery, variable-rate application of inputs, and decision-support software that relies on weather data and remote observations of fields.

Remote sensing is one of the technologies that can transform the Nigerian agriculture sector, making efficient and sustainable use of land and resources possible.

Remote sensing for modernizing agriculture in Nigeria

Remote sensing entails detecting and monitoring the physical characteristics of an area of interest by measuring its reflected and emitted radiation at a distance, notably from satellites with optical sensors that generate image data.

Remotely sensed images of cultivated lands are further processed and analyzed to extract valuable insights about the state of crops — information growers need to manage their farms and, thus, earn more. Insights can include plant growth stages and health, weed infestation, soil moisture levels, weather forecasts, or extreme conditions that can affect crop growth, such as heat and cold stress.
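One common way such insights are derived is by computing a vegetation index from the image bands. The sketch below shows the widely used NDVI (normalized difference vegetation index); the reflectance values are invented for illustration and do not represent EOSDA's actual processing pipeline.

```python
# Minimal NDVI sketch: NDVI = (NIR - Red) / (NIR + Red).
# Values near 1 indicate dense, healthy vegetation; values near 0
# indicate bare soil or stressed plants.
def ndvi(nir, red):
    """Compute NDVI for a single pixel from NIR and red reflectance."""
    if nir + red == 0:
        return 0.0  # avoid division by zero over water/shadow pixels
    return (nir - red) / (nir + red)

# Toy reflectance values for illustration only (not real field data)
healthy_crop = ndvi(nir=0.50, red=0.08)  # ≈ 0.72, vigorous canopy
bare_soil = ndvi(nir=0.30, red=0.25)     # ≈ 0.09, little vegetation
```

Mapping NDVI over every pixel of a field image yields the kind of plant-health layer that precision agriculture software presents to growers.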

Precision agriculture solutions are the “how” behind sourcing satellite imagery data and making sense of it. One such product is EOS Crop Monitoring by EOS Data Analytics, a global provider of AI-powered satellite imagery analytics. The vendor has been helping customers in more than 20 industries make data-driven decisions since 2015. Today, more than 37,000 Africans use the company’s off-the-shelf solutions.

“EOSDA has brought together researchers, GIS specialists, product managers, software engineers, data scientists, designers, testers, and other professionals who build proprietary solutions that tackle real-life problems of our customers. Working on every stage of the software development life cycle means having full control over processes and zero dependence on contractors. And since the beginning of our work, we have been using machine learning for data analytics,” says Lina Yarysh, Director of Customer Success at EOS Data Analytics.

Machine learning (ML) enables computer systems to learn from data and gradually improve at solving given tasks without being explicitly programmed. ML algorithms excel in analyzing and interpreting large volumes of data.

Deep learning is a subfield of machine learning that focuses on developing algorithms inspired by the structure and function of the brain — artificial neural networks (ANNs). Artificial neural networks are great at recognizing objects based on learned visual characteristics, so they are commonly used for image recognition (classification) tasks, including in agriculture.
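The learning idea behind ANNs can be shown with the simplest possible network, a single perceptron. The toy sketch below separates two invented classes of pixels using two made-up spectral features; the values, labels, and feature names are illustrative assumptions, not EOSDA's model.

```python
# Toy perceptron: the simplest artificial neural network, learning to
# separate two (invented) classes of pixels from two spectral features
# [red_reflectance, nir_reflectance]. Labels: +1 = vegetated crop,
# -1 = bare soil. All values are illustrative only.
samples = [
    ([0.08, 0.50], 1), ([0.10, 0.45], 1),    # vegetated pixels
    ([0.25, 0.30], -1), ([0.22, 0.28], -1),  # bare-soil pixels
]

weights = [0.0, 0.0]
bias = 0.0

def predict(x):
    """Which side of the learned decision boundary the pixel falls on."""
    s = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 if s > 0 else -1

# Perceptron learning rule: nudge the boundary toward each
# misclassified sample until all samples are classified correctly.
for _ in range(500):
    for x, label in samples:
        if predict(x) != label:
            weights[0] += label * x[0]
            weights[1] += label * x[1]
            bias += label

print([predict(x) for x, _ in samples])  # → [1, 1, -1, -1]
```

Real crop-classification networks are deep (many layers) and consume whole image time series, but they improve the same way: by repeatedly adjusting weights against labeled examples.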

One such task is crop classification using satellite images of fields. Customers can classify any cultivated crop, such as maize, millet, yam, cocoa beans, or cassava, as long as growth data is available or can be collected. Find out how it works in the next section.

Classification of crop types: how it works

In 2016, EOS Data Analytics and the World Bank started working on a project focused on agricultural land classification in Ukraine. The company had three major tasks within the project:

  • Classify land cover of a territory exceeding 60 million ha to distinguish cropland from other land cover types like forest or wetland
  • Classify up to 15 crop types growing on more than 41 million ha of agricultural land (cropland)
  • Digitize field borders on the same 41 million ha territory

The crop classification task was carried out in several stages.

Data collection. During this stage, specialists gathered data for future algorithm training and validation: points (fields) with information about the crop growing in a specific area of interest, plus soil samples. This ground-truth data was collected twice a year.

EOSDA agronomists, GIS specialists, and scientists from partner universities visited areas in summer and winter to map crops growing during these seasons. Every year, agronomists collect thousands of data points to keep crop maps relevant.

The team also used time-series data — satellite imagery from Sentinel.
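The ground-truth points gathered in these campaigns can be pictured as simple records. The sketch below is one way to model them; the field names and values are illustrative assumptions, not EOSDA's actual schema.

```python
# One way to model the ground-truth points described above. The field
# names and sample values are illustrative, not EOSDA's schema.
from collections import Counter
from dataclasses import dataclass

@dataclass
class GroundTruthPoint:
    latitude: float
    longitude: float
    crop: str    # crop observed growing at this location
    season: str  # "summer" or "winter" field campaign
    year: int

# A seasonal campaign yields thousands of such points; two shown here
campaign = [
    GroundTruthPoint(49.84, 24.03, "wheat", "summer", 2019),
    GroundTruthPoint(50.45, 30.52, "maize", "summer", 2019),
]

# Count samples per crop to check class balance before training
samples_per_crop = Counter(p.crop for p in campaign)
```

Pairing records like these with the matching satellite image time series is what turns field visits into training material for the classifier.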

Dataset preparation. The goal of this stage is to prepare training data for the algorithm. The project team uploaded data from different sources into one storage system and ensured it didn’t contain incorrect, inaccurate, or irrelevant parts. They also adapted the data to the classification task the future algorithm would perform. After these activities, specialists labeled the data: they annotated target attributes so that the deep learning algorithm could learn which entities it must classify.

For instance, it’s necessary to label wheat and barley fields so the neural network can learn their visual features and then distinguish one crop from another. Data scientists annotated images of the required crops at all growth stages so that the neural network could classify them on images taken throughout the growing season.
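The labeling described above can be sketched as a list of annotated samples, each pairing an image patch with a crop label and growth stage. The tiny patches and values below are invented purely for illustration.

```python
# Sketch of labeled training samples: each pairs an image patch (here a
# made-up 2x2 grid of NDVI-like values) with a crop label and the
# growth stage at acquisition time, so the model sees each crop across
# the whole season. All values are illustrative.
labeled_samples = [
    {"patch": [[0.21, 0.24], [0.22, 0.25]], "crop": "wheat",  "stage": "tillering"},
    {"patch": [[0.68, 0.71], [0.70, 0.69]], "crop": "wheat",  "stage": "heading"},
    {"patch": [[0.15, 0.18], [0.17, 0.16]], "crop": "barley", "stage": "tillering"},
    {"patch": [[0.62, 0.65], [0.64, 0.63]], "crop": "barley", "stage": "heading"},
]

# Verify every crop is annotated at every growth stage, so the network
# can recognize it on images taken at any point in the season
stages_per_crop = {}
for sample in labeled_samples:
    stages_per_crop.setdefault(sample["crop"], set()).add(sample["stage"])
```

Covering every stage per crop matters because a wheat field in spring looks nothing like the same field at harvest.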

Neural network training. During this stage, data scientists let the deep learning algorithm process data so that it outputs a model that can classify crop types on satellite images.

As a result, the developed model can classify more than 15 crops (e.g., wheat, sunflower, maize, alfalfa, sugar beet, buckwheat, poppy, barley). Data scientists have been retraining the neural network with up-to-date images collected yearly to gradually increase crop classification accuracy. Eventually, it became more than 90% accurate at recognizing crops like soya, maize, and sunflower.
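A per-crop accuracy figure like the one above comes from comparing the model's predictions against held-out ground truth. The sketch below shows the arithmetic; the label lists are invented and do not reflect real validation results.

```python
# Sketch of per-crop classification accuracy: compare predicted labels
# against ground-truth labels. The lists below are invented for
# illustration, not real validation data.
def per_class_accuracy(true_labels, predicted_labels):
    """Fraction of correctly classified samples for each true class."""
    correct, total = {}, {}
    for t, p in zip(true_labels, predicted_labels):
        total[t] = total.get(t, 0) + 1
        correct[t] = correct.get(t, 0) + (1 if t == p else 0)
    return {c: correct[c] / total[c] for c in total}

truth = ["maize", "maize", "maize", "maize", "soya", "soya", "sunflower", "sunflower"]
preds = ["maize", "maize", "maize", "soya",  "soya", "soya", "sunflower", "sunflower"]
acc = per_class_accuracy(truth, preds)  # maize: 0.75, soya: 1.0, sunflower: 1.0
```

Tracking accuracy per crop rather than overall prevents a dominant crop such as maize from hiding poor performance on rarer ones.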

The remaining tasks within the project were also solved successfully. The EOSDA team digitized agricultural lands (defined field borders) and classified six land cover classes across Ukraine.

Field-level crop maps for each year, from 2016 to 2021, were also created. As a result, farmers, government entities, traders, and insurance companies received relevant crop classification data for their work.

Crops or land cover can be classified in any area of interest because EOS Data Analytics uses nine data sources to provide global coverage and frequent revisits.

Modernizing Nigeria’s agriculture via the use of technology

Nigerian crop growers who follow traditional farming practices can hardly maintain high crop productivity and optimize farm spending to meet food demand and provide for themselves. Irregular rainfall patterns and rising temperatures driven by climate change only complicate their work.

In search of ways to enhance productivity and reduce resource consumption, scientists introduce satellite remote sensing and neural network approaches to satellite data analysis for efficient field management. Neural networks powering analytics capabilities in precision farming software can solve various classification problems, such as land cover and crop type classification.
