https://doi.org/10.1101/2021.0...
Article . 2021 . Peer-reviewed
License: CC BY
Data sources: Crossref
https://www.biorxiv.org/conten...
Article
License: CC BY
Data sources: UnpayWall
https://dx.doi.org/10.60692/nd...
Other literature type . 2021
Data sources: Datacite
https://dx.doi.org/10.60692/v4...
Other literature type . 2021
Data sources: Datacite

This Research product is the result of merged Research products in OpenAIRE.


Plant detection and counting from high-resolution RGB images acquired from UAVs: comparison between deep-learning and handcrafted methods with application to maize, sugar beet, and sunflower

Authors: Etienne David; Gaëtan Daubige; François Joudelat; Philippe Burger; Alexis Comar; Benoit de Solan; Frédéric Baret


Abstract

Progress in agronomy relies on accurate measurement of the experiments conducted to improve yield components. Measuring plant density is required for a number of applications, since it drives part of the crop's fate. Standard manual measurements in the field can be efficiently replaced by high-throughput techniques based on high-spatial-resolution images taken from UAVs. This study compares several automated methods for detecting individual plants in such images, from which plant density can be estimated. It is based on a large dataset of high-resolution Red/Green/Blue (RGB) images acquired from Unmanned Aerial Vehicles (UAVs) over several years and experiments on maize, sugar beet and sunflower crops at early stages. A total of 16,247 plants were labelled interactively on the images. The performance of a handcrafted method (HC) was compared to that of deep learning (DL). The HC method consists in segmenting the image into green and background pixels, identifying rows, and then identifying the objects corresponding to plants using knowledge of the sowing pattern as prior information. The DL method is based on the Faster Region-based Convolutional Neural Network (Faster R-CNN) model trained on two thirds of the images, selected to represent a good balance between plant development stages and sessions; one model is trained per crop. Results show that simple DL methods generally outperform simple HC methods, particularly for maize and sunflower crops. A significant level of variability in plant detection performance is observed between experiments. This is explained by the variability of image acquisition conditions, including illumination, plant development stage, background complexity and weed infestation. Image quality determines part of the performance of HC methods by making the segmentation step more difficult. The performance of DL methods is limited mainly by the presence of weeds. A hybrid method (HY) was proposed to eliminate weeds between the rows using the rules developed for the HC method. HY slightly improves DL performance in the case of high weed infestation. When a few images corresponding to the conditions of the testing dataset were added to the DL training dataset, a drastic increase in performance is observed for all crops, with relative RMSE below 5% for the estimation of plant density.
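
For illustration only, a minimal sketch of the green/background segmentation step at the core of the HC method, assuming OpenCV and NumPy: it uses an Excess Green index with Otsu thresholding and counts connected components as plant candidates. The row identification and sowing-pattern prior described in the abstract, as well as the authors' exact segmentation rules, are not reproduced here.

```python
import numpy as np
import cv2  # OpenCV


def count_plants_excess_green(rgb, min_area=50):
    """Count candidate plants in an RGB image (H x W x 3, channels in R,G,B order).

    Illustrative sketch: vegetation/background segmentation via the Excess
    Green index plus Otsu thresholding, then connected-component counting.
    The min_area threshold is an assumed value, not taken from the paper.
    """
    img = rgb.astype(np.float32)
    total = img.sum(axis=2) + 1e-6          # avoid division by zero
    r = img[..., 0] / total
    g = img[..., 1] / total
    b = img[..., 2] / total
    exg = 2.0 * g - r - b                   # Excess Green index

    # Stretch to 8 bits and split vegetation from background with Otsu.
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Connected components are plant candidates; drop tiny blobs (noise, small weeds).
    _, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    areas = stats[1:, cv2.CC_STAT_AREA]     # skip label 0 (the background)
    return int((areas >= min_area).sum())
```

Likewise, a hedged sketch of how a single-class ("plant") Faster R-CNN detector could be instantiated with torchvision, with one such model per crop as stated in the abstract; the weights, training split, schedule and hyperparameters shown are assumptions, not the authors' configuration.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Two classes: background + "plant". Pretrained COCO weights are an assumed
# starting point (torchvision >= 0.13 syntax), not the paper's setup.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)
```

For reference, relative RMSE is commonly computed as RMSE divided by the mean observed plant density and expressed as a percentage; the exact normalisation used by the authors is not stated in the abstract.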

Keywords

Artificial intelligence, Environmental Engineering, Tree Height Estimation, RGB color model, Convolutional neural network, Plant Science, Pattern recognition (psychology), Agricultural and Biological Sciences, Image resolution, FOS: Mathematics, Plant Disease Detection, Biology, Vegetation Monitoring, Ecology, Sugar beet, FOS: Environmental engineering, Crop Yield Prediction, Life Sciences, Deep learning, Remote Sensing in Vegetation Monitoring and Phenology, Precision Agriculture Technologies, Computer science, Agronomy, Sunflower, FOS: Biological sciences, Environmental Science, Physical Sciences, Smart Farming, Mapping Forests with Lidar Remote Sensing, Mathematics

  • BIP! impact indicators (Impact by BIP!)
    selected citations: 16. These citations are derived from selected sources. This is an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    popularity: Top 10%. This indicator reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
    influence: Average. This indicator reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    impulse: Top 10%. This indicator reflects the initial momentum of an article directly after its publication, based on the underlying citation network.
  • OpenAIRE UsageCounts (Usage by UsageCounts)
    views: 57
    downloads: 24