
This paper presents a novel approach to fruit detection using deep convolutional neural networks. The aim is to build an accurate, fast, and reliable fruit detection system, a vital component of an autonomous agricultural robotic platform and a key element for fruit yield estimation and automated harvesting. Recent work in deep neural networks has produced a state-of-the-art object detector, the Faster Region-based CNN (Faster R-CNN). We adapt this model, through transfer learning, to the task of fruit detection using imagery obtained from two modalities: colour (RGB) and Near-Infrared (NIR). Early and late fusion methods are explored for combining the multi-modal (RGB and NIR) information. This leads to a novel multi-modal Faster R-CNN model that achieves state-of-the-art results compared to prior work: the F1 score, which accounts for both precision and recall, improves from 0.807 to 0.838 for the detection of sweet pepper. In addition to improved accuracy, this approach is also much quicker to deploy for new fruits, as it requires bounding-box annotation rather than pixel-level annotation (annotating bounding boxes is approximately an order of magnitude quicker to perform). The model is retrained to perform the detection of seven fruits, with the entire process of annotating and training the new model taking four hours per fruit.
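The two fusion strategies mentioned in the abstract can be sketched as follows. This is a hedged illustration, not the authors' implementation: early fusion concatenates the RGB and NIR channels into a single four-channel input before the network, while late fusion merges the detections produced by two separately trained detectors, here with a simple IoU-based greedy non-maximum suppression. All function names and the IoU threshold are illustrative assumptions.

```python
import numpy as np

def early_fusion(rgb, nir):
    """Stack an RGB image (H, W, 3) and an NIR image (H, W, 1)
    into a single 4-channel input for a multi-modal detector."""
    return np.concatenate([rgb, nir], axis=-1)

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def late_fusion(dets_rgb, dets_nir, iou_thr=0.5):
    """Merge (box, score) detections from two modalities: sort all
    detections by score and greedily keep those that do not overlap
    an already-kept detection by more than iou_thr."""
    dets = sorted(dets_rgb + dets_nir, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in dets:
        if all(iou(box, k[0]) < iou_thr for k in kept):
            kept.append((box, score))
    return kept
```

For example, two strongly overlapping detections of the same pepper from the RGB and NIR detectors collapse to the single higher-scoring one after late fusion.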
Keywords: visual fruit detection; deep convolutional neural network; multi-modal; rapid training; real-time performance; harvesting robots; horticulture; agricultural robotics
