ISPRS Journal of Photogrammetry and Remote Sensing
Article . 2019 . Peer-reviewed
License: Elsevier TDM
Data sources: Crossref

A new fully convolutional neural network for semantic segmentation of polarimetric SAR imagery in complex land cover ecosystem

Authors: Mohammadimanesh, Fariba; Salehi, Bahram; Mahdianpari, Masoud; Gill, Eric; Molinier, Matthieu

Abstract

Despite the application of state-of-the-art fully Convolutional Neural Networks (CNNs) for semantic segmentation of very high-resolution optical imagery, their capacity has not yet been thoroughly examined for the classification of Synthetic Aperture Radar (SAR) images. The presence of speckle noise, the absence of efficient feature expression, and the limited availability of labelled SAR samples have hindered the application of the state-of-the-art CNNs for the classification of SAR imagery. This is of great concern for mapping complex land cover ecosystems, such as wetlands, where backscattering/spectrally similar signatures of land cover units further complicate the matter. Accordingly, we propose a new Fully Convolutional Network (FCN) architecture that can be trained in an end-to-end scheme and is specifically designed for the classification of wetland complexes using polarimetric SAR (PolSAR) imagery. The proposed architecture follows an encoder-decoder paradigm, wherein the input data are fed into a stack of convolutional filters (encoder) to extract high-level abstract features and a stack of transposed convolutional filters (decoder) to gradually up-sample the low resolution output to the spatial resolution of the original input image. The proposed network also benefits from recent advances in CNN designs, namely the addition of inception modules and skip connections with residual units. The former component improves multi-scale inference and enriches contextual information, while the latter contributes to the recovery of more detailed information and simplifies optimization. Moreover, an in-depth investigation of the learned features via opening the black box demonstrates that convolutional filters extract discriminative polarimetric features, thus mitigating the limitation of the feature engineering design in PolSAR image processing. Experimental results from full polarimetric RADARSAT-2 imagery illustrate that the proposed network outperforms the conventional random forest classifier and the state-of-the-art FCNs, such as FCN-32s, FCN-16s, FCN-8s, and SegNet, both visually and numerically for wetland mapping.
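
To make the encoder-decoder design described in the abstract more concrete, below is a minimal PyTorch sketch, not the authors' released code, of a fully convolutional network with inception-style multi-scale blocks in the encoder, a transposed-convolution decoder, and residual units combined with skip connections. The input channel count (9 polarimetric features), layer widths, patch size, and number of land-cover classes are illustrative assumptions only.

```python
# Minimal sketch (assumed layout, not the paper's exact architecture) of an
# encoder-decoder FCN with inception-style blocks and residual skip connections.
import torch
import torch.nn as nn


class InceptionBlock(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 convolutions plus pooling, concatenated for multi-scale context."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        branch_ch = out_ch // 4
        self.b1 = nn.Conv2d(in_ch, branch_ch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2)
        self.pool = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1),
                                  nn.Conv2d(in_ch, out_ch - 3 * branch_ch, kernel_size=1))
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        y = torch.cat([self.b1(x), self.b3(x), self.b5(x), self.pool(x)], dim=1)
        return torch.relu(self.bn(y))


class ResidualUnit(nn.Module):
    """Two 3x3 convolutions with an identity shortcut (residual learning)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch))

    def forward(self, x):
        return torch.relu(x + self.body(x))


class EncoderDecoderFCN(nn.Module):
    def __init__(self, in_channels=9, num_classes=8):
        super().__init__()
        # Encoder: inception blocks with downsampling extract high-level features.
        self.enc1 = InceptionBlock(in_channels, 64)
        self.enc2 = InceptionBlock(64, 128)
        self.down = nn.MaxPool2d(2)
        self.bottleneck = ResidualUnit(128)
        # Decoder: transposed convolutions gradually restore the input resolution.
        self.up2 = nn.ConvTranspose2d(128, 128, kernel_size=2, stride=2)
        self.dec2 = ResidualUnit(128)
        self.up1 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec1 = ResidualUnit(64)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                    # full resolution
        e2 = self.enc2(self.down(e1))        # 1/2 resolution
        b = self.bottleneck(self.down(e2))   # 1/4 resolution
        d2 = self.dec2(self.up2(b) + e2)     # skip connection from encoder
        d1 = self.dec1(self.up1(d2) + e1)    # skip connection from encoder
        return self.head(d1)                 # per-pixel class scores


if __name__ == "__main__":
    net = EncoderDecoderFCN(in_channels=9, num_classes=8)
    logits = net(torch.randn(1, 9, 128, 128))  # dummy PolSAR feature patch
    print(logits.shape)                        # torch.Size([1, 8, 128, 128])
```

Training end to end would pair this with a per-pixel cross-entropy loss over labelled wetland patches; the inception branches supply the multi-scale context and the residual skip connections help recover spatial detail during up-sampling, as described in the abstract.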

Keywords

Land cover, ta114, ta213, Deep learning, Encoder-decoder, Convolutional Neural Network (CNN), Polarimetric Synthetic Aperture Radar (PolSAR), Wetland, Fully Convolutional Network (FCN), SDG 15 - Life on Land

  • Impact indicators (provided by BIP!)
    Selected citations: 196
    These citations are derived from selected sources. This is an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    Popularity: Top 1%
    This indicator reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
    Influence: Top 1%
    This indicator reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    Impulse: Top 0.1%
    This indicator reflects the initial momentum of an article directly after its publication, based on the underlying citation network.