Powered by OpenAIRE graph
Neurocomputing · Article · 2020 · Peer-reviewed
License: Elsevier TDM
Data sources: Crossref; DBLP (Article · 2025)
2 versions available

Automatic fetal brain extraction from 2D in utero fetal MRI slices using deep neural network

Authors: Jinpeng Li, Yishan Luo, Lin Shi, Xin Zhang, Ming Li, Bing Zhang, Defeng Wang


Abstract

Background: In utero fetal MRI has been part of common prenatal practice for nearly two decades, yet its applications and the research around it still lag behind owing to the lack of specialized image processing and analysis tools. Brain extraction, an initial preprocessing step for many brain MRI-based processing methods, is an important basis for accurate fetal MRI analysis. Automatically extracting fetal brains from fetal MRI is very challenging, however, because fetal brains vary widely across gestational weeks and are surrounded by complex maternal tissues.

Method: We proposed a novel two-step deep learning framework for automatic fetal brain extraction from 2D in utero fetal MRI slices. The framework consisted of two fully convolutional network (FCN) models: a shallow FCN and an extra-deep multi-scale FCN (M-FCN). The shallow FCN rapidly located the fetal brain and extracted a region of interest (ROI) containing it; within that ROI, the M-FCN refined the segmentation and produced the final brain mask by leveraging multi-scale information and residual learning blocks. Dilated convolutional layers were employed in both FCNs to control the size of the feature maps and enlarge the field of view.

Result: Eighty-eight 2D fetal MRI scans were collected for the experiments. We compared our method with state-of-the-art fetal brain extraction methods, and the proposed framework outperformed them in both fetal brain localization and segmentation. With the proposed method, the fetal brain was located with an accuracy of 100%. Segmentation performance was measured as the overlap between the automatic segmentations and the manual segmentations.
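As a rough illustration of why the dilated convolutions mentioned above enlarge the field of view without extra parameters, the effective receptive field of a stride-1 stack grows with the dilation rates. This is a generic sketch; the kernel sizes and dilation rates below are illustrative assumptions, not the paper's actual network configuration:

```python
def receptive_field(kernel_sizes, dilations):
    """Effective receptive field of a stack of stride-1 dilated convolutions.

    Each layer with kernel size k and dilation d widens the receptive
    field of the stack by (k - 1) * d pixels.
    """
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf

# Three 3x3 layers without dilation cover a 7x7 region ...
print(receptive_field([3, 3, 3], [1, 1, 1]))  # 7
# ... while dilations of 1, 2, 4 cover 15x15 with the same parameter count.
print(receptive_field([3, 3, 3], [1, 2, 4]))  # 15
```

This is the usual motivation for dilated layers in segmentation networks: the field of view grows quickly while the feature-map resolution stays fixed.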
Our proposed method achieved an average Dice score of 0.958, sensitivity of 0.950, and precision of 0.968 on the testing dataset, and it took an average of 6 s to process one fetal MRI stack on a workstation with a TITAN X GPU and an i7-6700 CPU.

Conclusion: In this paper, we proposed an effective and efficient deep learning framework for automatic fetal brain extraction from fetal MRI. Solid experiments validated that the proposed method can serve as a practical and useful tool in clinical practice and neuroscience research.
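The Dice score, sensitivity, and precision reported above are standard overlap metrics for binary segmentation masks. A minimal sketch of how they are computed, representing each mask as a set of foreground pixel coordinates (generic definitions, not the authors' evaluation code):

```python
def dice(pred, truth):
    """Dice score: 2 * |overlap| / (|pred| + |truth|)."""
    return 2 * len(pred & truth) / (len(pred) + len(truth))

def sensitivity(pred, truth):
    """Fraction of true foreground pixels recovered by the prediction."""
    return len(pred & truth) / len(truth)

def precision(pred, truth):
    """Fraction of predicted foreground pixels that are truly foreground."""
    return len(pred & truth) / len(pred)

# Toy masks as sets of (row, col) foreground coordinates.
pred = {(0, 0), (0, 1), (1, 0), (1, 1)}
truth = {(0, 1), (1, 0), (1, 1), (2, 1)}
print(round(dice(pred, truth), 3))  # 0.75
```

A Dice score of 0.958, as reported, means the predicted and manual brain masks overlap almost completely relative to their combined size.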

Impact indicators (BIP!):
  • Selected citations (derived from selected sources; an alternative to the "Influence" indicator): 26
  • Popularity (the "current" attention of the article in the research community, based on the underlying citation network): Top 10%
  • Influence (the overall/total impact of the article, based on the citation network, diachronically): Top 10%
  • Impulse (the initial momentum of the article directly after publication): Top 10%