Powered by OpenAIRE graph
IEEE Access
Article . 2021 . Peer-reviewed
License: CC BY
Data sources: Crossref



A Novel Deep-Learning Model for Automatic Detection and Classification of Breast Cancer Using the Transfer-Learning Technique

Authors: Abeer Saber; Mohamed Sakr; Osama M. Abo-Seida; Arabi Keshk; Huiling Chen


Abstract


Breast cancer (BC) is one of the primary causes of cancer death among women. Early detection of BC allows patients to receive appropriate treatment, thus increasing the chance of survival. In this work, a new deep-learning (DL) model based on the transfer-learning (TL) technique is developed to efficiently assist in the automatic detection and diagnosis of suspected BC areas using two evaluation schemes, namely an 80-20 split and cross-validation. DL architectures are modeled to be problem-specific. TL applies the knowledge gained while solving one problem to another, related problem. In the proposed model, features are extracted from the Mammographic Image Analysis Society (MIAS) dataset using pre-trained convolutional neural network (CNN) architectures such as Inception V3, ResNet50, Visual Geometry Group networks (VGG)-19, VGG-16, and Inception-ResNet-V2. Six evaluation metrics were chosen to assess the performance of the proposed model: accuracy, sensitivity, specificity, precision, F-score, and area under the ROC curve (AUC). Experimental results show that TL with the VGG-16 model is powerful for BC diagnosis, classifying mammogram breast images with overall accuracy, sensitivity, specificity, precision, F-score, and AUC of 98.96%, 97.83%, 99.13%, 97.35%, 97.66%, and 0.995, respectively, for the 80-20 method, and 98.87%, 97.27%, 98.2%, 98.84%, 98.04%, and 0.993 for the 10-fold cross-validation method.
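The first five metrics reported in the abstract follow from standard binary confusion-matrix definitions; AUC is the exception, since it requires per-image predicted scores rather than counts. A minimal sketch in plain Python, with hypothetical counts that are not taken from the paper:

```python
def evaluate(tp, tn, fp, fn):
    """Compute five of the paper's metrics from binary confusion-matrix counts.

    tp/tn/fp/fn: true/false positive/negative counts for the malignant class.
    """
    accuracy    = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)          # recall / true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    precision   = tp / (tp + fp)
    # F-score is the harmonic mean of precision and sensitivity (recall)
    f_score     = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, precision, f_score

# Hypothetical counts for illustration only (not the paper's data)
print(evaluate(tp=45, tn=114, fp=1, fn=1))
```

AUC would additionally need the classifier's continuous output per image (e.g. via a ranking over predicted malignancy scores) rather than hard labels.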


Related Organizations
Keywords

Artificial neural network, Artificial intelligence, Deep Learning in Medical Image Analysis, Classification of Brain Tumor Type and Grade, Convolutional neural network, Receiver operating characteristic, Transfer learning, Pattern recognition (psychology), Deep learning, Breast Cancer Diagnosis, Residual neural network, Transfer of learning, Breast cancer, Engineering, Automated Analysis of Blood Cell Images, Convolutional neural networks, Machine learning, Pharmacokinetics, Area under curve, Internal medicine, Cancer, Electronic engineering, Life Sciences, Cross-validation, Computer-Aided Detection, Computer science, Sensitivity (control systems), TK1-9971, Neurology, Physical Sciences, Medical Image Analysis, Medicine, Electrical engineering. Electronics. Nuclear engineering, Computer Vision and Pattern Recognition, Image classification, Neuroscience

  • Impact indicators by BIP!
    selected citations: 221
    These citations are derived from selected sources. This is an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    popularity: Top 0.1%
    This indicator reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
    influence: Top 1%
    This indicator reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    impulse: Top 0.1%
    This indicator reflects the initial momentum of an article directly after its publication, based on the underlying citation network.