
Journal article

Explainable breast cancer prediction from 3-dimensional dynamic contrast-enhanced magnetic resonance imaging

Arslan Akbar, Suya Han, Naveed Urr Rehman, Kanwal Ahmed, Hassan Eshkiki, Fabio Caraffini

Applied Intelligence, Volume: 55, Issue: 13

Swansea University Authors: Hassan Eshkiki, Fabio Caraffini

  • 70043.VoR.pdf — PDF, Version of Record (4.99 MB)

    © The Author(s) 2025. This article is licensed under a Creative Commons Attribution 4.0 International License.


Published in: Applied Intelligence
ISSN: 0924-669X (print); 1573-7497 (electronic)
Published: Springer Science and Business Media LLC, 2025

URI: https://cronfa.swan.ac.uk/Record/cronfa70043
Abstract: Deep learning models have been instrumental in extracting critical indicators for breast cancer diagnosis, the most prevalent malignancy among women worldwide, from baseline magnetic resonance imaging. However, many existing models do not fully leverage the rich spatial information available in the 3D structure of medical imaging data, potentially overlooking important contextual details. This work develops an explainable deep learning framework for breast cancer classification that leverages the complete 3D structure of the imaging data and provides classification results alongside visual explanations of the decision-making process. The preprocessing pipeline is fed with 3D sequences containing ‘tumour’ and ‘non-tumour’ regions. It includes a 3D Adaptive Unsharp Mask (AUM) filter to reduce noise and enhance image clarity, followed by normalisation and data augmentation. Classification is then achieved by training an augmented ResNet150 model. Three explainable artificial intelligence (XAI) techniques, namely Shapley Additive Explanations, 3D Gradient-Weighted Class Activation Mapping, and Contextual Importance and Utility, are employed to improve interpretability. The model demonstrates state-of-the-art performance on the QIN-BREAST dataset, achieving testing accuracies of 98.861% for ‘tumours’ and 99.447% for ‘non-tumours’, as well as on the Duke Breast Cancer Dataset, where it achieves 99.104% for ‘tumours’ and 99.753% for ‘non-tumours’, while offering enhanced interpretability through XAI methods.
Keywords: Breast cancer; Deep learning; DCE-MRI; Explainable AI; RESNET150
College: Faculty of Science and Engineering
Funders: Swansea University
Issue: 13
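The abstract describes a preprocessing pipeline built around 3D unsharp masking followed by normalisation. The paper's Adaptive Unsharp Mask (AUM) filter is not specified on this record page, so the sketch below shows only the classic, non-adaptive form of 3D unsharp masking plus min-max normalisation; the function names, parameter values, and the toy volume are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask_3d(volume, sigma=1.0, amount=1.5):
    """Classic (non-adaptive) 3D unsharp masking: add a scaled
    high-frequency residual (original minus Gaussian-blurred copy)
    back to the volume to sharpen edges."""
    blurred = gaussian_filter(volume.astype(np.float32), sigma=sigma)
    return volume + amount * (volume - blurred)

def min_max_normalise(volume):
    """Rescale voxel intensities to the [0, 1] range."""
    v = volume.astype(np.float32)
    return (v - v.min()) / (v.max() - v.min() + 1e-8)

# Toy 3D volume: noisy background with a brighter cubic "lesion"
rng = np.random.default_rng(0)
vol = rng.normal(100.0, 5.0, size=(32, 64, 64)).astype(np.float32)
vol[12:20, 24:40, 24:40] += 80.0

sharpened = min_max_normalise(unsharp_mask_3d(vol))
```

In a real DCE-MRI pipeline such as the one described, this step would be followed by data augmentation and then classification with the trained network; the adaptive variant would additionally vary `sigma` or `amount` with local image statistics.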