
Cross-Modality Calibration in Multi-Input Network for Axillary Lymph Node Metastasis Evaluation

Santucci D.;Cordelli E.;Soda P.;
2024-01-01

Abstract

The use of deep neural networks (DNNs) in medical imaging has enabled the development of solutions that need to leverage information coming from multiple sources, giving rise to multimodal deep learning. DNNs are known for their ability to provide hierarchical, high-level representations of input data. This capability has led to methods that perform data fusion at an intermediate level, preserving the distinctiveness of the heterogeneous sources in modality-specific paths while learning how to combine them effectively in a shared representation. However, modeling the intricate relationships between different data remains an open issue. In this article, we aim to improve the integration of data coming from multiple sources. We introduce, between layers belonging to different modality-specific paths, a transfer module (TM) able to perform cross-modality calibration of the extracted features, reducing the effect of the less discriminative ones. As a case study, we focus on axillary lymph node (ALN) metastasis evaluation in malignant breast cancer (BC), a crucial prognostic factor affecting patients' survival. We propose a multi-input single-output 3-D convolutional neural network (CNN) that considers both images acquired with multiparametric magnetic resonance imaging and clinical information. In particular, we assess the proposed methodology using four architectures, namely BasicNet and three ResNet variants, showing the performance improvement obtained by including the TM in the network configuration. Our results reach up to 90% accuracy and 87% area under the ROC curve when ResNet10 is considered, surpassing various fusion strategies proposed in the literature.
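The cross-modality calibration idea described in the abstract can be sketched as a gating step between two modality-specific paths. The following is a minimal NumPy illustration assuming a squeeze-and-excitation-style mechanism; the names `transfer_module`, `w_a`, and `w_b` are hypothetical and this is not the paper's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def transfer_module(feat_a, feat_b, w_a, w_b):
    """Hypothetical cross-modality calibration between two paths.

    feat_a, feat_b: (C, D, H, W) 3-D feature volumes from two
    modality-specific paths. w_a, w_b: (C, 2C) gating weights mapping
    the joint channel descriptor to per-channel scales for each path.
    """
    # Squeeze: global average pooling gives one descriptor per channel.
    z_a = feat_a.mean(axis=(1, 2, 3))          # (C,)
    z_b = feat_b.mean(axis=(1, 2, 3))          # (C,)
    z = np.concatenate([z_a, z_b])             # joint descriptor (2C,)
    # Excite: each path is rescaled using information from BOTH paths,
    # damping channels that are less discriminative across modalities.
    s_a = sigmoid(w_a @ z)                     # per-channel scales in (0, 1)
    s_b = sigmoid(w_b @ z)
    out_a = feat_a * s_a[:, None, None, None]
    out_b = feat_b * s_b[:, None, None, None]
    return out_a, out_b
```

Because the scales are computed from the concatenated descriptors of both paths, each modality's features are recalibrated with knowledge of the other modality, which is the intuition behind the TM's cross-modality calibration.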
2024
Axillary lymph node (ALN); breast cancer (BC); convolutional neural networks (CNNs); cross-modality calibration; medical imaging analysis; multimodal deep learning (MDL)
Files for this record:
Cross-Modality Calibration in Multi-Input Network for Axillary Lymph Node Metastasis Evaluation.pdf
Access: open access
Type: Published version (PDF)
License: Creative Commons
Size: 1.04 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12610/86303
Citations
  • Scopus: 1