Chinese Journal of Magnetic Resonance ›› 2022, Vol. 39 ›› Issue (4): 401-412. doi: 10.11938/cjmr20222969

• Research Articles •

Differentiation of Benign and Malignant Breast Lesions Based on Multimodal MRI and Deep Learning

Yi-feng YANG, Zhang-xuan QI, Sheng-dong NIE*

  1. Institute of Medical Image Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
  • Received: 2022-01-04  Online: 2022-12-05  Published: 2022-03-15
  • Contact: Sheng-dong NIE  E-mail: nsd4647@163.com
  • Supported by:
    National Natural Science Foundation of China (81830052); Science and Technology Innovation Action Plan of Shanghai (18441900500)


Abstract:

To improve the accuracy of computer-aided diagnosis (CAD) of benign and malignant breast lesions on dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI), this study proposes AC_Ulsam_CNN, a convolutional neural network that combines asymmetric convolution (AC) with an ultra-lightweight subspace attention module (ULSAM) on top of multimodal feature fusion. First, transfer learning was used to pre-train the model and screen out the DCE-MRI scan phase most effective for differentiating benign from malignant breast lesions. Next, an AC_Ulsam_CNN model was built on the images of this optimal phase to strengthen the feature representation ability and robustness of the classifier. Finally, the image features were fused with multimodal information such as Breast Imaging Reporting and Data System (BI-RADS) category, apparent diffusion coefficient (ADC), and time-signal intensity curve (TIC) type to further improve the model's predictive performance. Under five-fold cross-validation, the proposed method achieved an accuracy (ACC) of 0.826 and an area under the receiver operating characteristic curve (AUC) of 0.877. These results indicate that the algorithm distinguishes benign from malignant breast lesions well on a small dataset, and that the multimodal fusion model further enriches the feature information and thereby improves lesion detection accuracy, providing a new method for the automatic differential diagnosis of benign and malignant breast lesions.

Key words: dynamic contrast-enhanced magnetic resonance imaging, convolutional neural network, multimodal feature fusion, breast lesions, differentiation of benign and malignant lesions
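The asymmetric-convolution component named in the abstract is commonly formulated (as in ACNet) as three parallel branches whose kernels fuse into a single square kernel at inference. The paper's implementation is not reproduced here; the following is a minimal single-channel NumPy sketch of that standard formulation, with random kernel values as placeholders:

```python
import numpy as np

def conv2d_valid(x, k):
    """Plain valid-mode 2-D cross-correlation (single channel, stride 1)."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)

# Three parallel branches: a square 3x3 kernel plus a horizontal (1x3)
# and a vertical (3x1) kernel, each embedded in a 3x3 grid so that the
# branch outputs align spatially.
k_square = rng.standard_normal((3, 3))
k_horiz = np.zeros((3, 3)); k_horiz[1, :] = rng.standard_normal(3)
k_vert = np.zeros((3, 3)); k_vert[:, 1] = rng.standard_normal(3)

x = rng.standard_normal((8, 8))
branch_sum = (conv2d_valid(x, k_square)
              + conv2d_valid(x, k_horiz)
              + conv2d_valid(x, k_vert))

# By linearity of convolution, the three kernels collapse into one
# 3x3 kernel, so the extra branches add no cost at inference time.
k_fused = k_square + k_horiz + k_vert
fused_out = conv2d_valid(x, k_fused)
```

The fused output equals the sum of the branch outputs exactly, which is why asymmetric convolution can enrich training-time features without changing the deployed network's structure.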

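The final fusion step described in the abstract concatenates deep image features with clinical descriptors (BI-RADS category, ADC, TIC type) before classification, and the result is evaluated by five-fold cross-validation. A minimal sketch of such a late-fusion vector and a five-fold split, with placeholder values and illustrative one-hot encodings rather than the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: a deep-feature vector from the CNN plus the
# three multimodal descriptors named in the abstract.
deep_features = rng.standard_normal(64)         # image features (placeholder)
birads_onehot = np.array([0., 0., 1., 0., 0.])  # BI-RADS category (illustrative one-hot)
adc_value = np.array([1.12e-3])                 # ADC in mm^2/s (placeholder)
tic_onehot = np.array([0., 1., 0.])             # TIC type II (illustrative one-hot)

# Simple late fusion: concatenate all modalities into one vector that
# feeds the final classifier.
fused = np.concatenate([deep_features, birads_onehot, adc_value, tic_onehot])

def five_fold_indices(n, seed=0):
    """Split n sample indices into 5 disjoint folds for cross-validation."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, 5)

# Each fold serves once as the validation set; metrics such as ACC and
# AUC are then averaged over the five folds.
folds = five_fold_indices(100)
```

In practice a stratified split (preserving the benign/malignant ratio per fold) is preferable for small datasets such as the one described here.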

CLC number: