Chinese Journal of Magnetic Resonance ›› 2022, Vol. 39 ›› Issue (3): 303-315. doi: 10.11938/cjmr20222988


Automatic Segmentation of Knee Joint Synovial Magnetic Resonance Images Based on 3D VNetTrans

Ying-shan WANG 1, Ao-qi DENG 3, Jin-ling MAO 1, Zhong-qi ZHU 1, Jie SHI 2,*, Guang YANG 1, Wei-wei MA 4, Qing LU 4,*, Hong-zhi WANG 1,*

  1. Shanghai Key Laboratory of Magnetic Resonance, School of Physics and Electronic Science, East China Normal University, Shanghai 200062, China
    2. Shanghai Guanghua Hospital of Integrated Traditional Chinese and Western Medicine, Shanghai 200052, China
    3. College of Acupuncture and Massage, Shanghai University of Chinese Medicine, Shanghai 200032, China
    4. Renji Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai 200127, China
  • Received: 2022-03-23  Online: 2022-09-05  Published: 2022-05-11
  • Contact: Jie SHI, Qing LU, Hong-zhi WANG  E-mail: ghyyfsk@163.com; drluqingsjtu@163.com; hzwang@phy.ecnu.edu.cn

Abstract:

The knee joint is commonly affected by rheumatoid arthritis (RA), and accurate segmentation of the synovium is essential for the diagnosis and treatment of RA. This paper proposes an algorithm based on an improved VNet for automatically segmenting knee joint synovial magnetic resonance images. First, the knee joint magnetic resonance images of 39 patients with synovitis were preprocessed. Then, VNetTrans was constructed by embedding a Transformer at the bottom of VNet, and the MemSwish activation function was used during training. The final model achieved an average Dice score of 0.7585 and a Hausdorff distance (HD) of 24.6 mm; compared with VNet, the Dice score increased by 0.0836 and the HD decreased by 10 mm. Experimental results demonstrate that the proposed algorithm achieves satisfactory 3D segmentation of the synovial hyperplasia region in knee magnetic resonance images, and it can be used to facilitate the diagnosis and monitoring of RA.
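
To make the two architectural ingredients mentioned in the abstract more concrete, the PyTorch sketch below is offered as illustration only (it is not code from the paper; class names and hyperparameters are assumptions). It shows (i) a memory-efficient Swish activation of the kind "MemSwish" presumably denotes, f(x) = x·sigmoid(x) with a custom backward pass that stores only the input, and (ii) a Transformer encoder inserted at the lowest-resolution (bottleneck) feature map of a 3D encoder-decoder such as VNet.

import torch


class MemSwishFunction(torch.autograd.Function):
    """Memory-efficient Swish: save only the input, recompute sigmoid in backward."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * torch.sigmoid(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        s = torch.sigmoid(x)
        # d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
        return grad_output * s * (1 + x * (1 - s))


class MemSwish(torch.nn.Module):
    def forward(self, x):
        return MemSwishFunction.apply(x)


class TransformerBottleneck(torch.nn.Module):
    """Flatten the lowest-resolution 3D feature map into a token sequence,
    run it through a standard Transformer encoder, and reshape it back."""
    def __init__(self, channels, num_layers=2, num_heads=4):
        super().__init__()
        layer = torch.nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True)
        self.encoder = torch.nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x):                        # x: (B, C, D, H, W)
        b, c, d, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)    # (B, D*H*W, C) token sequence
        tokens = self.encoder(tokens)
        return tokens.transpose(1, 2).reshape(b, c, d, h, w)


if __name__ == "__main__":
    feat = torch.randn(1, 32, 4, 8, 8, requires_grad=True)   # hypothetical bottleneck feature map
    out = TransformerBottleneck(channels=32)(MemSwish()(feat))
    out.sum().backward()
    print(out.shape, feat.grad.shape)            # both (1, 32, 4, 8, 8)

Placing the Transformer only at the bottleneck keeps the token sequence (D×H×W at the coarsest scale) short enough for self-attention to remain affordable on 3D volumes, while the convolutional encoder and decoder of VNet continue to capture local detail.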

Key words: magnetic resonance image, medical image segmentation, deep learning, synovitis
