Quantification of tissue stiffness with magnetic resonance elastography and finite difference time domain (FDTD) simulation-based spatiotemporal neural network
2025-05-01
Journal: MAGNETIC RESONANCE IMAGING (IF: 2.1 [JCR 2023], 2.3 [5-year])
ISSN: 0730-725X
EISSN: 1873-5894
Volume: 118
Publication status: Published
DOI: 10.1016/j.mri.2025.110353
Abstract

Quantification of tissue stiffness with magnetic resonance elastography (MRE) is an inverse problem that is sensitive to noise. Conventional methods for this purpose include direct inversion (DI) and local frequency estimation (LFE). In this study, we propose to train a spatiotemporal neural network on MRE data simulated by the Finite Difference Time Domain method (FDTDNet), and to use the trained network to estimate tissue stiffness from MRE data. The proposed method showed significantly better robustness to noise than DI or LFE. For simulated data with a signal-to-noise ratio (SNR) of 15 dB, tissue stiffness estimated by FDTDNet had a mean absolute error of 0.41 kPa (7 %), which was 77.8 % and 84.4 % lower than that of DI and LFE, respectively (P < 0.0001). For a homogeneous phantom with driver power decreasing from 30 % to 5 %, FDTDNet, DI and LFE provided stiffness estimates deviating by 6.9 % (0.21 kPa), 9.2 % (0.28 kPa) and 45.8 % (1.20 kPa), respectively, from the corresponding stiffness level at 30 % driver power. Detectability of small inclusions in the estimated stiffness maps is also critical. For simulated data with inclusions of radius 0.31 cm, FDTDNet achieved a contrast-to-noise ratio (CNR) of 4.20, 6900 % and 347 % higher than DI and LFE, respectively (P < 0.0001), and a structural similarity index (SSIM) of 0.61, 27 % and 177 % higher than DI and LFE, respectively (P < 0.0001). For a phantom with an inclusion of radius 0.39 cm, the CNR of FDTDNet was 2.98, 90 % and 80 % higher than DI and LFE, respectively (P < 0.0001), and the SSIM was 0.80, 89 % and 28 % higher than DI and LFE, respectively (P < 0.0001). We also demonstrated the feasibility of FDTDNet on MRE data acquired from the calf muscles of human subjects. In conclusion, by using a spatiotemporal neural network trained with simulated data, FDTDNet estimated tissue stiffness from MRE with superior noise robustness and detectability of focal inclusions, and therefore shows potential for precise stiffness quantification in human subjects.
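For context on the reported metrics, below is a minimal Python sketch (not the authors' code) of how an estimated stiffness map could be scored against a reference map with MAE, CNR and SSIM. The CNR definition used here (inclusion-to-background mean difference over background standard deviation), the synthetic maps and all names are assumptions for illustration; the paper may define or compute these quantities differently. SSIM is computed with scikit-image.

import numpy as np
from skimage.metrics import structural_similarity

def evaluate_stiffness_map(estimate, reference, inclusion_mask, background_mask):
    """Score an estimated stiffness map (kPa) against a reference map (kPa)."""
    # Mean absolute error over the whole field of view
    mae = np.mean(np.abs(estimate - reference))
    # Contrast-to-noise ratio of the inclusion against the background
    # (assumed definition: mean difference over background standard deviation)
    inclusion = estimate[inclusion_mask]
    background = estimate[background_mask]
    cnr = np.abs(inclusion.mean() - background.mean()) / background.std()
    # Structural similarity between the estimated and reference maps
    ssim = structural_similarity(
        estimate, reference, data_range=reference.max() - reference.min()
    )
    return mae, cnr, ssim

# Synthetic example: 2 kPa background with a 6 kPa circular inclusion plus noise
yy, xx = np.mgrid[:128, :128]
inclusion_mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 10 ** 2
reference = np.where(inclusion_mask, 6.0, 2.0)
rng = np.random.default_rng(0)
estimate = reference + 0.3 * rng.standard_normal(reference.shape)
print(evaluate_stiffness_map(estimate, reference, inclusion_mask, ~inclusion_mask))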

Keywords: Magnetic resonance elastography; Tissue stiffness; Deep learning; Finite difference time domain method
URL: View full text
Indexed by: SCI
Language: English
Funding project: National Natural Science Foundation of China [82171924]
WOS research area: Radiology, Nuclear Medicine & Medical Imaging
WOS category: Radiology, Nuclear Medicine & Medical Imaging
WOS accession number: WOS:001425937400001
Publisher: ELSEVIER SCIENCE INC
Document type: Journal article
Item identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/493506
Collections: School of Biomedical Engineering; School of Information Science and Technology_Master's students
Corresponding author: Zhang, Jeff L.
Author affiliations:
1.ShanghaiTech Univ, Sch Biomed Engn, Shanghai, Peoples R China
2.Shanghai United Imaging Healthcare Co Ltd, Cent Res Inst, Shanghai, Peoples R China
3.Shanghai Jiao Tong Univ, Ruijin Hosp, Dept Radiol, Sch Med, Shanghai, Peoples R China
First author affiliation: School of Biomedical Engineering
Corresponding author affiliation: School of Biomedical Engineering
First affiliation of first author: School of Biomedical Engineering
Recommended citation:
GB/T 7714: Zhang, Jiaying, Mu, Xin, Lin, Xi, et al. Quantification of tissue stiffness with magnetic resonance elastography and finite difference time domain (FDTD) simulation-based spatiotemporal neural network[J]. MAGNETIC RESONANCE IMAGING, 2025, 118.
APA: Zhang, Jiaying, Mu, Xin, Lin, Xi, Kong, Xiangwei, Li, Yanbin, ... & Zhang, Jeff L. (2025). Quantification of tissue stiffness with magnetic resonance elastography and finite difference time domain (FDTD) simulation-based spatiotemporal neural network. MAGNETIC RESONANCE IMAGING, 118.
MLA: Zhang, Jiaying, et al. "Quantification of tissue stiffness with magnetic resonance elastography and finite difference time domain (FDTD) simulation-based spatiotemporal neural network". MAGNETIC RESONANCE IMAGING 118 (2025).