Unsupervised learning for magnetization transfer contrast MR fingerprinting: Application to CEST and nuclear Overhauser enhancement imaging

Beomgu Kang, Byungjai Kim, Michael Schär, HyunWook Park, Hye‐Young Heo

Abstract

Purpose

To develop a fast, quantitative 3D magnetization transfer contrast (MTC) framework based on an unsupervised learning scheme, which will provide baseline reference signals for CEST and nuclear Overhauser enhancement imaging.

Methods

Pseudo‐randomized RF saturation parameters and relaxation delay times were applied in an MR fingerprinting framework to generate transient‐state signal evolutions for different MTC parameters. Prospectively compressed sensing–accelerated (four‐fold) MR fingerprinting images were acquired from 6 healthy volunteers at 3 T. A convolutional neural network (CNN) framework was designed to solve, in an unsupervised fashion, the inverse problem of the two‐pool MTC Bloch equations, and was compared with a conventional Bloch equation–based fitting approach. The MTC images synthesized by the CNN architecture were used as baseline reference images for amide proton transfer and nuclear Overhauser enhancement imaging.
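For concreteness, a minimal sketch of such a physics‐informed, unsupervised training loop is given below, assuming PyTorch. The network architecture, the parameter ranges, the pseudo‐randomized schedule values, and the simplified mono‐exponential two‐pool signal model (standing in for the full transient‐state Bloch–McConnell simulation used in the paper) are all illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: unsupervised MTC-MRF parameter estimation. The CNN maps
# signal evolutions to tissue parameters; a crude differentiable two-pool
# signal model re-simulates the fingerprint, and the loss enforces
# self-consistency with the measured signal (no parameter labels needed).
import math
import torch
import torch.nn as nn

N_DYN = 40  # number of dynamic scans in the MRF schedule (illustrative)

class MTCNet(nn.Module):
    """Maps one MTC-MRF signal evolution to normalized tissue parameters."""
    def __init__(self, n_dyn=N_DYN, n_params=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_dyn, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, n_params), nn.Sigmoid(),  # outputs scaled to (0, 1)
        )

    def forward(self, x):
        return self.net(x)

def two_pool_signal(params, b1, ts, td):
    """Toy differentiable stand-in for the two-pool MTC Bloch equations:
    mono-exponential approach to a saturation-dependent steady state,
    chained across dynamics with T1 recovery during each relaxation delay."""
    kmw = 5.0 + 95.0 * params[:, 0:1]            # exchange rate [Hz] (assumed range)
    m0m = 0.02 + 0.15 * params[:, 1:2]           # semisolid pool fraction
    r1w = 0.2 + 1.3 * params[:, 2:3]             # water R1 [1/s]
    t2m = (1.0 + 99.0 * params[:, 3:4]) * 1e-6   # semisolid T2 [s]
    z_prev = torch.ones(params.shape[0], 1)
    sig = []
    for i in range(len(b1)):
        w1 = 2.0 * math.pi * 42.58 * b1[i]       # rad/s, b1 in microtesla
        r_rfm = w1 ** 2 * t2m                    # crude semisolid saturation rate
        sat = m0m * kmw * r_rfm / (kmw + r_rfm)  # exchange-limited transfer to water
        z_ss = r1w / (r1w + sat)                 # steady-state Z under saturation
        z0 = 1.0 + (z_prev - 1.0) * torch.exp(-r1w * td[i])  # recovery in delay
        z = z_ss + (z0 - z_ss) * torch.exp(-(r1w + sat) * ts[i])
        sig.append(z)
        z_prev = z
    return torch.cat(sig, dim=1)                 # (n_voxels, N_DYN)

# Pseudo-randomized saturation/delay schedule (illustrative values):
b1 = 0.5 + 1.5 * torch.rand(N_DYN)   # RF saturation power [uT]
ts = 0.4 + 1.2 * torch.rand(N_DYN)   # saturation duration [s]
td = 2.0 + 2.0 * torch.rand(N_DYN)   # relaxation delay [s]

# Stand-in for measured, CS-reconstructed voxel signal evolutions:
with torch.no_grad():
    signals = two_pool_signal(torch.rand(1024, 4), b1, ts, td)
    signals = signals + 0.01 * torch.randn_like(signals)

model = MTCNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    pred = model(signals)                        # estimated parameters
    recon = two_pool_signal(pred, b1, ts, td)    # re-simulated fingerprints
    loss = ((recon - signals) ** 2).mean()       # unsupervised: signal consistency
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the loss compares the re‐simulated fingerprint with the measured one, no ground‐truth parameter labels are required, which is the sense in which the training is unsupervised.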

Results

The fully unsupervised learning scheme, incorporating the two‐pool exchange model, learned a set of unique features that describe the MTC–MR fingerprinting input, and required only a small amount of unlabeled data for training. The MTC parameter values estimated by the unsupervised learning method were in excellent agreement with those estimated by the conventional Bloch fitting approach, while reducing computation time by approximately 1000‐fold.

Conclusion

Given its considerable time efficiency compared with conventional Bloch fitting, unsupervised learning‐based MTC–MR fingerprinting could be a powerful tool for quantitative MTC and CEST/nuclear Overhauser enhancement imaging.
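As a closing illustration, the baseline‐subtraction step that the synthesized MTC reference enables is sketched below, assuming the common reference‐minus‐measured definition of the APT and NOE contrasts at ±3.5 ppm; the array names, shapes, and placeholder data are hypothetical.

```python
# Hedged sketch: computing APT and NOE contrast maps from the CNN-synthesized
# MTC baseline, assuming reference-minus-measured subtraction at +/-3.5 ppm.
# All arrays are hypothetical placeholders, not the study's data.
import numpy as np

shape = (128, 128, 32)               # hypothetical 3D matrix size
z_meas_p35 = np.random.rand(*shape)  # measured Z at +3.5 ppm (amide)
z_meas_m35 = np.random.rand(*shape)  # measured Z at -3.5 ppm (NOE)
z_ref_p35 = np.random.rand(*shape)   # synthesized MTC baseline at +3.5 ppm
z_ref_m35 = np.random.rand(*shape)   # synthesized MTC baseline at -3.5 ppm

apt_map = z_ref_p35 - z_meas_p35     # amide proton transfer contrast map
noe_map = z_ref_m35 - z_meas_m35     # nuclear Overhauser enhancement map
```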