Generative adversarial networks for noise reduction in low-dose CT

Jelmer M. Wolterink, Tim Leiner, Max A. Viergever, Ivana Išgum

Research output: Contribution to journal › Article › Academic › peer-review



Noise is inherent to low-dose CT acquisition. We propose to train a convolutional neural network (CNN) jointly with an adversarial CNN to estimate routine-dose CT images from low-dose CT images and hence reduce noise. A generator CNN was trained to transform low-dose CT images into routine-dose CT images using voxelwise loss minimization. An adversarial discriminator CNN was simultaneously trained to distinguish the output of the generator from routine-dose CT images. The performance of this discriminator was used as an adversarial loss for the generator. Experiments were performed using CT images of an anthropomorphic phantom containing calcium inserts, as well as patient non-contrast-enhanced cardiac CT images. The phantom and patients were scanned at 20% and 100% of routine clinical dose. Three training strategies were compared: the first used only voxelwise loss, the second combined voxelwise loss and adversarial loss, and the third used only adversarial loss. The results showed that training with only voxelwise loss resulted in the highest peak signal-to-noise ratio with respect to reference routine-dose images. However, CNNs trained with adversarial loss captured image statistics of routine-dose images better. Noise reduction improved quantification of low-density calcified inserts in phantom CT images and allowed coronary calcium scoring in low-dose patient CT images with high noise levels. Testing took less than 10 s per CT volume. CNN-based low-dose CT noise reduction in the image domain is feasible. Training with an adversarial network improves the CNN's ability to generate images with an appearance similar to that of reference routine-dose CT images.
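The three training strategies differ only in how the generator's objective is composed. A minimal NumPy sketch of that composition is shown below; it assumes mean squared error as the voxelwise loss and a non-saturating log term as the adversarial loss, with an illustrative weight `lam` that is not specified in the abstract.

```python
import numpy as np

def voxelwise_loss(pred, target):
    # Voxelwise loss between generated and routine-dose volumes,
    # assumed here to be mean squared error.
    return float(np.mean((pred - target) ** 2))

def adversarial_loss(disc_scores):
    # Non-saturating adversarial term: the generator is rewarded when
    # the discriminator scores its output close to 1 ("routine-dose").
    eps = 1e-12  # numerical safeguard for log(0)
    return float(-np.mean(np.log(disc_scores + eps)))

def generator_loss(pred, target, disc_scores, lam=0.5):
    # Strategy 1: voxelwise only (lam = 0).
    # Strategy 2: combined voxelwise + adversarial (lam > 0).
    # Strategy 3: adversarial only (drop the voxelwise term).
    # lam = 0.5 is a hypothetical weight for illustration.
    return voxelwise_loss(pred, target) + lam * adversarial_loss(disc_scores)
```

In the combined strategy, the voxelwise term anchors the output to the reference volume while the adversarial term pushes its noise texture toward routine-dose statistics, which matches the reported trade-off between peak signal-to-noise ratio and realistic image appearance.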
Original language: English
Pages (from-to): 2536-2545
Journal: IEEE Transactions on Medical Imaging
Issue number: 12
Publication status: Published - 2017
Externally published: Yes
