Deep Learning for Real-time, Automatic, and Scanner-adapted Prostate (Zone) Segmentation of Transrectal Ultrasound, for Example, Magnetic Resonance Imaging–transrectal Ultrasound Fusion Prostate Biopsy

Ruud J. G. van Sloun, Rogier R. Wildeboer, Christophe K. Mannaerts, Arnoud W. Postema, Maudy Gayet, Harrie P. Beerlage, Georg Salomon, Hessel Wijkstra, Massimo Mischi

Research output: Contribution to journal › Article › Academic › Peer-reviewed

31 Citations (Scopus)

Abstract

Background: Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI–transrectal ultrasound (TRUS) fusion prostate biopsies, these procedures are time consuming, laborious, and costly. Introduction of a deep-learning approach could improve prostate segmentation.

Objective: To exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners.

Design, setting, and participants: Three datasets of TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men.

Outcome measurements and statistical analysis: Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of pixel-wise accuracy, Jaccard index, and Hausdorff distance.

Results and limitations: The developed deep-learning approach significantly improved prostate segmentation compared with a conventional automated technique, reaching a median accuracy of 98% (95% confidence interval 95–99%), a Jaccard index of 0.93 (0.80–0.96), and a Hausdorff distance of 3.0 (1.3–8.7) mm. Zonal segmentation yielded pixel-wise accuracies of 97% (95–99%) and 98% (96–99%) for the peripheral and transition zones, respectively. Supervised domain adaptation resulted in retention of high performance when the method was applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p < 0.001), indicating that possible incorrect segmentations can be identified swiftly.

Conclusions: Fusion-guided prostate biopsies, targeting suspicious lesions on MRI using TRUS, are increasingly performed. The requirement for (semi)manual prostate delineation places a substantial burden on clinicians. Deep learning provides a means for fast and accurate (zonal) prostate segmentation of TRUS images that translates to different scanners.

Patient summary: Artificial intelligence for automatic delineation of the prostate on ultrasound was shown to be reliable and applicable to different scanners. This method can, for example, be applied to speed up, and possibly improve, guided prostate biopsies using magnetic resonance imaging–transrectal ultrasound fusion.
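The three evaluation metrics named above (pixel-wise accuracy, Jaccard index, and Hausdorff distance) are standard segmentation measures. The sketch below, which is illustrative and not the authors' implementation, shows how they can be computed for a pair of binary masks with NumPy. As a simplification, the Hausdorff distance here is taken over all mask pixels (in pixel units) rather than over extracted contours, and the toy masks are hypothetical.

```python
import numpy as np

def pixel_accuracy(pred, gt):
    # Fraction of pixels where the predicted mask agrees with the ground truth.
    return float(np.mean(pred == gt))

def jaccard_index(pred, gt):
    # Intersection over union of the two binary masks.
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter / union) if union else 1.0

def hausdorff_distance(pred, gt):
    # Symmetric Hausdorff distance between the two pixel sets
    # (simplified: over all mask pixels, not just the boundary).
    a = np.argwhere(pred).astype(float)
    b = np.argwhere(gt).astype(float)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float(max(d.min(axis=1).max(), d.min(axis=0).max()))

# Toy example: two partially overlapping 4x4 squares in an 8x8 grid.
pred = np.zeros((8, 8), dtype=bool); pred[2:6, 2:6] = True
gt = np.zeros((8, 8), dtype=bool); gt[3:7, 3:7] = True

print(pixel_accuracy(pred, gt))      # 50/64 matching pixels
print(jaccard_index(pred, gt))       # 9 overlap / 23 union
print(hausdorff_distance(pred, gt))  # sqrt(2): diagonal corner offset
```

In the study, distances would additionally be converted from pixels to millimetres using the scanner's pixel spacing.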

Original language: English
Pages (from-to): 78-85
Number of pages: 8
Journal: European Urology Focus
Volume: 7
Issue number: 1
Early online date: 23 Apr 2019
DOIs
Publication status: Published - Jan 2021

Keywords

  • Deep learning
  • Prostate cancer
  • Segmentation
  • Ultrasound
  • Magnetic resonance imaging–transrectal ultrasound fusion biopsy
