Graph attention networks for segment labeling in coronary artery trees

Nils Hampe, Jelmer M. Wolterink, Carlos Collet, Nils Planken, Ivana Išgum

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Accurately labeled segments of coronary artery trees are important for diagnostic reporting of coronary artery disease. As current automatic reporting tools do not consider anatomical segment labels, accurate automatic solutions for deriving these labels would be of great value. We propose an automatic method for labeling segments in coronary artery trees represented by centerlines automatically extracted from coronary CT angiography (CCTA) images. Using the connectivity between the centerlines, we construct a tree graph. Coronary artery segments are defined as edges of this graph and characterized by location and geometry features. The constructed coronary artery tree is transformed into a line graph and used as input to a graph attention network, which is trained to classify the labels of the coronary artery segments. The method was evaluated on 71 CCTA images, achieving an F1-score of 92.4% averaged over all patients and segments. The results indicate that graph attention networks are suitable for coronary artery tree labeling.
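
The pipeline described in the abstract (tree graph from centerlines, segments as edges, line-graph transform, graph attention network as the segment classifier) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes PyTorch Geometric, and the feature dimensionality, layer widths, and number of segment classes are placeholder values.

# Minimal sketch (not the authors' code) of the described pipeline,
# using PyTorch Geometric; sizes and class count are illustrative assumptions.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.transforms import LineGraph
from torch_geometric.nn import GATConv

NUM_SEGMENT_CLASSES = 10   # assumed number of anatomical segment labels
EDGE_FEATURES = 16         # assumed location/geometry features per segment

# Toy coronary tree: 4 centerline nodes, 3 segments (directed edges).
tree = Data(
    x=torch.zeros(4, 1),                              # placeholder node features
    edge_index=torch.tensor([[0, 1, 1], [1, 2, 3]]),  # bifurcation at node 1
    edge_attr=torch.randn(3, EDGE_FEATURES),          # per-segment features
)

# Line-graph transform: segments (edges) become nodes, so the GAT
# predicts one label per coronary segment.
line_graph = LineGraph(force_directed=True)(tree)

class SegmentGAT(torch.nn.Module):
    def __init__(self, in_dim, hidden=32, heads=4, num_classes=NUM_SEGMENT_CLASSES):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden, heads=heads)
        self.gat2 = GATConv(hidden * heads, num_classes, heads=1)

    def forward(self, data):
        h = F.elu(self.gat1(data.x, data.edge_index))
        return self.gat2(h, data.edge_index)          # logits per segment

model = SegmentGAT(EDGE_FEATURES)
logits = model(line_graph)                            # shape: [num_segments, num_classes]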
Original language: English
Title of host publication: Medical Imaging 2021
Subtitle of host publication: Image Processing
Editors: Ivana Išgum, Bennett A. Landman
Publisher: SPIE
Volume: 11596
ISBN (Electronic): 9781510640214
DOIs
Publication status: Published - 2021
Event: Medical Imaging 2021: Image Processing - Virtual, Online, United States
Duration: 15 Feb 2021 – 19 Feb 2021

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 11596

Conference

Conference: Medical Imaging 2021: Image Processing
Country/Territory: United States
City: Virtual, Online
Period: 15/02/2021 – 19/02/2021

Keywords

  • Artery labeling
  • Cardiac CT angiography
  • Coronary arteries
  • Graph attention networks
  • Graph convolutional networks
