Efficient high cone-angle artifact reduction in circular cone-beam CT using deep learning with geometry-aware dimension reduction

Jordi Minnema*, Maureen Van Eijnatten, Henri Der Sarkissian, Shannon Doyle, Juha Koivisto, Jan Wolff, Tymour Forouzanfar, Felix Lucka, Kees Joost Batenburg

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


High cone-angle artifacts (HCAAs) appear frequently in circular cone-beam computed tomography (CBCT) images and can heavily affect diagnosis and treatment planning. To reduce HCAAs in CBCT scans, we propose a novel deep learning approach that efficiently reduces the three-dimensional (3D) nature of HCAAs to a set of two-dimensional (2D) problems. Specifically, we exploit the relationship between HCAAs and the rotational scanning geometry by training a convolutional neural network (CNN) on image slices that were radially sampled from CBCT scans. We evaluated this approach using a dataset of input CBCT scans affected by HCAAs and high-quality, artifact-free target CBCT scans. Two different CNN architectures were employed, namely U-Net and a mixed-scale dense CNN (MS-D Net). The artifact reduction performance of the proposed approach was compared to that of a Cartesian slice-based deep learning approach, in which a CNN was trained to remove HCAAs from Cartesian slices. In addition, all processed CBCT scans were segmented to investigate the impact of HCAA reduction on the quality of CBCT image segmentation. We demonstrate that the proposed deep learning approach with geometry-aware dimension reduction greatly reduces HCAAs in CBCT scans and outperforms the Cartesian slice-based deep learning approach. Moreover, the proposed artifact reduction approach markedly improves the accuracy of the subsequent segmentation task compared to the Cartesian slice-based workflow.
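The radial sampling idea described in the abstract can be illustrated with a minimal sketch: instead of taking axial (Cartesian) slices, 2D slices are extracted along half-planes that contain the scanner's rotation axis, so each training slice sees the same cone-angle geometry. The function below is a hypothetical illustration using nearest-neighbour sampling, not the authors' implementation; the array layout `(z, y, x)` and the helper name are assumptions.

```python
import numpy as np

def radial_slice(volume, theta, n_radii=None):
    """Sample a 2D slice from a 3D volume (z, y, x) along a half-plane
    containing the rotation (z) axis, at in-plane angle `theta`.

    Nearest-neighbour sampling for simplicity; a real pipeline would
    likely interpolate (e.g. bilinearly) instead.
    """
    nz, ny, nx = volume.shape
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0  # in-plane rotation centre
    if n_radii is None:
        n_radii = min(ny, nx) // 2
    r = np.arange(n_radii)                   # radial coordinate
    x = np.clip(np.rint(cx + r * np.cos(theta)).astype(int), 0, nx - 1)
    y = np.clip(np.rint(cy + r * np.sin(theta)).astype(int), 0, ny - 1)
    return volume[:, y, x]                   # shape (nz, n_radii)

# Radially sample a stack of slices covering the full rotation:
vol = np.random.rand(64, 128, 128)
angles = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)
slices = [radial_slice(vol, a) for a in angles]
```

Each resulting `(nz, n_radii)` slice can then be fed to a 2D CNN (e.g. U-Net or MS-D Net, as in the paper), and the processed slices re-assembled into a volume by the inverse mapping.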

Original language: English
Article number: 135015
Journal: Physics in Medicine and Biology
Issue number: 13
Publication status: Published - 7 Jul 2021