An interpretable deep learning approach for lesion detection and segmentation on whole-body [18F]FDG PET/CT

Research output: Chapter in Book/Report/Conference proceedingConference paper

Abstract

Automated lesion segmentation is essential to provide fast, reproducible tumor load estimates. Though deep learning methods have achieved unprecedented results in this field, they are often difficult to interpret, hampering their potential integration in the clinic. An interpretable deep learning approach is proposed for segmenting melanoma lesions on whole-body fluorine-18 fluorodeoxyglucose ([18F]FDG) positron emission tomography/computed tomography (PET/CT). It consists of an automated PET thresholding step to identify FDG-avid regions, followed by a three-channel nnU-Net that considers the binary mask in addition to the PET and CT images. This segmentation step differentiates healthy from malignant tissue and removes the restriction on lesion boundaries imposed by the thresholding. The proposed method, trained on 267 images and evaluated on two sets acquired at the same institute, achieved mean Dice similarity coefficients (DSC) of 0.779 and 0.638 with mean absolute volume differences of 15.2 mL and 22.0 mL. The DSC proved significantly higher compared to a direct, two-channel nnU-Net considering only the PET and CT. The same was observed when retraining and testing on subsets of the public data of the autoPET challenge, containing melanoma, lung cancer and lymphoma patients. In addition, overall results proved superior to a previously proposed two-step approach, in which a classification network categorized each component of increased tracer uptake as healthy or malignant. The proposed lesion segmentation method for whole-body [18F]FDG PET/CT incorporates prior thresholding information while allowing more flexibility in the lesion delineation than a pure thresholding approach and increased interpretability over a direct segmentation network.
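The pipeline described above — a PET thresholding step whose binary mask is fed to the network as a third channel alongside the PET and CT volumes, evaluated with the Dice similarity coefficient — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the fixed SUV cutoff of 2.5, the function names, and the channel ordering are all assumptions for demonstration purposes (the paper uses an automated thresholding step).

```python
import numpy as np


def threshold_pet(pet_suv, suv_cutoff=2.5):
    """Binary mask of FDG-avid voxels.

    A fixed SUV cutoff of 2.5 is a common convention used here for
    illustration; the paper's thresholding step is automated."""
    return (pet_suv >= suv_cutoff).astype(np.uint8)


def build_three_channel_input(pet_suv, ct_hu, suv_cutoff=2.5):
    """Stack PET, CT, and the threshold mask as network input channels."""
    mask = threshold_pet(pet_suv, suv_cutoff)
    return np.stack([pet_suv, ct_hu, mask.astype(pet_suv.dtype)], axis=0)


def dice(pred, ref):
    """Dice similarity coefficient between two binary volumes."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom else 1.0
```

The three-channel input lets the segmentation network start from the thresholded candidate regions while remaining free to grow or shrink lesion boundaries, which is the flexibility the abstract contrasts with a pure thresholding approach.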
Original language: English
Title of host publication: Proc. SPIE 12926, Medical Imaging 2024: Image Processing
Publisher: SPIE
Number of pages: 10
Volume: 12926
Publication status: Published - 2 Apr 2024
Event: SPIE Medical Imaging 2024
Duration: 18 Feb 2024 – 22 Feb 2024

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 12926
ISSN (Print): 1605-7422

Conference

Conference: SPIE Medical Imaging 2024
Period: 18/02/24 – 22/02/24

Bibliographical note

Publisher Copyright:
© 2024 SPIE.
