Early detection of at-risk keratoplasties and prediction of future corneal graft rejection from pre-diagnosis endothelial cell images

Naomi M. Joseph, Beth Ann Benetz, Harry Menegay, Silke Oellerich, Lamis Baydoun, Gerrit Melles, Jonathan H. Lass, David Wilson

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

1 Citation (Scopus)


The status of the donor tissue post-keratoplasty (post-transplant), whether full or partial thickness, is currently assessed for health, function, and complications via clinical evaluations, including detection of visible signs of graft rejection on slit lamp biomicroscopy such as keratic precipitates or edema. Corneal endothelial cell (EC) images are used to indirectly assess the health of the cornea post-keratoplasty, with evidence that morphometric changes may occur prior to clinical signs of rejection. We extracted over 190 novel quantitative features from EC images acquired 1-12 months prior to patients' rejection diagnosis date and used random forest (RF) classifiers to predict future rejection. We segmented the cell borders of 171 EC images using a semi-automated approach: deep learning U-Net segmentation followed by guided manual correction. Following segmentation, we extracted novel quantitative features that robustly represented the cellular morphology in the EC images. We trained and tested an RF classifier using 5-fold cross-validation and minimal Redundancy Maximal Relevance (mRMR) feature selection. From the 5-fold cross-validation, we report an area under the receiver operating characteristic curve (AUC) of 0.87 ± 0.03, a sensitivity of 0.86 ± 0.12, and a specificity of 0.86 ± 0.10. These results suggest we can accurately predict a patient's future graft rejection 1-12 months prior to diagnosis, enabling clinicians to intervene earlier by modifying and/or instituting topical corticosteroid therapy, with the possibility of lowering graft rejection failures. Success of this classifier could reduce health care costs, patient discomfort, vision loss, and the need for repeat keratoplasty.
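The evaluation loop described in the abstract (an RF classifier scored with 5-fold cross-validation and AUC) can be sketched as follows. This is a minimal illustration only: the data are synthetic stand-ins with the same shape as the study (171 images, ~190 features), and the paper's actual morphometric features and mRMR feature selection step are not reproduced here.

```python
# Hedged sketch of RF + 5-fold CV + AUC evaluation, using synthetic data.
# The 171 x 190 feature matrix and binary labels are random placeholders,
# NOT the study's morphometric features or rejection outcomes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(171, 190))    # 171 EC images x ~190 quantitative features
y = rng.integers(0, 2, size=171)   # 1 = future rejection, 0 = no rejection

aucs = []
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X, y):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    # Score each held-out fold by the area under the ROC curve.
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"AUC: {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```

On random labels this yields an AUC near chance (0.5); the study's reported 0.87 ± 0.03 reflects its real features and labels. Per-fold means and standard deviations are how the abstract's "0.87 ± 0.03" style of result is typically computed.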

Original language: English
Title of host publication: Medical Imaging 2021
Subtitle of host publication: Computer-Aided Diagnosis
Editors: Maciej A. Mazurowski, Karen Drukker
Publisher: SPIE Press
ISBN (Electronic): 9781510640238
Publication status: Published - 2021
Event: Medical Imaging 2021: Computer-Aided Diagnosis - Virtual, Online, United States
Duration: 15 Feb 2021 - 19 Feb 2021

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
ISSN (Print): 1605-7422


Conference: Medical Imaging 2021: Computer-Aided Diagnosis
Country/Territory: United States
City: Virtual, Online

Bibliographical note

Funding Information:
This project was supported by the National Eye Institute through Grant No. NIH R21 EY02949801 (DW and BAB) and NIH U10 EY012358 and U10 EY020798 (JHL and BAB). The grants were obtained via collaboration between Case Western Reserve University, University Hospitals Eye Institute, and Cornea Image Analysis Reading Center. The content of this report was solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. There are no conflicts of interest. This work made use of the High-Performance Computing Resource in the Core Facility for Advanced Research Computing at Case Western Reserve University. The veracity guarantor, Chaitanya Kolluru, affirms to the best of his knowledge that all aspects of this paper are accurate. This research was conducted in space renovated using funds from an NIH construction grant (C06 RR12463) awarded to Case Western Reserve University.

Publisher Copyright:
© COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.


  • Cornea
  • Feature extraction
  • Image processing
  • Machine learning


