Augmented reality in surgery: On the development of real-time interventional planning and navigation for neurosurgical and orthopedic use cases: bench-top to clinical evaluation

Research output: Thesis (PhD Thesis)

Abstract

Computer-aided navigation (CAN) is a surgical technology that allows a surgeon to use patient medical image data as a map to guide both surgical planning and execution. It comprises several interconnected processes: visualization of 3D medical image data, tracking of surgical instrumentation, definition of a virtual coordinate system around the patient, and alignment of the image data to the patient.
Despite its quantifiable benefits, the technology is often not used due to its size, cost, and the unintuitive visualization of 3D patient data as 2D black-and-white images. Augmented reality (AR) devices often integrate the requisite hardware for CAN into a compact, mobile head-mounted device (HMD) and allow the surgeon to view complex 3D data as a “hologram” overlying the patient. This work addresses the technical limitations of such low-cost AR hardware with respect to tracking performance and presents evidence supporting its use in both neurosurgical and orthopedic domains.
Initial work focused on quick-response (QR) code tracking using the device’s front-facing red-green-blue (RGB) color sensor, allowing a stable coordinate system to be defined from the 3D pose of a static QR marker in space. This served as a proof of concept and improved the spatial localization of the AR visualization to a mean perceived spatial drift of 1.4 mm in a dynamic tracking environment.
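As an illustration of this approach, the sketch below shows how the 3D pose of a square fiducial marker can be recovered from a single RGB frame using OpenCV’s QR detector and a Perspective-n-Point solve. The function name, marker size, and calibration inputs are illustrative assumptions and do not reflect the exact thesis implementation.

import cv2
import numpy as np

def estimate_qr_pose(frame, camera_matrix, dist_coeffs, marker_size_m=0.05):
    """Estimate the 6-DoF pose of a static QR marker from one RGB frame.

    Returns (rvec, tvec) mapping marker coordinates to camera coordinates,
    or None if no marker is detected. The marker side length is assumed known.
    """
    detector = cv2.QRCodeDetector()
    found, corners = detector.detect(frame)
    if not found or corners is None:
        return None

    # 3D corner coordinates of the marker in its own frame
    # (planar square centred at the origin, z = 0).
    s = marker_size_m / 2.0
    object_points = np.array(
        [[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]], dtype=np.float32
    )
    image_points = corners.reshape(-1, 2).astype(np.float32)

    # Perspective-n-Point recovers the marker pose in the camera frame;
    # this pose can then serve as the stable reference coordinate system.
    ok, rvec, tvec = cv2.solvePnP(
        object_points, image_points, camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_IPPE_SQUARE,
    )
    return (rvec, tvec) if ok else None

Once the marker pose is known, the perceived drift of a hologram anchored to it can be quantified by comparing its rendered position against a physical reference over time.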
Building on this work, monocular infrared (IR) tracking was developed for pose estimation of existing surgical instrumentation, providing an improved method for establishing a reference coordinate system and a mechanism for precise user input. Compared with the earlier videometric tracking, the transition to the device’s IR sensor provided a greater tracking field of view (FoV) and a more favorable sensor orientation. This tracking solution was validated against a Vicon motion capture system and demonstrated a pose estimation error of 0.78 mm ± 0.74 mm and 0.84° ± 0.64°.
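The reported accuracy can be understood through a standard pose-error metric. Below is a minimal sketch, assuming the estimated and ground-truth (Vicon) poses are available as 4x4 homogeneous transforms in a common frame with translations in millimetres; the function and variable names are illustrative.

import numpy as np

def pose_error(T_est, T_gt):
    """Translation (mm) and rotation (deg) error between two 4x4 poses."""
    # Translation error: Euclidean distance between the two origins.
    trans_err = np.linalg.norm(T_est[:3, 3] - T_gt[:3, 3])

    # Rotation error: angle of the relative rotation R_gt^T @ R_est.
    R_rel = T_gt[:3, :3].T @ T_est[:3, :3]
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    rot_err = np.degrees(np.arccos(cos_angle))
    return trans_err, rot_err

# Errors such as 0.78 mm ± 0.74 mm and 0.84° ± 0.64° correspond to the mean ±
# standard deviation of these values over all validation frames, e.g.:
# errors = [pose_error(T_e, T_g) for T_e, T_g in zip(estimated_poses, vicon_poses)]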
Following this, phantom trials in navigated external ventricular drain (EVD) placement, total shoulder arthroplasty, and total hip arthroplasty were performed. The results demonstrated a shorter learning curve for the former technique and improved outcomes for the latter two when compared with traditional non-navigated techniques. Moreover, AR data registration accuracy was found to be comparable to that of modern CAN systems.
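For context, image-to-patient registration in such systems is commonly performed as a paired-point rigid alignment. The following is a minimal SVD-based (Kabsch/Umeyama, no scaling) sketch, not necessarily the registration method used in the thesis; the fiducial registration error (FRE) it returns is the usual figure of merit compared across systems.

import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) mapping source points to target.

    source, target: (N, 3) arrays of paired fiducials, e.g. points picked in
    the image data and the corresponding points located on the patient.
    """
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)

    # Kabsch/Umeyama: SVD of the cross-covariance, with a reflection
    # correction so that R is a proper rotation (det R = +1).
    U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)

    # Fiducial registration error (FRE): RMS residual after alignment.
    fre = np.sqrt(np.mean(np.sum((source @ R.T + t - target) ** 2, axis=1)))
    return R, t, fre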
Clinical trials in both tumor resection planning and EVD placement were then performed to assess the efficacy of AR-CAN. In the former, AR-CAN demonstrated a reduction in preoperative planning time and superior lesion delineation when compared with current neuronavigation. Preliminary results for AR-navigated EVD placement demonstrate 82 % optimal (grade I), 18 % sub-optimal (grade II), and 0 % grade III placements. These results currently outperform those reported in the literature for single-attempt insertions.
Original language: English
Awarding Institution
  • Vrije Universiteit Brussel
Supervisors/Advisors
  • Vandemeulebroucke, Jef, Supervisor
  • Jansen, Bart, Co-Supervisor
  • Duerinck, Johnny, Co-Supervisor
  • Scheerlinck, Thierry, Co-Supervisor
Award date: 30 May 2024
Place of Publication: Brussels
Publisher
Print ISBNs: 9789464948257
Publication status: Published - 2024
