Static X-ray computed tomography (CT) volumes are often used as anatomic roadmaps during catheter-based cardiac
interventions performed under X-ray fluoroscopy guidance. These CT volumes provide a high-resolution depiction of
soft-tissue structures, but at only a single point within the cardiac and respiratory cycles. Augmenting these static CT
roadmaps with segmented myocardial borders extracted from live ultrasound (US) provides intra-operative access to
real-time dynamic information about the cardiac anatomy. In this work, using a customized segmentation method based
on a 3D active mesh, endocardial borders of the left ventricle were extracted from US image streams (4D data sets) at a
frame rate of approximately 5 frames per second. The coordinate systems for CT and US modalities were registered
using rigid body registration based on manually selected landmarks, and the segmented endocardial surfaces were
overlaid onto the CT volume. The root-mean-square fiducial registration error was 3.80 mm. The accuracy of the
segmentation was quantitatively evaluated in phantom and human volunteer studies via comparison with manual
tracings on 9 randomly selected frames using a finite-element model (the US image resolutions of the phantom and
volunteer data were 1.3 × 1.1 × 1.3 mm and 0.70 × 0.82 × 0.77 mm, respectively). This comparison yielded 3.70 ± 2.5
mm (approximately 3 pixels) root-mean-square error (RMSE) in the phantom study and 2.58 ± 1.58 mm (approximately 3
pixels) RMSE in a clinical study. The combination of static anatomical roadmap volumes and dynamic intra-operative
anatomic information will enable better guidance and feedback for image-guided minimally invasive cardiac
interventions.
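As a hedged illustration of the landmark-based rigid registration and fiducial-error reporting described above, the sketch below fits a rotation and translation to corresponding manually selected CT and US landmarks with a standard SVD-based least-squares method and then computes the RMS fiducial registration error. The abstract does not name a specific solver, so the Arun/Umeyama-style formulation, the function names, and the array layout are assumptions, not the authors' implementation.

```python
import numpy as np

def rigid_landmark_registration(us_points, ct_points):
    """Least-squares rigid (rotation + translation) fit mapping US landmarks
    onto their CT counterparts via an SVD-based closed-form solution.
    Both inputs are (N, 3) arrays of corresponding points."""
    us_centroid = us_points.mean(axis=0)
    ct_centroid = ct_points.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (us_points - us_centroid).T @ (ct_points - ct_centroid)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct_centroid - R @ us_centroid
    return R, t

def rms_fiducial_registration_error(us_points, ct_points, R, t):
    """Root-mean-square distance between transformed US landmarks and the
    corresponding CT landmarks (the fiducial registration error)."""
    residuals = (us_points @ R.T + t) - ct_points
    return np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))
```

With such a transform in hand, each segmented endocardial surface vertex can be mapped into CT coordinates with the same `R` and `t` before being overlaid on the roadmap volume.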
This work presents an integrated system for multimodality image guidance of minimally invasive medical procedures.
This software and hardware system offers real-time integration and registration of multiple image streams with
localization data from navigation systems. All system components communicate over a local area Ethernet network,
enabling rapid and flexible deployment configurations. As a representative configuration, we use X-ray fluoroscopy
(XF) and ultrasound (US) imaging. The XF imaging system serves as the world coordinate system, with gantry geometry
derived from the imaging system, and patient table position tracked with a custom-built measurement device using linear
encoders. An electromagnetic (EM) tracking system is registered to the XF space using a custom imaging phantom that
is also tracked by the EM system. The RMS fiducial registration error for the EM to X-ray registration was 2.19 mm,
and the RMS target registration error measured with an EM-tracked catheter was 8.81 mm. The US image stream is
subsequently registered to the XF coordinate system using EM tracking of the probe, following a calibration of the US
image within the EM coordinate system. We present qualitative results of the system in operation, demonstrating the
integration of live ultrasound imaging spatially registered to X-ray fluoroscopy with catheter localization using
electromagnetic tracking.
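To make the transform chain concrete, the following sketch (assumed, not taken from the paper) composes homogeneous transforms to carry a point from the US image frame through the probe's EM sensor and the EM tracker into the X-ray fluoroscopy world frame. The names T_image_to_sensor (one-time US probe calibration), T_sensor_to_em (live probe tracking) and T_em_to_xf (phantom-based EM-to-XF registration) are hypothetical placeholders for the corresponding system transforms.

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def us_point_to_xf(p_image, T_image_to_sensor, T_sensor_to_em, T_em_to_xf):
    """Map a point from US image coordinates into the X-ray fluoroscopy
    (world) frame by chaining calibration, tracking and registration
    transforms: image -> probe sensor -> EM tracker -> XF."""
    p = np.append(p_image, 1.0)                       # homogeneous coordinates
    T_image_to_xf = T_em_to_xf @ T_sensor_to_em @ T_image_to_sensor
    return (T_image_to_xf @ p)[:3]
```

In a configuration like the one described, T_em_to_xf would be estimated once from the tracked imaging phantom, T_image_to_sensor once from probe calibration, and T_sensor_to_em streamed per frame from the EM tracker.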
We developed an augmented reality (AR) navigation system for MR-guided interventions. A head-mounted display provides a real-time stereoscopic video view of the patient, which is augmented with three-dimensional medical information to perform MR-guided needle placement procedures. In addition to the MR image information, we augment the scene with 3D graphics representing the needle itself and a forward extension of the needle. During insertion, the needle can be observed virtually at its actual location in real time, supporting the interventional procedure in an efficient and intuitive way. In this paper we report quantitative results of AR-guided needle placement procedures on gel phantoms with embedded targets of 12 mm and 6 mm diameter; we furthermore evaluate our first animal experiment, involving needle insertion into deep-lying anatomical structures of a pig.
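As a minimal sketch of the needle-extension overlay described above (assuming the tracking system reports the needle hub and tip positions; the function name and the default extension length are hypothetical, not values from the paper), the virtual forward extension is simply a segment continuing the needle axis beyond the tip:

```python
import numpy as np

def needle_extension_endpoint(hub, tip, extension_length=50.0):
    """Return the endpoint of a virtual forward extension of the needle axis,
    extension_length millimetres beyond the tracked tip position."""
    hub = np.asarray(hub, dtype=float)
    tip = np.asarray(tip, dtype=float)
    axis = (tip - hub) / np.linalg.norm(tip - hub)    # unit vector along the needle
    return tip + extension_length * axis
```

The segment from the tip to this endpoint can then be rendered in the stereoscopic scene alongside the graphics representing the needle itself.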