Multimodal registration of intraoperative ultrasound with preoperative contrast-enhanced computed tomography (CT) imaging is the basis for image-guided percutaneous hepatic interventions. Currently, the surgeon performs a rigid registration manually, using vessel structures and other anatomical landmarks for visual guidance. We have previously presented an approach for automating this intraoperative registration step, based on establishing bijective correspondences between the vessel structures via automatic graph matching [1]. This paper describes our method for refining and expanding the matched vessel graphs, which yields a large number of bijective correspondences. Based on these landmarks, we extend our method to a fully deformable registration. Our system was applied successfully to CT and ultrasound data of nine patients, which are studied in this paper. The refinement method raises the mean number of corresponding vessel points from 9.6 after graph matching to 70.2, enabling the computation of a smooth deformation field. Furthermore, we show that the computed deformation improves registration accuracy for 3 of the 4 chosen target vessels in pre-/postoperative CT, with a mean accuracy improvement of 44%.
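The step from sparse landmark correspondences to a smooth deformation field can be sketched with a thin-plate-spline interpolant over the matched vessel points. This is a minimal illustration, not the authors' implementation; the landmark arrays below are synthetic stand-ins for the matched CT/ultrasound vessel points.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic stand-ins for corresponding vessel landmarks (illustrative only):
# ~70 matched points, as reported after the refinement step.
rng = np.random.default_rng(0)
ct_points = rng.uniform(0.0, 100.0, size=(70, 3))           # landmarks in CT space
us_points = ct_points + rng.normal(0.0, 2.0, size=(70, 3))  # matched ultrasound points

# Fit a thin-plate-spline interpolator mapping each CT landmark to its
# displacement toward the matched ultrasound landmark; evaluating it at
# arbitrary positions yields a smooth dense deformation field.
displacements = us_points - ct_points
tps = RBFInterpolator(ct_points, displacements, kernel='thin_plate_spline')

# Warp an arbitrary CT point into ultrasound coordinates.
query = np.array([[50.0, 50.0, 50.0]])
warped = query + tps(query)
```

With the default zero smoothing, the field interpolates the landmark displacements exactly while varying smoothly in between, which is the property a deformable registration built on point correspondences relies on.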
Registration of intra-operative ultrasound with preoperative CT is highly desirable as a navigational aid for surgeons and
interventional radiologists. Image-based solutions generally achieve poor results because ultrasound and CT differ substantially in image appearance. A method is presented that uses surface information and tracked ultrasound to
improve registration results. Tracked ultrasound is combined with surface and image-based registration techniques to
register ultrasound to CT. Surface data is acquired using an optically tracked range sensor, for example a time-of-flight
camera. Range data is registered to CT using robust point-set registration; this registration provides an approximate
transformation from tracker to CT coordinates. The ultrasound probe is also optically tracked. The probe position and
surface-based registration provide a first estimate for the position of the ultrasound image in CT coordinates. This
estimate is subsequently refined by a final image-based registration stage. Initial tests using the Coherent Point Drift algorithm for registering surface data to CT show favorable results. Tests using both simulated and real time-of-flight range data show good convergence over a wide domain of initial translational and rotational misalignment. Preliminary testing with time-of-flight surface data suggests that surface-to-CT registration can serve as an initial guess, enabling subsequent, more precise (but less robust) image-based methods for registering ultrasound images to CT. We believe this method will enable image-based algorithms to converge robustly to an optimal registration solution.
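The core of the surface-to-CT alignment is a rigid point-set registration. As a simplified, hedged sketch of that idea, the snippet below uses the closed-form Kabsch/Umeyama solution, which assumes point-to-point correspondences are already known; the Coherent Point Drift algorithm used in the paper additionally estimates soft correspondences and is robust to noise and outliers. All data here is synthetic.

```python
import numpy as np

def rigid_fit(source, target):
    """Least-squares rigid transform (R, t) with target ~= source @ R.T + t.

    Closed-form Kabsch/Umeyama fit over known correspondences -- a
    simplified stand-in for CPD, which also solves for correspondences.
    """
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    H = (source - mu_s).T @ (target - mu_t)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so the fit is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_t - R @ mu_s
    return R, t

# Illustrative check: recover a known rotation and translation.
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, size=(200, 3))      # synthetic "range sensor" surface
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.5])
moved = pts @ R_true.T + t_true                  # synthetic "CT" surface
R_est, t_est = rigid_fit(pts, moved)
```

The recovered `(R, t)` plays the role of the approximate tracker-to-CT transformation described above, which is then composed with the tracked probe pose to place the ultrasound image in CT coordinates.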
The purpose of this paper is to present a detailed description of our real-time navigation system for computer assisted
surgery. The system was developed with laparoscopic partial nephrectomies as a first application scenario.
The main goal of the application is to enable tracking of the tumor's position and orientation during surgery.
Our system is based on ultrasound-to-CT registration and electromagnetic tracking. The basic idea is to process
tracking information to generate an augmented reality (AR) visualization of a tumor model in the image of a
laparoscopic camera. This enhances the surgeon's view of the current scene and thereby improves safety
during surgery. So far we have applied our system in vitro in two phantom trials with a surgeon, which yielded
promising results.
We present an image-guided intervention system based on tracked 3D elasticity imaging (EI) to provide a novel
interventional modality for registration with pre-operative CT. The system can be integrated into both laparoscopic and
robotic partial nephrectomy scenarios, where this new use of EI makes exact intra-operative execution of pre-operative
planning possible. Quick acquisition and registration of 3D B-mode and 3D-EI volume data allow intra-operative
registration with CT and thus with pre-defined target and critical regions (e.g. tumors and vasculature). Their real-time
location information is then overlaid onto a tracked endoscopic video stream to help the surgeon avoid vessel damage
and still completely resect tumors including safety boundaries.
The presented system promises to increase the success rate for partial nephrectomies and potentially for a wide range of
other laparoscopic and robotic soft tissue interventions. This is enabled by the three components of robust real-time
elastography, fast 3D-EI/CT registration, and intra-operative tracking. With high-quality, robust strain imaging (through
a combination of parallelized 2D-EI, optimal frame-pair selection, and optimized palpation motions), kidney tumors that
were previously unregistrable, or sometimes even considered isoechoic with conventional B-mode ultrasound, can now be
imaged reliably in interventional settings. Furthermore, this allows planning CT data of kidney ROIs to be transformed
into the intra-operative setting with a markerless mutual-information-based registration, using EM sensors for
intra-operative motion tracking.
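Mutual information is the similarity measure that makes such markerless multimodal registration possible: it rewards any consistent statistical relationship between the intensities of the two modalities, not identical appearance. A minimal sketch of the computation via a joint intensity histogram (names and data are illustrative, not the authors' implementation):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information of two equally shaped images from a joint
    intensity histogram: MI = sum p(x,y) * log(p(x,y) / (p(x) p(y)))."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                    # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)          # marginal of image b
    nz = pxy > 0                                 # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Illustrative check: an image shares more information with a nonlinear
# intensity remapping of itself (a crude "other modality") than with noise.
rng = np.random.default_rng(2)
img = rng.normal(size=(64, 64))
remapped = np.tanh(img)              # monotone remapping, different appearance
noise = rng.normal(size=(64, 64))    # statistically independent image
```

In a registration loop, an optimizer adjusts the transform parameters to maximize this score between the warped CT ROI and the intra-operative EI volume.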
Overall, we present a complete procedure and its development, including new phantom models (both ex vivo and
synthetic) to validate image-guided technology and training, tracked elasticity imaging, real-time EI frame selection,
registration of CT with EI, and finally a real-time, distributed software architecture. Together, these components allow
the surgeon to concentrate on completing the intervention with less time pressure.