Misalignment of teeth or jaws can impair the ability to chew or speak, increase the risk of gum disease or tooth decay, and potentially affect a person's (psychological) well-being. Orthodontic treatments of misaligned teeth are complex procedures that employ dental braces to apply forces that move the teeth or jaws into their correct position. Photographs are typically used to document the treatment, and an automatic analysis of those photographs could support the decision-making and monitoring process. In this paper, we propose an automatic model-based end-to-end 3-D reconstruction approach for the teeth from five photographs with predefined viewing directions (i.e., the photographs used in orthodontic treatment documentation). It uses photo- or view-specific 2-D coupled shape models to extract the teeth contours from the images. The shape reconstruction is then carried out by a deformation-based reconstruction approach that utilizes 3-D coupled shape models and minimizes a silhouette-based loss. The optimal model parameters are determined by an optimization that maximizes the overlap between the projected 2-D outlines of the individual teeth of the 3-D model and the contours extracted from the corresponding photograph. After that, the point displacements between the projected outline and the segmented contour are used to iteratively deform the 3-D shape model of each tooth for all five views. Back-projection into shape space ensures that the 3-D coupled shape model consists of (statistically) valid teeth. Evaluation on 22 data sets shows promising results with an average symmetric surface distance of 0.848 mm and an average Dice coefficient of 0.659.
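The back-projection into shape space mentioned above can be illustrated with a small sketch. The abstract does not give implementation details, so the following is a generic PCA-based shape-model projection under assumed conventions: shapes are flattened point vectors, and mode coefficients are clamped to a plausibility interval so the deformed tooth remains statistically valid.

```python
import numpy as np

def project_to_shape_space(shape, mean_shape, modes, eigenvalues, n_std=3.0):
    """Back-project a deformed tooth shape onto a PCA shape model.

    shape, mean_shape: (n_points * 3,) flattened point coordinates.
    modes: (n_points * 3, n_modes) orthonormal principal components.
    eigenvalues: (n_modes,) variances of the modes.
    Coefficients are clamped to +/- n_std standard deviations so the
    result stays within the statistically valid shape population.
    """
    b = modes.T @ (shape - mean_shape)   # shape coefficients
    limit = n_std * np.sqrt(eigenvalues)
    b = np.clip(b, -limit, limit)        # enforce plausibility bounds
    return mean_shape + modes @ b
```

The clamping threshold of three standard deviations is a common convention for statistical shape models, not a value taken from the paper.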
Ultrasound (U/S) is a fast and inexpensive imaging modality that is used for the examination of various anatomical structures, e.g. the kidneys. One important task for automatic organ tracking or computer-aided diagnosis is the identification of the organ region. During this process, exact information about the transducer location and orientation is usually unavailable, which renders the implementation of such automatic methods exceedingly challenging. In this work we introduce a new automatic method for the detection of the kidney in 3D U/S images. This novel technique analyses the U/S image data along virtual scan lines, searching for the characteristic texture changes that occur when entering and leaving the symmetric tissue regions of the renal cortex. A subsequent feature accumulation along a second scan direction produces a 2D heat map of renal cortex candidates, from which the kidney location is extracted. The strongest candidate as well as its counterpart are extracted by heat-map intensity ranking and renal cortex size analysis; this process exploits the gap in the heat map caused by the renal pelvis region. Substituting the renal pelvis detection with this combined cortex tissue feature increases the detection robustness. In contrast to model-based methods that generate characteristic pattern matches, our method is simpler and therefore faster. An evaluation performed on 61 3D U/S data sets showed that in the 55 cases with no or only minor shadowing, the kidney location could be correctly identified.
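The scan-line heat-map idea can be sketched in a few lines. This is a deliberately simplified toy, not the authors' feature: it scores a scan-line position when a rising intensity edge (entering tissue) is immediately followed by a falling one (leaving it), and accumulates the scores along a second direction into a 2D map.

```python
import numpy as np

def cortex_heat_map(volume, grad_thresh=0.2):
    """Toy sketch of scan-line feature accumulation (assumed detail).

    volume: (z, y, x) intensities; virtual scan lines run along x.
    Rising/falling gradient pairs stand in for the entry/exit texture
    change; scores are accumulated along z into a 2D (y, x) heat map.
    """
    grad = np.diff(volume, axis=2, prepend=volume[..., :1])
    enter = grad > grad_thresh            # entering brighter tissue
    leave = -grad > grad_thresh           # leaving it again
    # score a voxel when an entry is directly followed by an exit
    score = (enter[..., :-1] & leave[..., 1:]).astype(float)
    score = np.pad(score, ((0, 0), (0, 0), (0, 1)))
    return score.sum(axis=0)              # accumulate along z
```

A real implementation would use a texture measure and a distance window between the entry and exit edges rather than adjacent voxels; the threshold here is arbitrary.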
Image-guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist that use electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve a higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the number of calibration steps to a single calibration procedure, we provide a calibration method that is equivalent yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we are able to achieve higher accuracy while additionally reducing the overall calibration complexity.
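The error-accumulation argument rests on the fact that a chained calibration is a product of several independently estimated rigid transforms, each contributing its own error, whereas a single calibration estimates the product directly. A minimal sketch of such a chain with homogeneous 4x4 transforms (generic, not the paper's specific coordinate systems):

```python
import numpy as np

def compose(*transforms):
    """Compose homogeneous 4x4 transforms left to right.

    In a chained calibration, the needle-to-image mapping is the
    product of several estimated factors, so each factor's estimation
    error propagates into the result; a single-step calibration
    estimates this product as one transform.
    """
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out
```

With N noisy factors, first-order error grows roughly linearly in N, which is why collapsing the chain into one directly estimated transform helps.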
The purpose of this paper is to present a detailed description of our real-time navigation system for computer-assisted surgery. The system was developed with laparoscopic partial nephrectomies as a first application scenario. The main goal of the application is to enable tracking of the tumor position and orientation during surgery. Our system is based on ultrasound-to-CT registration and electromagnetic tracking. The basic idea is to process the tracking information to generate an augmented reality (AR) visualization of a tumor model in the image of a laparoscopic camera. This enhances the surgeon's view of the current scene and thereby increases safety during surgery. So far we have applied our system in vitro in two phantom trials with a surgeon, which yielded promising results.
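The AR visualization described above amounts to mapping tumor-model points through the registration and tracking transforms into the laparoscopic camera image. The following is a hypothetical sketch of that pipeline; the transform names and the pinhole camera matrix are illustrative assumptions, not the authors' API.

```python
import numpy as np

def project_tumor_point(p_tumor, T_ct_from_tumor, T_tracker_from_ct,
                        T_cam_from_tracker, K):
    """Map a tumor-model point into the camera image (illustrative).

    The point is chained through the CT registration, the
    electromagnetic tracker, and the camera pose, then projected
    with a pinhole intrinsic matrix K (3x3).
    """
    p = np.append(p_tumor, 1.0)  # homogeneous coordinates
    p_cam = T_cam_from_tracker @ T_tracker_from_ct @ T_ct_from_tumor @ p
    uv = K @ p_cam[:3]
    return uv[:2] / uv[2]        # perspective division -> pixel coords
```

In the real system these transforms are updated continuously from the tracker so the overlay follows the moving anatomy.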