Early detection of breast cancer, one of the leading causes of cancer death among women in the US, is key to any strategy designed to reduce breast cancer mortality. Breast self-examination (BSE) is considered the most cost-effective approach available for early breast cancer detection because it is simple and non-invasive, and a large fraction of breast cancers today are in fact found by patients using this technique. In BSE, the patient should use a proper search strategy to cover the whole breast region in order to detect all possible tumors. At present there is no objective approach or clinical data for evaluating the effectiveness of a particular BSE strategy. Even if a particular strategy is determined to be the most effective, training women to use it is still difficult because there is no objective way for them to know whether they are doing it correctly.

We have developed a system that uses vision-based motion tracking technology to gather quantitative data about the breast palpation process for analysis of the BSE technique. By tracking the positions of the fingers, the system can provide the first objective quantitative data about the BSE process, and thus can improve our knowledge of the technique and help analyze its effectiveness. By visually displaying the touched positions to the patient as the BSE is being conducted, the system provides interactive feedback and serves as a prototype for a computer-based BSE training system. We propose to place color features on the fingernails and track these features, because in breast palpation the background is the breast itself, which is similar in color to the hand; this can hinder the efficiency of other feature types when real-time performance is required. To simplify the feature extraction process, a color transform is used instead of raw RGB values. Although the clinical environment will be well illuminated, normalization of the color attributes is applied to compensate for minor changes in illumination. A neighbor search is employed to ensure real-time performance, and the extracted features are always checked against a three-finger pattern topology to reject false features. After the features are detected in the images, the 3D positions of the colored fingers are computed using the stereo vision principle. In the experiments, a rate of 15 frames/second is obtained with an image size of 160 × 120 on an SGI Indy MIPS R4000 workstation. The system is robust and accurate, which confirms the performance and effectiveness of the proposed approach.

The system can be used to quantify and document the search strategy of the palpation. With real-time visual feedback, it can be used to train both patients and new physicians to improve their performance of palpation and thus improve the rate of breast tumor detection.
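To make the feature-extraction and 3D-recovery steps concrete, the sketch below illustrates chromaticity normalization, a windowed neighbor search for a colored fingernail marker, and depth recovery from disparity. It is a minimal sketch, not the exact implementation: the function names, the window size and color tolerance, and the rectified parallel-camera stereo model are illustrative assumptions rather than the parameters reported above.

```python
import numpy as np

def normalize_chromaticity(rgb):
    """Map RGB pixels to normalized chromaticity (r, g) = (R, G) / (R+G+B),
    which reduces sensitivity to changes in illumination intensity."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=-1, keepdims=True) + 1e-6   # avoid division by zero
    return rgb[..., :2] / s                       # (r, g); b = 1 - r - g is redundant

def find_marker(image, prev_xy, target_rg, window=15, tol=0.05):
    """Neighbor search: look only in a (2*window+1)^2 region around the marker's
    previous position for pixels whose normalized color is near the target,
    and return the centroid of the matching pixels (or None if none match)."""
    h, w, _ = image.shape
    x0, y0 = prev_xy
    ys = slice(max(0, y0 - window), min(h, y0 + window + 1))
    xs = slice(max(0, x0 - window), min(w, x0 + window + 1))
    rg = normalize_chromaticity(image[ys, xs])
    mask = np.all(np.abs(rg - np.asarray(target_rg)) < tol, axis=-1)
    if not mask.any():
        return None
    yy, xx = np.nonzero(mask)
    return (int(xs.start + xx.mean()), int(ys.start + yy.mean()))

def triangulate(x_left, x_right, y, focal_px, baseline):
    """Stereo vision principle for rectified, parallel cameras: depth from
    disparity, Z = f*b/d, then back-projection to (X, Y, Z) in the
    left-camera frame.  Disparity must be positive for points in front
    of the cameras; image coordinates are relative to the principal point."""
    disparity = x_left - x_right
    Z = focal_px * baseline / disparity
    return (x_left * Z / focal_px, y * Z / focal_px, Z)
```

Restricting the color match to a small window centered on each fingertip's position in the previous frame keeps the per-frame cost low, which is what makes the real-time rates quoted above plausible on modest hardware; the three-finger topology check would then be applied to the returned centroids before triangulation.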