KEYWORDS: 3D modeling, Visualization, Visual process modeling, 3D image processing, Performance modeling, 3D vision, 3D visualizations, Eye, Eye models, 3D acquisition
Compared to the good performance achieved by many 2D visual attention models, predicting salient regions of a 3D scene remains challenging. An efficient approach is to exploit existing models designed for 2D content; however, the visual conflicts caused by binocular disparity and the changes in viewing behavior during 3D viewing must be addressed. To cope with these, the present paper proposes a simple framework for extending 2D attention models to 3D images, and also evaluates center-bias in the 3D-viewing condition. To validate the results, a database was created containing the eye movements of 35 subjects recorded during free viewing of eighteen 3D images and their corresponding 2D versions. Fixation density maps indicate a weaker center-bias when viewing 3D images. Moreover, objective metric results demonstrate the efficiency of the proposed model and the large added value of taking center-bias into account in computational modeling of 3D visual attention.
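As a rough illustration of the kind of extension described above (a minimal sketch, not the paper's actual framework), the example below combines a precomputed 2D saliency map with a normalized depth map and a Gaussian center-bias prior. The linear weighting scheme, the sigma ratio, and the function names are assumptions made for this example only.

```python
import numpy as np

def center_bias_map(h, w, sigma_ratio=0.25):
    """Isotropic Gaussian centered on the image, a common center-bias prior."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    sigma = sigma_ratio * min(h, w)
    g = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))
    return g / g.max()

def extend_2d_saliency(saliency_2d, depth_map, w_depth=0.5, w_center=0.3):
    """Blend a 2D saliency map with normalized depth, then apply a center-bias prior.

    saliency_2d, depth_map: 2D arrays in [0, 1] of identical shape.
    The weights are illustrative values, not taken from the paper.
    """
    h, w = saliency_2d.shape
    cb = center_bias_map(h, w)
    combined = (1.0 - w_depth) * saliency_2d + w_depth * depth_map
    combined = (1.0 - w_center) * combined + w_center * cb
    return combined / combined.max()
```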
This paper studies the influence of a monocular depth cue, blur, on the apparent depth of stereoscopic scenes. When 3D images are shown on a planar stereoscopic display, binocular disparity becomes a pre-eminent depth cue, but it simultaneously induces a conflict between accommodation and vergence, which is often considered a main cause of visual discomfort. If we limit this discomfort by decreasing the disparity, the apparent depth also decreases. We propose to decrease the (binocular) disparity of 3D presentations and to reinforce (monocular) cues to compensate for the loss of perceived depth, keeping the apparent depth unaltered. We conducted a subjective experiment using a two-alternative forced-choice task: observers were required to identify the larger perceived depth in a pair of 3D images with and without blur. By fitting the results to a psychometric function, we obtained points of subjective equality in terms of disparity. We found that when blur is added to the background of the image, viewers perceive larger depth compared to images without any blur in the background. The increase in perceived depth can be considered a function of the relative distance between the foreground and background, while it is insensitive to the distance between the viewer and the depth plane at which the blur is added.
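The point-of-subjective-equality analysis mentioned above is typically performed by fitting a cumulative-Gaussian psychometric function to the proportion of "deeper" responses at each disparity level. The sketch below shows such a fit under that assumption; the disparity levels and response proportions are hypothetical and are not data from the experiment.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    """Cumulative Gaussian: probability of judging the blurred image as deeper."""
    return norm.cdf(x, loc=pse, scale=sigma)

# Hypothetical data: disparity offsets (arcmin) of the comparison stimulus and the
# proportion of "blurred image looks deeper" responses at each level.
disparity = np.array([-10.0, -5.0, 0.0, 5.0, 10.0, 15.0])
p_deeper = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])

params, _ = curve_fit(psychometric, disparity, p_deeper, p0=[0.0, 5.0])
pse, sigma = params
print(f"Point of subjective equality: {pse:.2f} arcmin (slope parameter {sigma:.2f})")
```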
This paper presents the results of two psychophysical experiments and an associated computational analysis
designed to quantify the relationship between visual salience and visual importance. In the first experiment,
importance maps were collected by asking human subjects to rate the relative visual importance of each object
within a database of hand-segmented images. In the second experiment, experimental saliency maps were
computed from visual gaze patterns measured for these same images by using an eye-tracker and task-free
viewing. By comparing the importance maps with the saliency maps, we found that the maps are related, but
perhaps less than one might expect. When coupled with the segmentation information, the saliency maps were
shown to be effective at predicting the main subjects. However, the saliency maps were less effective at predicting
the objects of secondary importance and the unimportant objects. We also found that the vast majority of early
gaze position samples (0-2000 ms) were made on the main subjects, suggesting that a possible strategy of early
visual coding might be to quickly locate the main subject(s) in the scene.
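For reference, experimental saliency maps of this kind are commonly obtained by accumulating gaze position samples and smoothing them with a Gaussian kernel, then averaging the result inside each hand-segmented object so it can be compared with that object's rated importance. The sketch below illustrates such a pipeline under those assumptions; the kernel width and function names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_from_gaze(gaze_points, shape, sigma=25.0):
    """Accumulate (x, y) gaze samples into a map and blur with a Gaussian
    (sigma in pixels, roughly matching the foveal extent at the viewing distance)."""
    fixation_map = np.zeros(shape, dtype=float)
    for x, y in gaze_points:
        fixation_map[int(y), int(x)] += 1.0
    smap = gaussian_filter(fixation_map, sigma=sigma)
    return smap / smap.max() if smap.max() > 0 else smap

def mean_saliency_per_object(saliency, segmentation):
    """Average saliency inside each hand-segmented object label, for comparison
    against the subject-rated importance of that object."""
    return {int(label): float(saliency[segmentation == label].mean())
            for label in np.unique(segmentation)}
```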