Recently, there has been tremendous growth in the area of 3D stereoscopic visualization. This technology has been used in a growing number of consumer products, such as 3D televisions and 3D glasses for gaming systems. Stereoscopic visualization builds on the fact that the human brain develops depth perception by combining information from the two eyes: the brain fuses the left and right images on the retinas and extracts depth information from their differences. Therefore, viewing two video images taken a slight distance apart, as shown in Figure 1, can create an illusion of depth [8]. Proponents of this technology argue that the stereo view of 3D visualization increases user immersion and performance because more information is gained through 3D vision compared to a 2D view. However, it is still uncertain whether the additional information gained from 3D stereoscopic visualization actually improves user performance in real-world situations such as teleoperation.
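The depth cue described above can be made concrete with the standard pinhole stereo relation, Z = f * B / d, where disparity between the two retinal (or camera) images determines perceived distance. The following is a minimal sketch of that relation; the function name and the parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the pinhole stereo-depth relation Z = f * B / d.
# All names and values below are illustrative, not from the source.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Estimate scene depth (meters) from binocular disparity.

    For a rectified stereo pair, Z = f * B / d, where f is the focal
    length in pixels, B the distance between the two viewpoints in
    meters, and d the disparity in pixels between the left and right
    images of the same scene point.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: a human-like interocular baseline of ~6.5 cm and a
# 700-pixel focal length; a 10-pixel disparity implies ~4.55 m depth.
print(depth_from_disparity(disparity_px=10.0,
                           focal_length_px=700.0,
                           baseline_m=0.065))
```

Nearby objects produce large disparities and distant objects produce small ones, which is why the two slightly offset video feeds in Figure 1 suffice to convey depth.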
In a geographic information system (GIS), suitability analysis is used to model the spatial distribution of suitability within a region of interest with respect to a planning goal. The analysis combines multiple spatially overlapping geospatial source datasets, each of which encodes a factor that contributes a certain weight to the overall suitability. "Possibility space" refers to an event space that represents all possible outcomes of the suitability analysis. This paper proposes an interactive possibility space for real-time visualization and exploration, with the goal of helping users understand meaningful relationships between variable combinations and suitability outcomes. A case study siting wind farm locations in northwest Iowa demonstrates the practical application and usefulness of the possibility space.
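The combination step described above is commonly implemented as a weighted overlay: each factor raster is normalized and summed under its weight, and each distinct weight assignment yields one point in the possibility space. Below is a minimal sketch of that idea, assuming the factor layers are already co-registered rasters normalized to [0, 1]; the layer names and weights are hypothetical stand-ins for the paper's actual wind-farm criteria.

```python
import numpy as np

# Minimal weighted-overlay sketch. Assumes every factor has been
# rasterized onto the same grid and normalized to [0, 1]. Layer
# names and weights are hypothetical, not from the case study.

def suitability(layers: dict[str, np.ndarray],
                weights: dict[str, float]) -> np.ndarray:
    """Combine spatially overlapping factor rasters into a single
    suitability surface as a normalized weighted sum."""
    total = sum(weights.values())
    return sum(w / total * layers[name] for name, w in weights.items())

# Hypothetical factors for wind-farm siting (random placeholder rasters).
rng = np.random.default_rng(0)
shape = (100, 100)
layers = {
    "wind_speed":     rng.random(shape),
    "grid_proximity": rng.random(shape),
    "slope_flatness": rng.random(shape),
}
weights = {"wind_speed": 0.5, "grid_proximity": 0.3, "slope_flatness": 0.2}

# One weight assignment = one outcome in the possibility space;
# interactively varying the weights explores that space in real time.
surface = suitability(layers, weights)
```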
In this paper we describe a novel approach for comparing users' spatial cognition when using different depictions of 360-degree video on a traditional 2D display. By using virtual cameras within a game engine and texture-mapping these camera feeds onto an arbitrary shape, we were able to offer users a 360-degree interface composed of four 90-degree views, two 180-degree views, or one 360-degree view of the same interactive environment. An example experiment using these interfaces is described. This technique for creating alternative displays of wide-angle video facilitates the exploration of how compressed or fish-eye distortions affect spatial perception of the environment, and can benefit the creation of interfaces for surveillance and remote system teleoperation.
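The partitioning into four 90-degree, two 180-degree, or one 360-degree view can be illustrated with a simplified sketch that slices an equirectangular 360-degree frame into equal yaw segments. This is an assumption-laden approximation: the paper's approach renders true perspective views from virtual cameras in a game engine, whereas slicing a panorama only shows how the same environment divides across the three interface layouts. The frame array and function name are illustrative.

```python
import numpy as np

# Simplified sketch: slice an equirectangular 360-degree frame into
# N equal yaw segments (4 x 90, 2 x 180, or 1 x 360 degrees). The
# paper's virtual cameras render perspective projections instead;
# this placeholder frame and helper are assumptions for illustration.

def split_panorama(frame: np.ndarray, num_views: int) -> list[np.ndarray]:
    """Return num_views sub-images, each spanning 360/num_views degrees
    of yaw, from an equirectangular frame of shape (H, W, 3)."""
    height, width, _ = frame.shape
    seg = width // num_views
    return [frame[:, i * seg:(i + 1) * seg] for i in range(num_views)]

frame = np.zeros((1024, 4096, 3), dtype=np.uint8)  # placeholder 360-degree frame
four_90deg_views = split_panorama(frame, 4)   # four 90-degree views
two_180deg_views = split_panorama(frame, 2)   # two 180-degree views
one_360deg_view  = split_panorama(frame, 1)   # one 360-degree view
```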