Stereophotogrammetry typically employs a pair of cameras, or a single moving camera, to acquire pairs of images from different camera positions in order to create a three-dimensional ‘range map’ of the area being observed. Applications of this technique for building three-dimensional shape models include aerial surveying, remote sensing, machine vision, and robotics. Factors that would be expected to affect the quality of the range maps include the projection function (distortion) of the lenses and the contrast (modulation) and signal-to-noise ratio (SNR) of the acquired image pairs. Basic models of the precision with which the range can be measured assume a pinhole-camera model of the geometry, i.e. that the lenses provide perspective projection with zero distortion. Very-wide-angle or ‘fisheye’ lenses (e.g., those used by robotic vehicles), however, typically exhibit projection functions that differ significantly from this assumption. To predict the stereophotogrammetric range precision for such applications, we extend the model to the case of an equidistant lens projection function suitable for a very-wide-angle lens. To predict the effects of contrast and SNR on range precision, we perform numerical simulations using stereo image pairs acquired by a stereo camera pair on NASA’s Mars rover Curiosity. Contrast is degraded and noise is added to these data in a controlled fashion, and the effects on the quality of the resulting range maps are assessed.
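As a rough illustration of the geometry involved (a sketch, not taken from the paper), the example below computes range from disparity under the ideal pinhole model, with the standard first-order range-error estimate, and contrasts it with the equidistant projection function r = f·θ characteristic of fisheye lenses. The focal length, baseline, disparity, and error values are assumed for illustration only.

```java
/**
 * Illustrative sketch (assumed parameter values, not from the paper):
 * range from stereo disparity under an ideal pinhole model, and the
 * equidistant ("fisheye") projection function r = f * theta used by
 * very-wide-angle lenses.
 */
public class StereoRangeSketch {

    // Pinhole (perspective) model: range Z = f * B / d, with first-order
    // range uncertainty dZ = Z^2 / (f * B) * dd for a disparity error dd.
    static double pinholeRange(double focalPx, double baselineM, double disparityPx) {
        return focalPx * baselineM / disparityPx;
    }

    static double pinholeRangeError(double focalPx, double baselineM,
                                    double rangeM, double disparityErrPx) {
        return (rangeM * rangeM) / (focalPx * baselineM) * disparityErrPx;
    }

    // Equidistant projection: image radius r = f * theta, so the off-axis
    // angle recovered from a detected image point is theta = r / f.
    static double equidistantAngle(double radiusPx, double focalPx) {
        return radiusPx / focalPx;
    }

    public static void main(String[] args) {
        double f = 1200.0;   // focal length in pixels (assumed)
        double B = 0.42;     // stereo baseline in metres (assumed)
        double d = 15.0;     // measured disparity in pixels (assumed)
        double dd = 0.25;    // disparity measurement error in pixels (assumed)

        double z = pinholeRange(f, B, d);
        System.out.printf("Pinhole range: %.2f m%n", z);
        System.out.printf("Range error for %.2f px disparity error: %.3f m%n",
                dd, pinholeRangeError(f, B, z, dd));

        // A fisheye lens maps the same image radius to a different viewing
        // angle than perspective projection (r = f * tan(theta)), which is
        // why the precision model must be re-derived for equidistant optics.
        double r = 900.0;    // image radius from the optical axis in pixels (assumed)
        System.out.printf("Equidistant off-axis angle: %.3f rad%n",
                equidistantAngle(r, f));
    }
}
```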
It is widely accepted that knowledge of the frequencies of the spectral response functions (SRF) of hyperspectral sounder channels at the 10 parts per million (ppm) level is adequate for the retrieval of temperature and moisture profiles and for data assimilation in weather forecasting. However, SI traceability and knowledge at the 1 ppm level or better are required to separate artifacts in the SRF knowledge caused by orbital and seasonal instrument effects from diurnal and seasonal effects due to climate change. We use examples from AIRS to discuss a spectral calibration that uses SI-traceable upwelling radiance spectra to achieve an absolute accuracy of 0.5 ppm.
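To make the ppm figures concrete, the short sketch below converts a fractional frequency-knowledge requirement into an absolute wavenumber shift. The 2388 cm⁻¹ channel centre is an assumed, representative shortwave value for an AIRS-like sounder, not a number taken from the paper.

```java
/**
 * Illustrative sketch (assumed example values): what a parts-per-million
 * knowledge requirement on channel frequency means in absolute wavenumber
 * terms for a representative shortwave AIRS-like channel.
 */
public class SrfPpmSketch {

    // Absolute shift corresponding to a fractional (ppm) requirement.
    static double shiftWavenumber(double centreWavenumber, double ppm) {
        return centreWavenumber * ppm * 1e-6;
    }

    public static void main(String[] args) {
        double centre = 2388.0; // channel centre in cm^-1 (assumed example)

        for (double ppm : new double[] {10.0, 1.0, 0.5}) {
            System.out.printf("%.1f ppm at %.0f cm^-1  ->  %.5f cm^-1%n",
                    ppm, centre, shiftWavenumber(centre, ppm));
        }
        // 10 ppm  -> 0.02388 cm^-1 (the level cited as adequate for retrieval/assimilation)
        // 0.5 ppm -> 0.00119 cm^-1 (the accuracy level discussed for separating climate trends)
    }
}
```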
This paper presents a graphical software infrastructure for stereo display. It enables the development of low-cost, short-development-cycle stereo applications that are portable not only across platforms but also across display types. Moreover, it allows not just images but entire GUIs (Graphical User Interfaces) to be displayed in stereo consistently across many platforms.
Java Advanced Display Infrastructure for Stereo (JADIS) provides a common interface for displaying GUI components in stereo using either specialized stereo display hardware (e.g. liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard computer displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience (anaglyphs) without sacrificing high-quality display on dedicated hardware.
JADIS has been released as Open Source and is available via the Open Channel Foundation website [1]. It has been integrated into several applications for stereo viewing and processing of data acquired by current and future NASA Mars surface missions (e.g. Mars Exploration Rover (MER), Phoenix Lander, Mars Science Laboratory (MSL)).
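The JADIS API itself is not reproduced here. As a hypothetical illustration of the anaglyph fallback path described above, the sketch below composes a red/cyan anaglyph from a left/right image pair using only standard Java AWT, which is the kind of rendering a stereo toolkit can fall back to on ordinary displays when no dedicated stereo hardware is present.

```java
import java.awt.image.BufferedImage;

/**
 * Hypothetical illustration (not the JADIS API): build a red/cyan anaglyph
 * from a left/right stereo pair using standard Java AWT. The red channel is
 * taken from the left image and the green and blue channels from the right
 * image, so red/cyan glasses separate the two views.
 */
public class AnaglyphSketch {

    public static BufferedImage compose(BufferedImage left, BufferedImage right) {
        int w = Math.min(left.getWidth(), right.getWidth());
        int h = Math.min(left.getHeight(), right.getHeight());
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);

        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int l = left.getRGB(x, y);
                int r = right.getRGB(x, y);
                int red = (l >> 16) & 0xFF;   // left image -> red channel
                int green = (r >> 8) & 0xFF;  // right image -> green channel
                int blue = r & 0xFF;          // right image -> blue channel
                out.setRGB(x, y, (red << 16) | (green << 8) | blue);
            }
        }
        return out;
    }
}
```

An application could display the composed image in an ordinary Swing component, while a toolkit such as JADIS routes the same left/right pair to dedicated stereo hardware when it is available.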