Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation, and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) instrument is a hyperspectral sounder slated to undergo thermal vacuum testing within a year. The University of Wisconsin-Madison is developing a software suite to meet the requirement of testing the conversion of raw interferogram images into calibrated high-resolution spectra. The software consists of algorithm components that assemble into a processing pipeline, together with a testing harness driven by a lightweight scripting language. The processing requirements for an imaging FTS are considerable and necessitate an understanding of the maximum achievable accuracy, as well as exploration of tradeoffs in the interest of processing efficiency. We present an overview of the design of this testing software.
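To make the shape of such a pipeline stage concrete, the following Python sketch shows one minimal, hypothetical interferogram-to-radiance step: a Fourier transform of a two-sided interferogram followed by a standard two-point complex calibration against cold and hot blackbody reference views. The function names, the NumPy-based implementation, and the omission of instrument-specific corrections (detector nonlinearity, off-axis self-apodization, spectral resampling) are illustrative assumptions; this is not the GIPS interface.

import numpy as np

def planck_radiance(wavenumber, temp_k):
    # Planck blackbody radiance in mW / (m^2 sr cm^-1) for wavenumbers in cm^-1.
    c1 = 1.191042e-5   # first radiation constant, mW / (m^2 sr cm^-4)
    c2 = 1.4387752     # second radiation constant, cm K
    v = np.asarray(wavenumber, dtype=float)
    return c1 * v**3 / np.expm1(c2 * v / temp_k)

def interferogram_to_spectrum(ifg, window=None):
    # Transform a two-sided interferogram into a complex raw spectrum.
    ifg = np.asarray(ifg, dtype=float)
    if window is not None:
        ifg = ifg * window
    return np.fft.rfft(ifg)

def calibrate(scene_spec, cold_spec, hot_spec, wavenumber, t_cold, t_hot):
    # Two-point complex calibration: ratio the background-subtracted scene spectrum
    # against the blackbody difference spectrum, take the real part, and scale
    # between the Planck radiances of the two reference temperatures.
    b_cold = planck_radiance(wavenumber, t_cold)
    b_hot = planck_radiance(wavenumber, t_hot)
    ratio = (scene_spec - cold_spec) / (hot_spec - cold_spec)
    return np.real(ratio) * (b_hot - b_cold) + b_cold

A production pipeline would apply this per detector pixel across full interferogram cubes and carry the result through the additional corrections noted above; the sketch only fixes the order of operations for a single spectrum.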
KEYWORDS: Databases, Calibration, Data processing, Matrices, Telecommunications, Control systems, Aerospace engineering, Satellites, Data modeling, Data archive systems
Future meteorological sounding instrumentation for aircraft and satellite platforms will include hyperspectral imaging infrared spectrometers with high time and space resolution, capable of producing terabytes of raw data per day. In tandem with the development of the instruments themselves, corresponding software must be architected for timely, efficient, and accurate processing of the raw data produced. Design candidates for such a software architecture must respond to use cases including deployment in large-scale distributed production environments with stringent reliability specifications; phasing of research algorithms through testing and validation into production use; marshalling of data product views to metadata-aware analysis and archival systems; maintenance of software supporting multiple similar instrument systems over the course of decades; and, most importantly, delivery of fully annotated datasets to end users with real-time latencies. Consistent techniques for specifying and propagating metadata, for both algorithm software and data content, are of paramount concern when manipulating large quantities of data over long stretches of time. Further, long-term maintainability and cost-effectiveness of the system can be assured by improving the reusability of both systems software and science software, through well-specified interfaces for software components and automated mechanisms for integration and testing. We illustrate current design work, avenues of research, and lessons learned on a software component architecture and corresponding development practices addressing these concerns.
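As one possible reading of the component-interface and metadata concerns above, the Python sketch below pairs each data granule with a metadata dictionary and stamps provenance (component name, version, processing time) as the granule passes through a uniform component interface. The class, field, and function names here are hypothetical illustrations, not the architecture's actual interfaces.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Protocol
import numpy as np

@dataclass
class Granule:
    # A unit of data plus the metadata that must travel with it
    # (instrument, time range, units, provenance, ...).
    data: np.ndarray
    metadata: dict = field(default_factory=dict)

class Component(Protocol):
    # Minimal contract every pipeline component is assumed to satisfy.
    name: str
    version: str
    def process(self, granule: Granule) -> Granule: ...

def run_pipeline(components, granule: Granule) -> Granule:
    # Apply components in order, appending a provenance record after each step
    # so downstream archive and analysis tools can see which algorithm versions
    # touched the data.
    for comp in components:
        granule = comp.process(granule)
        granule.metadata.setdefault("provenance", []).append({
            "component": comp.name,
            "version": comp.version,
            "processed_at": datetime.now(timezone.utc).isoformat(),
        })
    return granule

Keeping provenance inside the granule itself is one way to let metadata-aware archival and analysis systems recover, years later, exactly how a given product was produced.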