Wright Laboratory's Dynamic Infrared Missile Evaluator (DIME) Facility has been performing hardware-in-the-loop (HWIL) infrared seeker exploitation and countermeasures development for over thirty years. A unique feature of this facility is the close integration of hardware exploitation, computer simulation, and HWIL testing, which facilitates feedback between predicted and actual seeker performance. This paper provides an introduction to the DIME Facility's HWIL testing approach and capabilities.
Flight testing of ballistic missile interceptors is limited by the complexity and costs of the test exercises and by inherent test limitations. While flight testing remains an essential part of ballistic missile weapon system test programs, alternate test methods, such as hardware-in-the-loop (HWIL) testing, are used to fully characterize system operational characteristics. An integrated test approach that spans all levels of testing has been developed by the Ballistic Missile Defense Organization (BMDO) to characterize its developing and fielded weapon systems. The BMDO has developed an integrated, three-tiered HWIL testing concept to assess the performance of its acquisition programs. The three tiers are the element level, which includes sub-element level testing of critical components; the weapon system level; and the family of systems (FoS) level. The objective of element level testing is to evaluate the performance of each element of the weapon system, with emphasis on exercising the processing components of the element under test. Element level testing is focused on testing the internal interactions of each test element in a simulated environment. The objective of weapon system level testing is to evaluate the data exchange between the elements of the weapon system in an integrated, simulated operational environment. The primary purpose of FoS level testing is to assess the interoperability of ballistic missile defense (BMD) weapon systems, both under development and fielded. This paper examines the test processes and objectives at each of the three HWIL tiers, describes the facilities and testbeds BMDO is using to implement this approach, and discusses how data from each of the tiers are used to help address program flight test issues and reduce program risk.
An optical signal injector (OSI) system has been developed for use in the hardware-in-the-loop (HWIL) testing of laser radar (LADAR) seekers. The OSI, in conjunction with a scene generator, generates optical signals simulating the return signals of a LADAR seeker and delivers them to a Unit Under Test. The signals produced by the OSI represent range and intensity (reflectivity) data of a target scene from a given HWIL scenario. The OSI has a modular architecture to allow for easy modification (e.g., operating wavelength, number of optical channels) and is primarily composed of commercial off-the-shelf components to improve reliability and reduce cost. Presented here is a description of the OSI and its capabilities.
Kinetic Energy Weapon (KEW) programs under the Ballistic Missile Defense Office (BMDO) need high fidelity, fast framing infrared (IR) imaging seekers. As imaging sensors have matured to support BMDO, the complexity of functions assigned to the KEW weapon systems has amplified the necessity for robust hardware-in-the-loop (HWIL) simulation facilities to reduce program risk. Tactical weapon systems are also turning to imaging focal plane array seekers; they, too, require more sophisticated HWIL testing. The IR projector, an integral component of a HWIL simulation, must reproduce the real world with enough fidelity that the unit-under-test's software responds to the projected scenario of images as though it were viewing the real world. The MOSFET resistor array IR scene projector shows great promise in cryogenic vacuum chamber as well as room temperature testing. Under the Wideband Infrared Scene Projector (WISP) program, a second generation resistor array has been delivered and characterized. Characterization measurements, including spectral output, dynamic range capability, apparent temperature, rise time, and fall time, have been accomplished on the second generation array at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator facility and the Guided Weapons Evaluation Facility, Eglin AFB, FL. Dynamic range output exceeds the WISP specification. Other parameters, such as rise time, either meet or are close to meeting system specifications. The final design of the WISP arrays is currently in progress based on these results. Also reported in this document are performance measurements of the analog drive electronics' noise level, accuracy, and resolution. The performance of the drive electronics had to be established before any radiometric output could be measured.
Barry E. Cole, Robert E. Higashi, Jeff A. Ridley, J. Holmen, Earl Thomas Benser, Robert G. Stockbridge, Robert Lee Murrer Jr., Lawrence E. Jones, Eddie Burroughs Jr.
Proceedings Volume Technologies for Synthetic Environments: Hardware-in-the-Loop Testing II, (1997) https://doi.org/10.1117/12.280971
An addressable mosaic array of resistively heated microbridges offers the potential to project accurate dynamic infrared (IR) imagery. This imagery is used primarily in the evaluation of IR instruments ranging from seekers to FLIRs. With the growing development of lower cost uncooled IR imagers, scene projectors also offer the potential for dynamic testing of these new instruments. In past years we have described developments in a variety of IR projector systems designed for different purposes. In this paper we describe recent developments in these technologies aimed at improving or understanding temporal and radiative performance.
This paper describes the cumulative status to date and current developments, as of early 1997, in British Aerospace IR scene projector system technology. The systems have been developed for hardware-in-the-loop simulation in missile test and evaluation facilities. Historically, the technology has been called `Thermal Picture Synthesis', an early equivalent of what is now known as Infra-red Scene Projection. Earlier generations of the system were based on a monolithic resistor-substrate construction, a modification of which is still used for ground target simulations (TPS3), whereas the more recent systems for air target simulations are based on fully suspended resistor designs (TPS4). These projector systems, incorporating full scale arrays, have been fabricated at up to 256 X 256 complexity. Research work is being carried out on high temperature arrays for air-to-air countermeasure simulations, and the first TPS5 full system at 512 X 512 complexity has completed its design stage and has recently moved into fabrication. Research testbed arrays of 768 X 768 have just been made, and 1024 X 1024 arrays are presently being fabricated. The paper opens with an introduction to the basics of the technology, followed by a section on certain specialized features developed to combat inherent issues in the technology. Specifications and the current status of each category of device are then given.
A dynamic infrared (IR) scene projector which is based upon diode lasers is now operational at the US Army Missile Command's Research, Development, and Engineering Center. The projector is referred to as the Laser Diode Array Projector. It utilizes a 64-element linear array of Pb-salt diode lasers coupled with a high-speed optical scanning system, drive electronics and synchronization electronics to generate in-band IR scenes. The projector is interfaced to a real-time scene generation computer which is capable of 3D scene generation. This paper describes the process for calibration of the projector and the correction of spatial non-uniformities which are inherent in the projector design. Each laser within the system must be calibrated so that its output power is linear with respect to input gray level. The calibration table for each laser is stored in the projector electronics memory and is applied in real-time. In addition, spatial variations in perceived pixel intensity must be corrected such that the output scene is uniform. Gain and offset correction factors for each pixel are used to correct the spatial non-uniformities. The gain and offset terms are applied to each pixel in real-time by the projector drive electronics. The projector's overall performance characteristics, including the non-uniformity correction (NUC) performance level achieved-to-date, are presented in the paper. Issues associated with NUC limitations are also discussed. Sample images generated with the projector and captured by an InSb FPA sensor are included in the text.
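A minimal sketch of this two-stage correction, a per-laser linearization table followed by per-pixel gain and offset terms, is shown below; the array sizes, table contents, and row-to-laser mapping are illustrative assumptions, not the projector's actual parameters.

import numpy as np

# Illustrative sizes only: 64 lasers in the linear array, 256 gray levels,
# and a 256 x 256 projected frame; the real projector parameters may differ.
N_LASERS, N_LEVELS, H, W = 64, 256, 256, 256

# Per-laser calibration table: maps a commanded gray level to the drive value
# that makes output power linear in gray level (assumed precomputed; placeholder data here).
laser_lut = np.random.rand(N_LASERS, N_LEVELS)

# Per-pixel gain/offset terms from the spatial non-uniformity calibration.
gain = np.ones((H, W), dtype=np.float32)
offset = np.zeros((H, W), dtype=np.float32)

def correct_frame(gray_frame, laser_of_row):
    """Apply the per-laser linearization, then the per-pixel gain/offset NUC."""
    drive = laser_lut[laser_of_row[:, None], gray_frame.astype(np.intp)]
    return gain * drive + offset

# Example: assume each scanned row is painted by laser (row index mod 64).
frame = np.random.randint(0, N_LEVELS, size=(H, W))
corrected = correct_frame(frame, np.arange(H) % N_LASERS)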
During the past several years, the technology for designing and fabricating thermal pixel arrays (TPAs) using silicon micromachined CMOS devices has been developed adequately to support the development of a real-time infrared test set (RTIR) for sensors and seekers. The TPA is a custom application-specific integrated circuit device that is fabricated using a low-cost commercial CMOS process. The system architecture and development of the initial RTIR Test Set is described. The RTIR is a compact, self-contained test instrument that is intended for test and evaluation of infrared systems in the field. In addition to the TPA, the RTIR contains projection optics and electronics which drive the TPA, provide TPA nonuniformity compensation, data translation, data transformation, and user interface. The RTIR can display internal test patterns (static and dynamic), external digital video data, and NTSC video. The initial RTIR unit incorporates a 64 X 64 TPA to provide flickerless infrared scenes at 30 frames per second. Additional TPAs are under development with formats of 128 X 128, 256 X 256, and 512 X 512 pixels.
The introduction of spatial, spectral and temporal discrimination in IR seekers has greatly increased the level of sophistication required of hardware-in-the-loop test facilities. In the long term, IR scene projectors offer promise of meeting many of these requirements. However, Wright Laboratory's Dynamic Infrared Missile Evaluator (DIME) Facility has ongoing requirements to evaluate state-of-the-art IR seekers. This paper describes reliable and effective techniques integrated into the DIME to meet current requirements for testing seekers with spatial, spectral and temporal discrimination capabilities.
The TSG is used for evaluating infrared missile seekers with dynamic targets in a realistic EOCM environment. The system generates multi-mode primary and secondary targets, up to three flares and jammers, combined with a thermal background image covering a 10 degree field of view. Each component is independently controlled to provide 2D trajectory, velocity and acceleration. Four orders of magnitude in LOS angular velocity can be accomplished. The system allows for variation of source angular size, radiated intensity and other spatial and temporal modulation. All sources are combined in a collimated output beam. The beam is further projected through an optical relay and a Field Of Regard assembly. This mechanism displays the whole scenario over a wide angular span onto the seeker aperture. Further system improvements involve combining a dynamic infrared scene projector with high temperature sources under real-time high dynamics, for better performance with imaging seekers on maneuvering platforms.
Resistive array technology is finding increasing application in representing synthetic infrared targets and backgrounds. Pixel-to-pixel radiance nonuniformity is the prominent noise source from resistive arrays and must be compensated or otherwise mitigated for high-fidelity testing of infrared imaging sensors. Any imaging method for measuring and correcting nonuniformity noise is subject to theoretical performance limitations due to sensor measurement noise, geometrical resolution, background offset, and optical resolution. We derive general performance bounds as functions of sensor parameters, which are equally applicable to staring and scanning nonuniformity correction (NUC) sensors. A thorough understanding of the theoretical limitations of the process allows intelligent specification of the NUC sensor, procedures, algorithms, and processing power required for any scene projection application. We describe the NUC approach developed for the US Army's Dynamic Infrared Scene Projector (DIRSP). We also exhibit the features of our software package, which calculates emitter calibration curves from automatically collected laboratory data. We show how the code deals with practical considerations such as detector fill factor, incorrect magnification, rotation between the emitter array and the NUC sensor, optics anisoplanatism, dead pixels, data overload, and automatic detection of emitter signals. The paper concludes with a glimpse of the procedures, algorithms, and sensor to be used in the DIRSP.
In a series of measurements made to characterize the performance of a Wideband Infrared Scene Projector (WISP) system, timing artifacts were observed in one set of tests in which the projector update was synchronized with the camera readout. The projector was driven with images that varied from frame to frame, and the measured images were examined to determine if they varied from frame to frame in a corresponding manner. It was found that regardless of the relative time delay between the projector update and sensor readout, each output image was a result of two input images. By analyzing the timing characteristics of the camera integration scheme and the WISP update scheme it was possible to understand effects in the measured images and simulate images with the same effects. This paper describes the measurements and the analyses. Although the effects were due to the unique camera integration and readout scheme, the effects could show up when testing other sensors. Thus also presented in this paper are techniques for testing with resistive array projectors, so that the timing artifacts observed with various kinds of cameras are minimized or eliminated.
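The observed effect, in which each measured frame combines two consecutive projector frames, can be illustrated with a simple mixing model; the weighting used here is a hypothetical function of the relative delay, not the measured camera behavior.

import numpy as np

def measured_frame(frame_n, frame_n_plus_1, overlap_fraction):
    """Toy model: the sensor integrates across a projector update, so one
    readout sees a weighted sum of two consecutive input images.
    overlap_fraction is the part of the integration window falling after
    the update (0 to 1); this weighting is purely illustrative."""
    return (1.0 - overlap_fraction) * frame_n + overlap_fraction * frame_n_plus_1

# Two frames that differ in one region, as in a frame-varying test pattern.
a = np.zeros((128, 128)); a[32:64, 32:64] = 1.0
b = np.zeros((128, 128)); b[64:96, 64:96] = 1.0
blended = measured_frame(a, b, overlap_fraction=0.4)  # neither pure a nor pure b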
Ever increasing developments in imaging infrared (IR) seekers that are being designed for Ballistic Missile Defense Office guided interceptor programs have amplified the necessity for robust hardware-in-the-loop (HWIL) testing to reduce program risk. Successful IR HWIL testing requires a high fidelity spatial, spectral, and temporal IR projector. Recent characterization measurements of a 512 X 512 metal-oxide semiconductor field-effect transistor (MOSFET) resistor array show that resistor array technology is a leading contender for the IR projector. As with any array device, nonuniform performance between individual elements of the array is a concern. This paper addresses a simplified approach to accomplishing the nonuniformity correction of a resistor array in real-time. The first step in this process is to obtain a nominal output curve typical of the resistors' MOSFET output. The key feature of this simplified process is that all output curves specific to individual resistors can be related to this typical curve with a simple gain and offset correction. In practice, the inverse of the typical output curve is stored in a look-up table in order to obtain the required command for a desired output and then a correcting gain and offset are applied. Results from this process show great promise.
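A minimal sketch of the simplified correction described above follows: store the inverse of a typical resistor response as a look-up table, then apply per-resistor gain and offset to the looked-up command. The response curve, table size, and gain/offset values below are placeholders, not measured WISP data.

import numpy as np

# Hypothetical nominal (typical) resistor response: command value -> radiance.
commands = np.linspace(0.0, 1.0, 1024)
typical_output = commands ** 2.2            # placeholder stand-in for a MOSFET-like curve

# Inverse of the typical curve, stored as a look-up table:
# desired radiance -> command the *typical* resistor would need.
radiance_axis = np.linspace(typical_output[0], typical_output[-1], 1024)
inverse_lut = np.interp(radiance_axis, typical_output, commands)

# Per-resistor gain/offset relating each element to the typical curve
# (illustrative values; in practice measured during array characterization).
gain = np.full((512, 512), 1.0, dtype=np.float32)
offset = np.zeros((512, 512), dtype=np.float32)

def command_for(desired_radiance):
    """Real-time NUC: look up the typical-curve command, then apply the
    per-resistor gain and offset correction."""
    span = radiance_axis[-1] - radiance_axis[0]
    idx = np.clip(((desired_radiance - radiance_axis[0]) / span * 1023).astype(np.intp), 0, 1023)
    return gain * inverse_lut[idx] + offset

drive_image = command_for(np.random.rand(512, 512).astype(np.float32))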
Kinetic Energy Weapon (KEW) programs under the Ballistic Missile Defense Office (BMDO) need high fidelity infrared (IR) seekers. As imaging sensors have matured to support BMDO, the complexity of functions assigned to the KEW weapon systems has magnified the necessity for robust hardware-in-the-loop (HWIL) simulation facilities to reduce program risk. The IR projector, an integral component of a HWIL simulation, must reproduce the real world with enough fidelity that the unit-under-test algorithms respond to the projected images as though it were viewing the real world. For test scenarios involving unresolved objects, IR projector arrays have limitations which constrain testing accuracy. These arrays have limited dynamic range, spatial resolution, and spatial bandwidth for unresolved targets, decoys, and debris. The Steerable Laser Projector (SLP) will allow the HWIL simulation facility to address these testing issues. The Kinetic Kill Vehicle Hardware-in-the-loop Simulation (KHILS) facility located at Eglin AFB, FL is now in the process of integrating a projector array with the SLP. This new projector combines the capabilities of both projector technologies to provide KHILS with a unique asset that addresses many of the challenges involved in testing state-of-the-art IR guided weapons.
This is the second in a series of papers describing an ongoing investigation into the detailed performance of our resistor array infra-red scene projector devices and systems. The purpose is to extract understanding and information which will enable validation of simulations involving the systems and allow design compromises to be resolved. Following last year's conclusions, the importance of Non Uniformity Correction is reinforced, and the concept of Local Step Error and its importance is developed and investigated practically. A test methodology is developed, and the first steps in practical measurements are reported.
H. Ronald Marlin, Richard L. Bates, Miles H. Sweet, Rowena M. Carlson, R. Barry Johnson, Diehl H. Martin, Ronald Chung, Jon C. Geist, Michael Gaitan, et al.
Proceedings Volume Technologies for Synthetic Environments: Hardware-in-the-Loop Testing II, (1997) https://doi.org/10.1117/12.280947
During the past several years, the technology for designing and fabricating thermal pixel arrays (TPAs) using silicon micromachined CMOS devices has been developed to support the development of a real-time infrared test set (RTIR) for sensors and seekers. The TPA is a custom application-specific integrated circuit device that is fabricated using a low-cost commercial CMOS process. The RTIR is a compact, self-contained test instrument that is intended for test and evaluation of infrared systems in the field. This paper describes characterization of TPA pixels, including directional radiant intensity distribution, spatial radiance distribution, temperature distribution, cross talk, spectral radiance, air pressure effects, and rise and fall times.
Target and Background Representation for Synthetic Test Environments--Joint Session with Conf. 7502
The generation of high-fidelity imagery of infrared radiation from aircraft targets is a computationally intensive task. These calculations must include details associated with the heating of the airframe, generation of the exhaust flowfield, and transport of the emitted, reflected, and absorbed radiation through the atmosphere. Additionally, spatial and temporal features such as complex airframe geometries, hot body parts, engine exhaust states, and atmospheric path must be consistently resolved regardless of aircraft and sensor orientation to eliminate nonphysical artifacts. This paper presents computational techniques to compute aircraft infrared radiation imagery for high frame rate applications at the Kinetic Kill Vehicle Hardware-in-the-loop Simulator facility located at Eglin AFB. Details concerning the underlying phenomenologies are also presented to provide an understanding of the computational rationale. Finally, several example calculations are presented to illustrate the level of fidelity that can be achieved using these methods.
Techniques and tools for validation of real-time infrared target signature models are presented. The model validation techniques presented in this paper were developed for hardware-in-the-loop (HWIL) simulations at the U.S. Army Missile Command's Research, Development, and Engineering Center. Real-time target model validation is a required deliverable to the customer of a HWIL simulation facility and is a critical part of ensuring the fidelity of a HWIL simulation. There are two levels of real-time target model validation. The first level is comparison of the target model to some baseline or measured data, which answers the question `are the simulation inputs correct?'. The second level of validation is a simulation validation, which answers the question `for a given target model input is the simulation hardware and software generating the correct output?'. This paper deals primarily with the first level of target model validation. IR target signature models have often been validated by subjective visual inspection or by objective, but limited, statistical comparisons. Subjective methods can be very satisfying to the simulation developer but offer little comfort to the simulation customer since subjective methods cannot be documented. Generic statistical methods offer a level of documentation, yet are often not robust enough to fully test the fidelity of an IR signature. Advances in infrared seeker and sensor technology have led to the necessity of system-specific target model validation. For any HWIL simulation it must be demonstrated that the sensor responds to the real-time signature model in a manner which is functionally equivalent to the sensor's response to a baseline model. Depending on the application, the baseline can be measured IR imagery or the output of a validated IR signature prediction code. Tools are described that generate validation data for HWIL simulations at MICOM, and example real-time model validations are presented.
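As an illustration only, a first-level check of the kind described might compare simple per-frame sensor statistics for the real-time model against the baseline imagery; the features and the 5% tolerance below are assumptions, not MICOM's acceptance criteria.

import numpy as np

def frame_statistics(image):
    """Illustrative per-frame features: total signal, centroid, and spread."""
    total = image.sum() + 1e-12
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    cy, cx = (ys * image).sum() / total, (xs * image).sum() / total
    spread = (((ys - cy) ** 2 + (xs - cx) ** 2) * image).sum() / total
    return np.array([total, cy, cx, spread])

def functionally_equivalent(realtime_frames, baseline_frames, tol=0.05):
    """Frame-by-frame comparison of sensor responses; the tolerance is a placeholder."""
    for rt, base in zip(realtime_frames, baseline_frames):
        a, b = frame_statistics(rt), frame_statistics(base)
        if np.any(np.abs(a - b) > tol * np.maximum(np.abs(b), 1e-9)):
            return False
    return True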
Technology efforts are underway at Arnold Engineering Development Center to extend closed-loop Direct Write Scene Generation capabilities to include advanced signal-injection and thermal-array optical projection capabilities. Laser-projection, signal-injection, and thermal-array optical projection schemes, for sensors with or without optics installed, provide direct simulation of dynamic electro-optic sensor systems. FPAs and electro-optic sensors are stimulated with simulated infrared scenes for optical diagnostics and evaluation of focal plane arrays or electro-optic sensor systems, and to simulate complex mission scenarios. Closed-loop operation can provide high-fidelity simulation of complex infrared scenes, sensor optical blurring, and other temporal effects such as jitter. Although all optical stimulation and testing methods have inherent advantages, compromises, and limitations, there is a common desire not only to maximize optical simulation and photonic stimulation fidelity through advanced verification and validation efforts, but also to minimize computational requirements for high-performance diagnostics. Computational and source-to-FPA oversampling have very similar fidelity defects and requirements for signal-injection, laser-projection, and thermal-array optical projection diagnostic methods. This paper briefly describes scene generation and projection technology and corresponding research devoted to sampling issues and criteria related to FPA oversampling, corresponding fidelity defects, and performance trades.
This paper presents an analysis of spatial blurring and sampling effects for a sensor viewing a pixelized scene projector. It addresses the ability of a projector to simulate an arbitrary continuous radiance scene using a field of discrete elements. The spatial fidelity of the projector as seen by an imaging sensor is shown to depend critically on the width of the sensor MTF or spatial response function and on the angular spacing between projector pixels. Quantitative results are presented based on a simulation that compares the output of a sensor viewing a reference scene to the output of the sensor viewing a projector display of the reference scene. Dependence on the blur of the sensor and projector, the scene content, and the alignment of both scene features and sensor samples with the projector pixel locations is addressed. We attempt to determine the projector characteristics required to perform hardware-in-the-loop testing with adequate spatial realism to evaluate seeker functions such as autonomous detection, measurement of radiant intensities and angular positions of unresolved objects, and autonomous recognition and aimpoint selection for resolved objects.
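A one-dimensional sketch of such a comparison is given below, assuming a Gaussian stand-in for the sensor spatial response and arbitrary pitch values; it is meant only to show the structure of the simulation, not the parameters studied in the paper.

import numpy as np

def sensor_blur(signal, sigma):
    """Blur with a Gaussian kernel standing in for the sensor spatial response."""
    x = np.arange(-4 * sigma, 4 * sigma + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return np.convolve(signal, k / k.sum(), mode="same")

n = 4096                                      # fine sample grid for the reference scene
scene = np.zeros(n); scene[2000:2008] = 1.0   # a narrow, nearly unresolved feature

# Pixelized projector: average over blocks of 'pitch' fine samples and hold constant.
pitch = 32
display = np.repeat(scene.reshape(-1, pitch).mean(axis=1), pitch)

# Sensor view of each: blur by the sensor response, then sample at the detector pitch.
sigma, det_pitch = 48, 64                     # illustrative numbers only
ref_out = sensor_blur(scene, sigma)[::det_pitch]
proj_out = sensor_blur(display, sigma)[::det_pitch]

fidelity_error = np.max(np.abs(ref_out - proj_out))  # one simple figure of merit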
Image filtering in sampled dynamic infrared scene projection systems is examined from the point of view of providing improved insight into the choice of the pixel mapping ratio between the projector and the imaging unit-under-test. The 2D vector analysis underlying the transfer of image information in such systems is reviewed and is applied to the dynamic infrared scene projection case. It is shown that the 4:1 (2 X 2:1) pixel mapping ratio previously recommended is a desirable criterion from the spatial fidelity viewpoint, particularly when high spatial frequency information represented by point sources and scene edges is being projected. Cost constraints can, however, prevent the 4:1 mapping ratio from being met, in which case the effects on hardware-in-the-loop simulation validity need to be examined carefully. The vector analysis presented here provides a tool useful for the future examination of such cases.
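In basic sampling terms (assumed notation, not the paper's 2D vector formalism), the 2 X 2:1 mapping gives the projector a folding frequency twice that of the unit under test, so the projector lattice itself does not become the limiting factor for point sources and edges:

p_{\mathrm{proj}} = \tfrac{1}{2}\,p_{\mathrm{uut}}
\quad\Longrightarrow\quad
f_{N,\mathrm{proj}} = \frac{1}{2\,p_{\mathrm{proj}}} = \frac{1}{p_{\mathrm{uut}}} = 2\,f_{N,\mathrm{uut}},

where p denotes pixel pitch referred to a common image plane and f_N the corresponding Nyquist (folding) frequency.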
Real-time infrared (IR) scene generation for HardWare-in-the-Loop (HWIL) testing of IR seeker systems is a complex problem due to the required frame rates and image fidelity. High frame rates are required for current generation seeker systems to perform designation, discrimination, identification, tracking, and aimpoint selection tasks. Computational requirements for IR signature phenomenology and sensor effects have been difficult to meet in real time to support HWIL testing. Commercial scene generation hardware is rapidly improving and is becoming a viable solution for HWIL testing activities being conducted at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator facility at Eglin AFB, Florida. This paper presents computational techniques developed to overcome IR scene rendering errors incurred with commercially available hardware and software for real-time scene generation in support of HWIL testing. These techniques provide an acceptable solution to real-time IR scene generation that strikes a balance between physical accuracy and image framing rates. The results of these techniques are investigated as they pertain to rendering accuracy and speed for target objects which begin as point sources during acquisition and develop into extended source representations during aimpoint selection.
An approach to utilize the symmetric multiprocessing environment of the Silicon Graphics (SGI) Onyx and Onyx2 has been developed to support the generation of infrared (IR) scenes in real-time. This development, supported by the Navy, is driven by a desire to maximize use of commercial-off-the-shelf (COTS) hardware and minimize cost and development time. In the past, real-time IR scene generators have been developed as custom architectures that are expensive and difficult to maintain. An SGI based Real-time Infrared Scene Simulator (RISS) system was developed to utilize the SGI's fast symmetric multiprocessing hardware to perform real-time IR scene radiance calculations for scene objects. During real-time scene simulation, the SGI symmetric multiprocessors are used to update polygon vertex locations and compute radiometrically accurate floating point radiance values. The output of this process can be utilized to drive a variety of scene rendering engines including the SGI Reality Engine2, the SGI Infinite Reality, and Amherst Systems' custom Scene Rendering System. This paper will discuss the critical technologies that apply to infrared scene generation and hardware-in-the-loop testing using COTS hardware. Specifically, the application of RISS high-fidelity real-time radiance algorithms on the SGI's symmetric multiprocessing hardware will be discussed. Also, issues relating to the integration of the real-time algorithms with various rendering engines and external real-time control will be addressed.
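A generic sketch of this partitioning, not the RISS implementation, is shown below: per-object vertex transforms and radiance values are computed in parallel worker processes and then handed to a rendering engine each frame. The radiance model and object data are placeholders.

import numpy as np
from multiprocessing import Pool

def update_object(args):
    """Work done by one worker for one scene object: transform the vertices
    for the new viewpoint and compute a floating point radiance per vertex.
    The grey-body expression is a crude stand-in, not the RISS algorithms."""
    vertices, temps, emissivity, view = args
    transformed = vertices @ view[:3, :3].T + view[:3, 3]
    radiance = emissivity * 5.67e-8 * temps ** 4
    return transformed, radiance.astype(np.float32)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    view = np.eye(4)                       # placeholder viewpoint for this frame
    objects = [(rng.random((200, 3)), 280.0 + 40.0 * rng.random(200), 0.9, view)
               for _ in range(16)]
    with Pool(processes=4) as pool:        # stands in for the SMP processors
        per_object = pool.map(update_object, objects)
    # 'per_object' would then be passed to the rendering engine for this frame.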
Infrared Scene Generation Technologies: Phenomenology
Virtual reality is a new way to enhance human interaction with a simulator or other man-machine system. The synthetic environment of virtual reality provides new possibilities for human activity through the use of various sensory channels. Stereo visualization provides human immersion in virtual space and is an important feature of virtual reality. Investigations of technical and programming tools for stereo visualization of highly accurate and detailed 3D models of objects and of terrain with geo-specifically placed objects such as buildings, roads, forests and other special landmarks are discussed. The hardware includes liquid crystal shutter glasses and an Intel Pentium computer with a standard monitor. Use of original photogrammetric and rendering software under MS Windows provides very realistic `walk-through' and `fly-over' simulations. These tools are cheaper than ones oriented to powerful workstations. Examples of animations and virtual spaces, with 3D site models of real scenes designed for airplane pilot training, are demonstrated.
The Quick Image Display (QUID) model accurately computes and displays radiance images at animation rates while the target undergoes unrestricted flight or motion. Animation rates are obtained without sacrificing radiometric accuracy by using three important innovations. First, QUID has been implemented using the OpenGL graphics library which utilizes available graphics hardware to perform the computationally intensive hidden surface calculations. Second, the thermal emission and reflectance calculations are factorized into angular and wavelength dependent terms for computational efficiency. Third, QUID uses OpenGL supported texture mapping to simulate pseudo curved surface reflectance. The size of the glint texture is controlled by paint/surface properties and the surface normals at the facet's vertices. QUID generates IR radiance maps, in-band and spectral signatures for high level of detail targets with thousands of facets. Model features are illustrated with radiance and radiance contrast images of aircraft.
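The factorization can be sketched, in simplified and assumed notation rather than QUID's exact equations, as splitting each facet's radiance into wavelength-dependent factors that are integrated over the band once and cheap angular factors evaluated every frame:

L(\lambda,\theta) \;\approx\; \varepsilon(\lambda)\,L_{bb}(\lambda,T)\,f_e(\theta) \;+\; \rho(\lambda)\,E(\lambda)\,f_r(\theta),

where the emissive and reflective wavelength terms are precomputed per material and temperature, leaving only the angular terms f_e and f_r to be evaluated at animation rates.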
Ultraviolet (UV) extensions have been incorporated in the Real-time IR/EO Scene Simulator to support hardware-in-the-loop (HWIL) testing of UV missile warning systems. The UV extensions add the capability to model missile plume signatures and off-axis atmospheric scattering in the solar blind UV spectral region (200 - 400 nm). Preliminary testing and validation of the UV rendering algorithms were performed. Simulated UV missile signatures were compared to measured static and free-flight test data. This work was performed to support the test and evaluation of modern missile warning systems at Wright Laboratory's Integrated Defensive Avionics Lab. This paper will discuss development of HWIL testing capability for UV missile warning receiver systems. Requirements for real-time UV simulation will be defined, and a real-time architecture that addresses these requirements will be discussed. Specifically, the development of real-time UV rendering algorithms to support modeling of atmospherics and backgrounds in the UV solar blind wavelength region will be outlined. Issues regarding implementation of spatial UV scattering effects in a real-time rendering environment will be addressed. Development of the UV sensor model, and its potential implementation in a real-time hardware/software direct injection device, will be illustrated. The results of some preliminary validation using laboratory and live firing test data will also be discussed.
Hardware-in-the-loop testing of IR/EO sensor systems requires the application of real-time IR/EO simulation/stimulation technology. Stimulation of the IR/EO system under test can be achieved by either scene projection or by direct signal injection. Although less intrusive, projection technologies are (to date) somewhat immature, thus requiring the use of direct-injection approaches for many sensor test applications. Even where projection can be applied, direct-injection testing provides additional flexibility and test capabilities which can augment the projection-based testing. The development of multiple, system-specific sensor interfaces provides one solution, however it is a very costly approach in terms of both time and money. What is needed is a reconfigurable, universal programmable interface (UPI), which can be utilized with minimal changes to stimulate a wide variety of IR/EO sensor systems. A UPI can also be utilized during the sensor development phase to evaluate and refine sensor designs. An opportunity now exists to leverage emerging trends in technology to develop such an interface, given a clear understanding of the processing and interface requirements. This paper discusses the development of processing and interface requirements for the UPI, as well as the resulting system design concept which will be used for its implementation.
The technology for 3D model design of real world scenes and its photorealistic rendering are current topics of investigation. Such technology is attractive for a wide variety of applications: military mission planning, crew training, civil engineering, architecture, and virtual reality entertainment, to name just a few. 3D photorealistic models of urban areas are now often discussed as an upgrade from existing 2D geographic information systems. The feasibility of generating site models with fine detail depends on two main factors: the available source dataset and computing resources. In this paper a PC based technology is presented with which scenes of medium resolution (scale of 1:1000) can be constructed. The datasets used are gray level aerial stereo pairs of photographs (scale of 1:14000) and true color ground photographs of buildings (scale ca. 1:1000). True color terrestrial photographs are also necessary for photorealistic rendering, which greatly improves human perception of the scene.
A challenging problem associated with performing hardware-in-the-loop tests of imaging infrared seekers is projecting images that are spatially realistic. The problem is complicated by the fact that the targets may be small and unresolved at acquisition and grow to fill the field of view during the final guidance updates. Although characteristics of the projection system are usually thought of as determining the spatial realism, the imagery used to drive the projector is also important. For a pixelized projector, the driving imagery must be sampled at a rate determined by the sample spacing of the pixels in the projector. If the scenes contain important information that is small compared to the projector pixel spacing (that is, if they have important information at high spatial frequencies), then information may be lost in the sampling process if the images are not adequately bandlimited. This bandlimiting can be accomplished by prefiltering the scenes. At acquisition, targets are usually small; thus, prefiltering is necessary to preserve information about the target. Without such prefiltering, for example, infinitesimally small targets would never be seen unless they just happened to be at the exact location where the scene is sampled for a projector pixel. This paper reports the results of a study of various filters that might be used for prefiltering synthetic imagery generated to drive projectors in the KHILS facility. Projector and seeker characteristics typical of the KHILS facility were adopted for the study. Since the radiance produced by projectors is always positive, filters that can produce negative values were not considered. Figures of merit were defined based on sensor-measured quantities such as radiant intensity, centroid, and spot size. The performance of prefilters of various shapes and sizes, for typical projector and seeker characteristics, will be reported.
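A small sketch of the kind of evaluation described is given below: apply a non-negative prefilter to a finely sampled point-like target, resample at the projector pixel pitch, and compute the figures of merit. The filter shape, sizes, and pitches are illustrative assumptions, not the KHILS parameters.

import numpy as np
from scipy.ndimage import convolve

def nonneg_gaussian(size, sigma):
    """A non-negative prefilter kernel (filters with negative lobes are
    excluded because projected radiance cannot go below zero)."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def figures_of_merit(img):
    """Sensor-style metrics used to score a prefilter: total radiant
    intensity, centroid, and RMS spot size (illustrative definitions)."""
    total = img.sum() + 1e-12
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    cy, cx = (ys * img).sum() / total, (xs * img).sum() / total
    size = np.sqrt((((ys - cy) ** 2 + (xs - cx) ** 2) * img).sum() / total)
    return total, (cy, cx), size

# Finely sampled synthetic scene with a sub-projector-pixel target.
fine = np.zeros((512, 512)); fine[301, 187] = 1.0

# Prefilter, then sample at the projector pixel spacing (8 fine samples here).
prefiltered = convolve(fine, nonneg_gaussian(33, sigma=6.0), mode="constant")
projector_drive = prefiltered[::8, ::8]
print(figures_of_merit(projector_drive))   # compare against the unfiltered case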
Laboratory Test and Evaluation of imaging infrared (I2R) systems is being greatly enhanced through the use of the Electro-Optics Sensor Flight Evaluation Laboratory (EOSFEL) and the Electro-Optics Target Acquisition Sensor Evaluation Laboratory (EOTASEL) at the US Army Redstone Technical Test Center. In addition to other standard and future test support, these laboratories will be utilized to support tactical I2R missile system interoperability testing. The EOSFEL is a state-of-the-art, performance grade, Hardware-In-the-Loop test capability for in-band, closed-loop test and evaluation of optically guided missile seekers, guidance sections, and control sections. The EOTASEL is a class 100,000 clean room laboratory with state-of-the-art test capability for evaluating the performance of electro-optical target acquisition and fire control subsystems in a hardware/human-in-the-loop environment. With I2R missile systems being developed to work with electro-optical target acquisition subsystems, such as second generation Forward Looking Infrared sights, the need arises for testing the interoperability of these sensor subsystems within the cost effective confines of the laboratory. Interoperability testing today is performed at the system level in real-world field environments, which is very expensive, and identifying problems at that level is costly. This paper describes a realistic technique for performing high fidelity laboratory interoperability testing which utilizes the EOSFEL and EOTASEL, including two Dynamic Infrared Scene Projector systems, a five-axis flight motion simulator, a two-axis platform motion simulator, climatic chambers, supporting instrumentation, and computer control.
The Advanced Simulation Laboratory (ASL) at Lockheed Martin Electronics and Missiles in Orlando, Florida, has developed and integrated a suite of visualization tools designed to support and enhance hardware-in-the-loop simulation. State-of-the-art computer image generators with multiple channels and viewpoints using detailed, fully articulated models provide real-time views of the system under test. Workstations serve as symbol generators overlaying flight symbology and textual data on the virtual scene. A digital moving map displays mission progress, critical points and target symbology. A real-time data monitor program allows engineers to organize data logically for display during the run. National Instruments' LabView provides virtual gauges, meters, status indicators and even stripcharts. The ASL visualization tools are fully integrated into the lab video system. As a result, all visualization channels can be recorded and routed to any display in the lab, including the multi-screen, multimedia demonstration and briefing room. Video routing and host control for the image generators, symbol generators and digital map are under control of a real-time, scripted visualization control program. These visualization techniques have been used successfully in two HWIL simulation projects and are considered an integral part of any future projects.
The Arnold Engineering Development Center (AEDC) Scene Generation Test Capability program has completed the development of a laser based Direct Write Scene Generation (DWSG) facility that provides dynamic mission simulation testing for infrared (IR) Focal Plane Arrays (FPAs) and their associated signal processing electronics. The AEDC DWSG Focal Plane Array Test Capability includes lasers operating at 0.514, 1.06, 5.4, or 10.6 micrometers, and Acousto-Optic Deflectors (AODs) which modulate the laser beam position and amplitude. Complex Radio Frequency (RF) electronics controls each AOD by providing multi-frequency inputs. These inputs produce a highly accurate and independent multi-beam deflection, or `rake', that is swept across the FPA sensor under test. Each RF amplitude input to an AOD translates into an accurate and independent beam intensity in the rake. Issues such as scene fidelity, sensor frame rates, scenario length, and real-time laser beam position adjustments require RF control electronics that employ advanced analog and digital signal processing techniques and designs. By implementing flexible system architectures in the electronics, the overall capability of the DWSG to adapt to emerging test requirements is greatly enhanced. Presented in this paper is an overview of the signal processing methodology and designs required to handle the DWSG requirement. Further, the paper will summarize the current status of recent AEDC technology efforts tasked with the implementation of real-time and closed-loop scene manipulation, including sensor optical simulation, using the DWSG. The paper will describe a proof-of-principle demonstration which used high speed digital signal processors inherent in the DWSG electronic design to compute the rotation, translation, optical effects via convolution, and system calibration functions during scene projection. Additionally, recent concepts based on DWSG electronic designs to enable integrated multi-sensor testing will be presented. These concepts establish a method for the separate or simultaneous test and evaluation of different IR sensor types using various kinds of sensor stimulation. Examples of sensor stimulation include laser based projection such as DWSG or resistive-thermal arrays, and direct analog or digital signal injection schemes.
Holographic Interferometry has been successfully employed to characterize the materials and behavior of diverse types of structures under stress. Specialized variations of this technology have also been applied to define dynamic and vibration related structural behavior. Such applications of holographic techniques offer some of the most effective methods of modal and dynamic analysis available. Real-time dynamic testing of the modal behavior of aerodynamic control structures for advanced missile systems has always required advanced instrumentation for data collection in either actual flight test or wind-tunnel simulations. Advanced optical holographic techniques are alternate methods which define actual behavioral data on the ground in a noninvasive hardware-in-the-loop environment. These methods offer significant insight in both the development and subsequent operational test and modeling of advanced composite control structures and their integration with total vehicle system dynamics. Structures and materials can be analyzed with very low amplitude excitation, and the resultant data can be used to adjust the accuracy of mathematically derived structural models. Holographic Interferometry has offered a powerful tool to aid in the primary engineering and development of advanced graphite-epoxy fiber composite structures, which are finding increased use in advanced aerodynamic platforms. Smart weapon and missile control structure applications must consider environments where extremes in vibration and mechanical stresses can affect both operation and structural stability. These are ideal requisites for analysis using advanced holographic methods in the initial design and subsequent test of such advanced components. Holographic techniques are non-destructive, real-time, and definitive in allowing the identification of vibrational modes, displacements, and motion geometries. Deriving such information without having to resort to in-flight data collection methods can be crucial to the determination of mechanical configurations and designs, as well as critical operational parameters.
Resistor arrays are the leading technology for testing tactical imaging infrared sensors with a real-time Dynamic Infrared Scene Projector (DIRSP) system. The fundamental goal of a DIRSP system is to project `in-band' infrared imagery to a level of detail such that a Unit Under Test (UUT) perceives and responds to the synthesized scenes just as it would to real-world scenes. In the real world, these tactical scenes are continuous functions that contain both low and high spatial frequencies. Unfortunately, resistor arrays have a discrete number of elements, requiring a sampled version of the scenario. The output of the DIRSP is a stepwise continuous radiance distribution that is projected through the DIRSP optics, the UUT optics, and onto the UUT detector array. In many sensors, the UUT detector array produces a sampled version of the irradiance. This continuous to digital to continuous to digital system requires careful analysis regarding the aliasing that may result. Results of such an analysis are presented here. Specifically, the aliasing issues are addressed with results obtained for the typical case of a slightly undersampled sensor (regarded in testing as `natural' aliasing). The analysis indicates the scene projector's spatial frequency limit (i.e., its folding frequency) should exceed the average of the UUT sensor's cutoff spatial frequency and the spatial frequency cutoff of the scene pre-filter (or scene band limit if pre-filtering is not used). This constraint does not eliminate aliasing. Rather, it provides for the natural aliasing present in the sensor while avoiding spurious effects from unnatural aliasing in the creation and projection of the synthetic tactical scenes. The scene projector requirement developed in this work is applicable for tactical imagers and imaging missile seekers.
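The constraint stated in the abstract can be written compactly (notation assumed here):

f_{\mathrm{fold,proj}} \;\ge\; \frac{f_{c,\mathrm{uut}} + f_{c,\mathrm{pre}}}{2},

where f_{fold,proj} is the projector's folding (Nyquist) frequency, f_{c,uut} is the UUT sensor's cutoff spatial frequency, and f_{c,pre} is the cutoff of the scene pre-filter (or the scene band limit if no pre-filter is used).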
As cost becomes an increasingly important factor in the development and testing of infrared (IR) sensors and flight computer/processors, the need for accurate hardware-in-the-loop simulations is critical. In the past, expensive and complex dedicated scene generation hardware was needed to attain the fidelity necessary for accurately testing systems under test (SUT). Recent technological advances and innovative applications of established technologies are beginning to allow development of cost effective replacements for dedicated scene generators. These new scene generators are mainly constructed from commercial off-the-shelf (COTS) hardware and software components. At the U.S. Army Missile Command (MICOM) researchers have developed such a dynamic IR scene generator (IRSG) built around COTS hardware and software. The IRSG is being used to provide inputs to an IR scene projector for in-band sensor testing and for direct signal injection into the sensor or processor electronics. Using this `baseline' IRSG, up to 120 frames per second (Hz) of 12-bit intensity images are being generated at 640 by 640 pixel resolution. The IRSG SUT-to-target viewpoint is dynamically updated in real time by a six-degree-of-freedom SUT simulation executing on a facility simulation computer, synchronized with an external signal from the SUT hardware; system latency is compensated using a special purpose hardware component implemented on a single VME card. Multiple dynamic targets, terrain/backgrounds, countermeasures, and atmospheric effects are updated in real time by the facility simulation computer via a shared memory interface to the IRSG. The `next generation' IRSG is currently under development at MICOM using `next generation' COTS hardware and software. `Next generation' performance specifications are estimated to yield 16-bit intensity, 250 - 300 Hz frame rate, at 1024 X 1024 pixel resolution.
The present limit on the number of display elements (dixels) that can be fabricated on one suspended membrane resistive infrared emitter array is just over 250,000. There is an increasing demand for simulated infrared (3-5 micrometers and 7-12 micrometers) scenes made up from two to four times that many elements in order to increase the resolution. A method for increasing the apparent number of dixels in a dynamic IR image has been developed using a reflective mosaic image combiner. The heart of the image combiner is a wedge-shaped mirror with a razor-sharp edge. An experiment involving two arrays and one mirror was conducted to verify the basic concepts, and the results of that experiment are reported. The tasks of overcoming the many difficulties involved in making the mosaic image appear to be free of artifacts are discussed. The primary sources of difficulty are the mirror, the alignment of the images of the arrays to the mirror, the temperatures of the projected scenes, the individual emitter arrays, and the ability of nonuniformity correction to compensate for radiance decreases at the mosaic seam(s). This mosaic image combiner technique is important to the US Army STRICOM because it is sponsoring the construction of the Dynamic Infrared Scene Projector (DIRSP) for use at the Redstone Technical Test Center. One of the requirements of DIRSP is to project high fidelity images with high spatial resolution. The mosaic image combiner described in this paper is a relatively low risk approach to meeting that requirement using existing state-of-the-art emitter arrays.