KEYWORDS: Receivers, LIDAR, Sensors, Signal to noise ratio, Mirrors, Amplifiers, Optical amplifiers, Microelectromechanical systems, Transistors, Control systems
This paper presents research at the Army Research Laboratory (ARL) on a laser radar (LADAR) imager for surveillance from small unmanned air vehicles (UAV). The LADAR design is built around a micro-electro-mechanical system (MEMS) mirror and a low-cost pulsed erbium fiber laser to yield a low-cost, compact, and low-power system. In the simplest sense, the LADAR measures the time-of-flight of a short laser pulse to a target and back to determine its range. The two-axis MEMS mirror directs the light pulse to a point in the scene and establishes the angular direction to a pixel. The receiver looks over the entire region scanned by the laser and produces a voltage proportional to the amount of laser light reflected from the scene. The output of the receiver is sampled by an analog-to-digital converter. The net result is a data file containing a range and a horizontal and vertical angle that identify the position of every image voxel in the scene, along with its amplitude. These data are displayed on a computer using standard and stereo techniques to render a three-dimensional image of the scene. At this time, the LADAR operating parameters are set to form images of 256 (h) × 128 (v) pixels over a 15° × 7.5° field of view and a 50 m range swath at a 5-6 Hz frame rate to 160 m range. In the prior year we built an initial flight package and flew it in an autogyro, which yielded encouraging imagery of ground targets from an altitude of roughly 100 m. Here we discuss progress in improving the LADAR's performance to image from an altitude of 160 m and in increasing its mechanical robustness for extensive data collection activities.
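The output record described above (a range plus horizontal and vertical scan angles and an amplitude for each voxel) maps naturally onto a spherical-to-Cartesian conversion before 3-D rendering. The following is a minimal illustrative sketch; the angle conventions and function names are assumptions, not the ARL data format.

```python
import math

def voxel_to_xyz(range_m, az_deg, el_deg, amplitude):
    """Convert one LADAR return (range plus horizontal/vertical scan angles)
    into a Cartesian point for 3-D rendering.  Assumed conventions: azimuth
    measured from boresight in the horizontal plane, elevation from the
    horizontal plane; the actual sensor frame may differ."""
    az, el = math.radians(az_deg), math.radians(el_deg)
    x = range_m * math.cos(el) * math.sin(az)   # right of boresight
    y = range_m * math.cos(el) * math.cos(az)   # downrange
    z = range_m * math.sin(el)                  # up
    return (x, y, z, amplitude)

# Example: a return at 120 m, 3 deg right of and 1 deg above boresight, amplitude 0.42
print(voxel_to_xyz(120.0, 3.0, 1.0, 0.42))
```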
The ability to visualize sensor data, from either ground or airborne sensors, with respect to the associated 3-D terrain is powerful. Fusion3D is a software application for stereoscopic visualization of 3-D terrain data that was developed in the Image Processing Branch at the U.S. Army Research Laboratory. It uses a 3-D display, 3-D glasses, and a 3-D mouse to quickly view province-sized 3-D maps in stereo. It is capable of ingesting large 3-D datasets from a variety of sources and includes many features to aid a user in exploiting 3-D terrain data, such as route planning, mensuration, and line-of-sight analysis. Additionally, in a recent effort to further improve situational awareness, Fusion3D was modified to overlay real-time data from both ground and airborne sensors onto a 3-D terrain map. We are calling the result a 3-D Sensor Common Operating Picture (3-D Sensor COP). Discovery of sensor location and data across coalition assets allows for greater diversity of sensor use and improved data and sensor interoperability. In this presentation, we will show ground and airborne data, collected at a recent exercise, overlaid on a 3-D terrain map of an urban environment. Using this data collection, we will describe how an analyst would use the sensor and terrain data to improve their understanding of the environment.
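As an illustration of one listed feature, a line-of-sight check over gridded elevation data can be performed by sampling the sightline against the terrain height along the path. This is a generic sketch under assumed conventions (row/column indexing, heights in meters), not the Fusion3D implementation.

```python
import numpy as np

def line_of_sight(dem, obs_rc, obs_h, tgt_rc, tgt_h, n_samples=200):
    """Return True if the target is visible from the observer over a gridded
    digital elevation model (DEM).  obs_rc/tgt_rc are (row, col) indices;
    obs_h/tgt_h are heights above the local terrain in meters."""
    r0, c0 = obs_rc
    r1, c1 = tgt_rc
    z0 = dem[r0, c0] + obs_h
    z1 = dem[r1, c1] + tgt_h
    for t in np.linspace(0.0, 1.0, n_samples)[1:-1]:
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if dem[r, c] > z0 + t * (z1 - z0):   # terrain rises above the sightline
            return False
    return True

# Toy 1 m-resolution DEM with a 15 m ridge between observer and target
dem = np.zeros((100, 100))
dem[:, 50] = 15.0
print(line_of_sight(dem, (10, 10), 2.0, (10, 90), 2.0))   # False: the ridge blocks the view
```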
The Army Research Laboratory (ARL) has continued to research a short-range ladar imager for use on small unmanned ground vehicles (UGV) and, recently, small unmanned air vehicles (UAV). The current ladar brassboard is based on a micro-electro-mechanical system (MEMS) mirror coupled to a low-cost pulsed erbium fiber laser. It has a 5-6 Hz frame rate, an image size of 256 (h) × 128 (v) pixels, a 42° × 21° field of regard, 35 m range, eyesafe operation, and 40 cm range resolution with provisions for super-resolution. Experience with driving experiments on small ground robots and efforts to extend the use of the ladar to UAV applications have encouraged work to improve the ladar's performance. The data acquisition system can now capture range data from up to three return pulses per pixel (that is, the first, last, and largest returns), along with information such as elapsed time, operating parameters, and data from an inertial navigation system. We also describe the addition and performance of subsystems needed to obtain eye-safety certification. To meet the enhanced range requirement for the UAV application, we describe a new receiver circuit that improves the signal-to-noise ratio (SNR) several-fold over the existing design. Complementing this work, we discuss research to build a low-capacitance, large-area detector that may enable even further improvement in receiver SNR. Finally, we outline progress toward a breadboard ladar that demonstrates increased range to 160 m. If successful, this ladar will be integrated with a color camera and inertial navigation system to build a data collection package to determine imaging performance for a small UAV.
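To make the multi-return capture concrete, a minimal sketch of extracting the first, last, and largest returns from one pixel's digitized waveform might look like the following. The thresholding and sample-to-range conversion are illustrative assumptions; the actual acquisition logic is not described in the abstract.

```python
import numpy as np

def three_returns(waveform, threshold, sample_period_ns):
    """Pick the first, last, and largest above-threshold returns from one
    pixel's digitized receiver waveform and convert their sample indices to
    range via R = c*t/2.  A real system would add matched filtering and
    per-pulse peak finding; this shows only the bookkeeping."""
    c = 0.299792458                      # speed of light, m per ns
    idx = np.flatnonzero(waveform > threshold)
    if idx.size == 0:
        return None
    first, last = idx[0], idx[-1]
    largest = idx[np.argmax(waveform[idx])]
    to_range = lambda i: 0.5 * c * i * sample_period_ns
    return {"first_m": to_range(first), "last_m": to_range(last), "largest_m": to_range(largest)}

wf = np.zeros(512)
wf[100], wf[130], wf[300] = 0.8, 1.5, 0.6    # three synthetic returns
print(three_returns(wf, 0.5, 2.0))           # ranges of the first, last, and largest returns
```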
LADAR is among the pre-eminent sensor modalities for autonomous vehicle navigation. Size, weight, power, and cost constraints impose significant practical limitations on perception systems intended for small ground robots. In recent years, the Army Research Laboratory (ARL) developed a LADAR architecture based on a MEMS mirror scanner that fundamentally improves the trade-offs between these limitations and sensor capability. We describe how the characteristics of a highly developed prototype correspond to and satisfy the requirements of autonomous navigation and the experimental scenarios of the ARL Robotics Collaborative Technology Alliance (RCTA) program. In particular, the long maximum and short minimum range capability of the ARL MEMS LADAR makes it remarkably suitable for a wide variety of scenarios, from building mapping to the manipulation of objects at close range, including dexterous manipulation with robotic arms. A prototype system was installed on a small (approximately 50 kg) unmanned robotic vehicle as its primary mobility perception sensor. We present the results of a field test in which the perception information supplied by the LADAR system successfully accomplished the experimental objectives of an Integrated Research Assessment (IRA).
Future robots and autonomous vehicles require compact, low-cost laser detection and ranging (LADAR) systems for autonomous navigation. The Army Research Laboratory (ARL) recently demonstrated a brassboard short-range eye-safe MEMS scanning LADAR system for robotic applications. Boeing Spectrolab is transferring this technology under a cooperative research and development agreement (CRADA) and has built a compact MEMS scanning LADAR system with additional improvements in receiver sensitivity, the laser system, and the data processing system. Improved sensitivity, low cost, miniaturization, and low power consumption are the main goals for commercialization of this LADAR system. The receiver sensitivity has been improved by a factor of two using large-area InGaAs PIN detectors with low-noise amplifiers. The FPGA code has been updated to extend the range to 50 m and to detect up to three targets per pixel. Range accuracy has been improved through the implementation of an optical T-zero input line. A compact, commercially available erbium fiber laser operating at a 1550 nm wavelength is used as the transmitter, reducing the size of the LADAR system considerably relative to the ARL brassboard system. The computer interface has been consolidated so that image data and configuration data (configuration settings and system status) pass through a single Ethernet port. In this presentation we discuss the system architecture and future improvements to receiver sensitivity using avalanche photodiodes.
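The role of the optical T-zero input can be illustrated with the basic time-of-flight relation: subtracting the measured pulse-emission time removes trigger and laser-firing jitter that would otherwise appear as a range bias. The function below is a hedged sketch, not the FPGA implementation.

```python
def range_from_t0(t_return_ns, t_zero_ns):
    """Range from the interval between the optical T-zero reference (the
    sampled outgoing pulse) and the detected return: R = c * (t_ret - t0) / 2."""
    c = 0.299792458                       # speed of light, m per ns
    return 0.5 * c * (t_return_ns - t_zero_ns)

print(range_from_t0(t_return_ns=340.0, t_zero_ns=6.5))   # ~50 m
```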
KEYWORDS: LIDAR, Mirrors, Field programmable gate arrays, Sensors, Receivers, Transmitters, Data acquisition, Signal to noise ratio, Microelectromechanical systems, Robots
The Army Research Laboratory (ARL) is researching a short-range ladar imager for navigation, obstacle/collision avoidance, and target detection/identification on small unmanned ground vehicles (UGV). To date, commercial UGV ladars have been flawed by one or more factors, including low pixelization, insufficient range or range resolution, image artifacts, no daylight operation, large size, high power consumption, and high cost. ARL built a breadboard ladar based on a newly developed but commercially available micro-electro-mechanical system (MEMS) mirror coupled to a low-cost pulsed erbium fiber laser transmitter that largely addresses these problems. Last year we integrated the ladar and associated control software on an iRobot PackBot and distributed the ladar imagery data via the PackBot's computer network. The untethered PackBot was driven through an indoor obstacle course while displaying the ladar data in real time on a remote laptop computer over a wireless link. We later conducted additional driving experiments in cluttered outdoor environments. This year ARL partnered with General Dynamics Robotics Systems to start construction of a brassboard ladar design. This paper will discuss refinements to and the rebuild of the various subsystems, including the transmitter and receiver module, the data acquisition and data processing board, and software, that will lead to a more compact, lower cost, and better performing ladar. The current ladar breadboard has a 5-6 Hz frame rate, an image size of 256 (h) × 128 (v) pixels, a 60° × 30° field of regard, 20 m range, eyesafe operation, and 40 cm range resolution (with provisions for super-resolution or accuracy).
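As a rough consistency check (not stated in the paper), the quoted 40 cm range resolution corresponds to a two-way timing resolution of about 2.7 ns; if one assumes one range bin per ADC sample, that implies a sampling rate near 375 MS/s.

```latex
\Delta t = \frac{2\,\Delta r}{c} = \frac{2 \times 0.4\ \mathrm{m}}{3\times 10^{8}\ \mathrm{m/s}} \approx 2.7\ \mathrm{ns},
\qquad
f_s \approx \frac{1}{\Delta t} = \frac{c}{2\,\Delta r} \approx 375\ \mathrm{MS/s}.
```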
The Army Research Laboratory (ARL) is researching a short-range ladar imager for small unmanned ground vehicles for navigation, obstacle/collision avoidance, and target detection and identification. To date, commercial ladars for this application have been flawed by one or more factors, including low pixelization, insufficient range or range resolution, image artifacts, no daylight operation, large size, high power consumption, and high cost. In the prior year we conceived a scanned ladar design based on a newly developed but commercial MEMS mirror and a pulsed erbium fiber laser. We initiated construction and performed in-lab tests that validated the basic ladar architecture. This year we improved the transmitter and receiver modules and successfully tested a new low-cost and compact erbium laser candidate. We further developed the existing software to allow adjustment of operating parameters on the fly and display of the imaged data in real time. In our most significant achievement, we mounted the ladar on an iRobot PackBot and wrote software to integrate PackBot and ladar control signals and ladar imagery on the PackBot's computer network. We recently drove the PackBot remotely over an in-lab obstacle course while displaying the ladar data in real time over a wireless link. The ladar has a 5-6 Hz frame rate, an image size of 256 (h) × 128 (v) pixels, a 60° × 30° field of regard, 20 m range, eyesafe operation, and 40 cm range resolution (with provisions for super-resolution or accuracy). This paper will describe the ladar design and update progress in its development and performance.
The Army Research Laboratory (ARL) is researching a short-range ladar imager for small unmanned ground vehicles for navigation, obstacle/collision avoidance, and target detection and identification. To date, commercial ladars for this application have been flawed by one or more factors, including low pixelization, insufficient range or range resolution, image artifacts, no daylight operation, large size, high power consumption, and high cost. ARL conceived a scanned ladar design based on a newly developed but commercial MEMS mirror and a pulsed erbium fiber laser. The desired performance includes a 6 Hz frame rate, an image size of 256 (h) × 128 (v) pixels, a 60° × 30° field of regard, 20 m range, eyesafe operation, and 40 cm range resolution (with provisions for super-resolution or accuracy). The ladar will be integrated on an iRobot PackBot. To date, we have built and tested the transceiver mounted in the PackBot's arm-mounted sensor head. All other electronics, including the data acquisition and signal processing board, the power distribution board, and other smaller ancillary boards, are built and operating. We are now operating the ladar and working on software development. This paper will describe the ladar design and progress in its development and performance.
KEYWORDS: Visualization, Sensors, Free space, 3D modeling, 3D visualizations, Analytical research, Calibration, Visual process modeling, Signal processing, Particles
Aircraft in flight typically charge via electrostatic processes; this charge is the source of measurable electric fields. Nearby objects may perturb this field, so there is interest in developing electrostatic sensors that can sense nearby objects like power lines by measuring their effects on the field. A major obstacle to developing these sensors is that there are often large variations in the field due to other causes. The problem is particularly difficult for helicopters, where the rotating blades cause large periodic variations in field intensity. The Army Research Laboratory (ARL) has developed a model that predicts these self-generated variations so they can be cancelled out and the smaller variations from other objects can be seen.
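The cancellation step described above amounts to subtracting a rotor-phase-locked prediction from the measured field. The sketch below shows only that bookkeeping, with a toy sinusoidal blade-passage model standing in for the ARL field model.

```python
import numpy as np

def residual_field(measured, rotor_phase, model):
    """Subtract the model-predicted, rotor-phase-locked field variation from the
    measured field so that smaller perturbations from nearby objects (e.g.,
    power lines) stand out.  `model` is any callable returning the predicted
    self-generated field at a given rotor phase; the ARL model is not reproduced here."""
    predicted = np.array([model(p) for p in rotor_phase])
    return np.asarray(measured) - predicted

phase = np.linspace(0, 4 * np.pi, 400)
measured = 5.0 * np.sin(4 * phase) + 0.2           # 0.2 = small external perturbation
blade_model = lambda p: 5.0 * np.sin(4 * p)        # toy four-per-rev blade-passage model
print(residual_field(measured, phase, blade_model)[:3])   # ~0.2 everywhere
```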
A new code is presented that was developed at ARL for visualization of the complex fields present on helicopters. The fields are different on virtually every part of the aircraft and vary with both the main and tail rotor positions. The code combines a large number of different quasi-static calculations in a single animated display where field strengths are "painted" as textures on the aircraft model. The code allows the user to view the time variations from any viewpoint in stereo. The stereo viewing is very important for clear and easy interpretation of the complex field patterns produced by the models.
Shipboard infrared search and track (IRST) systems can detect sea-skimming anti-ship missiles at long ranges, but cannot distinguish missiles from slowly moving false targets and clutter. In a joint Army-Navy program, the Army Research Laboratory (ARL) is developing a ladar to provide unambiguous range and velocity measurements of targets detected by the distributed aperture system (DAS) IRST being developed by the Naval Research Laboratory (NRL) under Office of Naval Research (ONR) sponsorship. By using the ladar's range and velocity data, false alarms and clutter objects will be distinguished from incoming missiles. Because the ladar uses an array receiver, it can also provide three-dimensional (3-D) imagery of potential threats at closer ranges in support of the force protection/situational awareness mission. The ladar development is being accomplished in two phases. In Phase I, ARL designed, built, and reported on an initial breadboard ladar for proof-of-principle static platform field tests. In Phase II, ARL was tasked to design and test an advanced breadboard ladar that corrected various shortcomings in the transmitter optics and receiver electronics and improved the signal processing and display code. The advanced breadboard will include a high-power laser source utilizing a long-pulse erbium amplifier built under contract. Because award of the contract for the erbium amplifier was delayed, final assembly of the advanced ladar has also been delayed. In the course of this year's work we built a "research receiver" to facilitate design revisions; combined with a low-power laser, it enabled us to demonstrate the viability of the components and subsystems comprising the advanced ladar.
A new technique is presented for visualizing high-resolution terrain elevation data. It produces realistic images at small scales, on the order of the data resolution, and works particularly well when natural objects are present. Better visualization at small scales opens up new applications, like site surveillance for security and Google Earth-type local search and exploration tasks that are now done with 2-D maps. The large 3-D maps are a natural fit for high-resolution stereo display.
The traditional technique drapes a continuous surface over the regularly spaced elevation values. This works well when displaying large areas or cities with large buildings, but falls apart at small scales or for natural objects like trees. The new technique visualizes the terrain as a set of disjoint square patches and is combined with an algorithm that identifies smooth areas within the scene. Where the terrain is smooth, such as in grassy areas, roads, parking lots, and rooftops, it warps the patches to create a smooth surface. For trees, shrubs, or other areas where objects are under-sampled, however, the patches are left disjoint. This has the disadvantage of leaving gaps in the data, but the human mind is very adept at filling in the missing information. It has the strong advantage of making natural terrain look realistic; trees and bushes look stylized but still natural and are easy to interpret. Also, it does not add artifacts to the map, such as filling in blank vertical walls where there are alcoves and other structure, or extending bridges and overpasses down to the ground.
The new technique is illustrated using very large 1-m resolution 3-D maps from the Rapid Terrain Visualization (RTV) program, and comparisons are made with traditional visualizations using these maps.
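A minimal sketch of the smooth-versus-disjoint decision is given below. The neighbor-spread test and its tolerance are illustrative assumptions; the actual smooth-area identification algorithm is more elaborate.

```python
import numpy as np

def classify_patches(dem, smooth_tol=0.25):
    """For each interior post of a regularly spaced elevation grid, decide
    whether its square patch should be warped into its neighbors (smooth
    terrain: grass, roads, lots, rooftops) or left disjoint (trees, shrubs,
    other under-sampled objects).  Test: local neighbor-height spread
    against a tolerance in meters."""
    smooth = np.zeros(dem.shape, dtype=bool)
    for r in range(1, dem.shape[0] - 1):
        for c in range(1, dem.shape[1] - 1):
            nbrs = dem[r - 1:r + 2, c - 1:c + 2]
            smooth[r, c] = (nbrs.max() - nbrs.min()) < smooth_tol
    return smooth   # True: warp to a continuous surface; False: leave the patch disjoint

dem = np.zeros((6, 6))
dem[2, 3] = 4.0                      # a lone tree-like spike in otherwise flat ground
print(classify_patches(dem))
```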
Shipboard infrared search and track (IRST) systems can detect sea-skimming anti-ship missiles at long ranges. Since IRST systems cannot measure range and line-of-sight velocity, they have difficulty distinguishing missiles from slowly moving false targets and clutter. In a joint Army-Navy program, the Army Research Laboratory (ARL) is developing a chirped amplitude modulation ladar to provide range and velocity measurements for tracking of targets handed over to it by the distributed aperture system IRST (DAS-IRST) under development at the Naval Research Laboratory (NRL) under Office of Naval Research (ONR) sponsorship. By using an array receiver based on Intevac Inc.'s Electron Bombarded Active Pixel Sensor (EBAPS) operating near 1.5 μm wavelength, ARL's ladar also provides 3D imagery of potential threats in support of the force protection mission. In Phase I, ARL designed and built a breadboard ladar system for proof-of-principle static platform field tests. In Phase II, ARL is improving the ladar system to process and display 3D imagery and range-Doppler plots in near real-time, to re-register frames in near real-time to compensate for platform and target lateral motions during data acquisition, and to operate with better quality EBAPS tubes with higher quantum efficiency and better spatial response uniformity. The chirped AM ladar theory, breadboard design, performance model results, and preliminary breadboard test results were presented last year at this conference. This paper presents the results of tests at the Navy's Chesapeake Bay Detachment facility. The improvements to the ladar breadboard since last year are also presented.
Over the past years, three-dimensional (3-D) terrain mapping technology has improved to the degree that it is now extremely useful for site surveillance applications. Resolution and accuracy in absolute (world) coordinates of 1 m or better are now available. City-size areas can be collected and high-quality maps produced in a few days at reasonable cost. Maps are already available for many sites of interest, and availability will increase as costs continue to drop and more applications are developed for them. The 3-D maps are useful in all phases of site security. I show how the maps aid planning, where they allow easy delineation of the areas to be monitored and of optimum sensor placements. I show how the maps can be used in target detection algorithms, where the portions of each sensor's field of view that fall outside the area to be monitored can be masked out to reduce false alarms. Also, since the range to the map is known for each pixel within the sensor field of view, the scale of any potential target is also known, and algorithms do not have to accommodate a wide range of potential target sizes. Finally, I show how electro-optical/infrared imagery can be projected onto the 3-D map to provide context. Previous detections, target tracks, and other information can also be added to the display to enhance its value. I have worked with a map of the Adelphi site of the U.S. Army Research Laboratory, projecting electro-optical and infrared imagery onto it with very encouraging results. I have also calculated sightlines for a radar considered for the roof of our main building. The tools are practical with current hardware at reasonable cost.
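The point about known target scale follows directly from the known range behind each pixel: the ground footprint of a pixel is roughly the range times the sensor's instantaneous field of view, so the expected extent of a target of given physical size is fixed per pixel. The helper below is a hypothetical illustration with assumed parameter names.

```python
def target_extent_pixels(target_size_m, range_m, ifov_mrad):
    """Apparent extent (in pixels) of a target of known physical size, given the
    range to the 3-D map surface behind the pixel and the sensor IFOV.  With
    range known from the map, detection algorithms can be tuned to a single
    expected target scale at each pixel."""
    footprint_m = range_m * ifov_mrad * 1e-3      # ground footprint of one pixel
    return target_size_m / footprint_m

# A 1.8 m person viewed at 250 m with a 0.5 mrad IFOV spans roughly 14 pixels
print(round(target_extent_pixels(1.8, 250.0, 0.5), 1))
```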
Shipboard infrared search and track (IRST) systems can detect sea-skimming anti-ship missiles at long ranges. Since IRST systems cannot measure range and line-of-sight (LOS) velocity, they have difficulty distinguishing missiles from false targets and clutter. In a joint Army-Navy program, the Army Research Laboratory (ARL) is developing a ladar based on the chirped amplitude modulation (AM) technique to provide range and velocity measurements of potential targets handed over by the distributed aperture system - IRST (DAS-IRST) being developed by the Naval Research Laboratory (NRL) and sponsored by the Office of Naval Research (ONR). Using the ladar's range and velocity data, false alarms and clutter will be eliminated, and the tracks of valid missile targets will be updated. By using an array receiver, ARL's ladar will also provide 3D imagery of potential threats for force protection/situational awareness. The concept of operation, the Phase I breadboard ladar design and performance model results, and the Phase I breadboard ladar development program were presented in paper 5413-16 at last year's symposium. This paper will present updated design and performance model results, as well as recent laboratory and field test results for the Phase I breadboard ladar. Implications of the Phase I program results on the design, development, and testing of the Phase II brassboard ladar will also be discussed.
The U.S. Army Research Laboratory (ARL) has developed a number of near-infrared, prototype laser detection and ranging (LADAR) systems based on the chirped, amplitude-modulated LADAR (CAML) architecture. The use of self-mixing detectors in the receiver, which have the ability to internally detect and down-convert modulated optical signals, has significantly simplified the LADAR design. Recently, ARL has designed and fabricated single-pixel, self-mixing, InGaAs-based, metal-semiconductor-metal detectors to extend the LADAR operating wavelength to 1.55 μm and is currently in the process of designing linear arrays of such detectors. This paper presents fundamental detector characterization measurements of the new 1.55 μm detectors in the CAML architecture and some insights on the design of 1.55 μm linear arrays.
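The down-conversion performed by the self-mixing detector can be summarized with the standard FM/cw relation: mixing the return, delayed by the round-trip time τ = 2R/c, against the reference chirp (bandwidth B swept over period T) and keeping the low-frequency product leaves a beat whose frequency is proportional to range. The notation below is generic rather than taken from the paper.

```latex
f_{\mathrm{IF}} \;=\; \frac{B}{T}\,\tau \;=\; \frac{2BR}{cT}
\qquad\Longrightarrow\qquad
R \;=\; \frac{c\,T\,f_{\mathrm{IF}}}{2B}.
```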
The Army Research Laboratory is researching system architectures and components required to build a 32x32 pixel scannerless ladar breadboard. The 32x32 pixel architecture achieves ranging based on a frequency modulation/continuous wave (FM/cw) technique implemented by directly amplitude modulating a near-IR diode laser transmitter with a radio frequency (RF) subcarrier that is linearly frequency modulated (i.e. chirped amplitude modulation). The backscattered light is focused onto an array of metal-semiconductor-metal (MSM) detectors where it is detected and mixed with a delayed replica of the laser modulation signal that modulates the responsivity of each detector. The output of each detector is an intermediate frequency (IF) signal (a product of the mixing process) whose frequency is proportional to the target range. Pixel read-out is achieved using code division multiple access techniques as opposed to the usual time-multiplexed techniques to attain high effective frame rates. The raw data is captured with analog-to-digital converters and fed into a PC to demux the pixel data, compute the target ranges, and display the imagery. Last year we demonstrated system proof-of-principle for the first time and displayed an image of a scene collected in the lab that was somewhat corrupted by pixel-to-pixel cross-talk. This year we report on system modifications that reduced pixel-to-pixel cross-talk and new hardware and display codes that enable near real-time stereo display of imagery on the ladar's control computer. The results of imaging tests in the laboratory will also be presented.
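The code-division readout can be illustrated by spreading each pixel's output with its own ±1 chip sequence and recovering the pixels by correlating the single summed channel against those codes. The codes, chip count, and signal model below are invented for illustration and are not the breadboard's actual readout parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def cdma_demux(summed_signal, codes):
    """Recover per-pixel outputs from one summed channel when each pixel has been
    spread by its own +/-1 chip sequence.  Correlating against each code separates
    the pixels because the codes are (nearly) orthogonal."""
    n_chips = codes.shape[1]
    return codes @ summed_signal / n_chips

# Toy example: 4 pixels, random +/-1 codes, constant per-pixel amplitudes
codes = rng.choice([-1.0, 1.0], size=(4, 4096))
amplitudes = np.array([1.0, 0.5, 2.0, 0.0])
summed = amplitudes @ codes                       # what a single shared ADC would see
print(np.round(cdma_demux(summed, codes), 2))     # approximately [1.0, 0.5, 2.0, 0.0]
```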
KEYWORDS: LIDAR, Sensors, Receivers, Modulation, Interference (communication), Signal to noise ratio, Signal detection, Signal processing, Prototyping, Optical amplifiers
The U.S. Army Research Laboratory (ARL) is investigating a ladar architecture based on FM/cw radar principles, whereby the range information is contained in the low-frequency mixing product derived by mixing a reference ultra-high frequency (UHF) chirp with an optically detected, time-delayed UHF chirp scattered from a target. ARL is also investigating the use of metal-semiconductor-metal (MSM) detectors as unique self-mixing detectors, which have the ability to internally detect and down-convert the modulated optical signals. ARL has recently incorporated a 1x32 element linear MSM self-mixing detector array into a prototype FM/cw ladar system and performed a series of characterization and outdoor image collection experiments using this prototype. This paper discusses the basic performance of the prototype system and presents some fundamental measurements as well as ladar imagery taken on the ARL Adelphi campus.
The Army Research Laboratory is researching a focal plane array (FPA) ladar architecture that is applicable to smart munitions, reconnaissance, face recognition, robotic navigation, and other applications. Here we report on progress and test results attained over the past year related to the construction of a 32x32 pixel FPA ladar laboratory breadboard. The near-term objective of this effort is to evaluate and demonstrate an FPA ladar using chirped amplitude modulation; knowledge gained will then be used to build a field-testable version with a larger array format. The ladar architecture achieves ranging based on a frequency modulation/continuous wave technique implemented by directly amplitude modulating a near-IR diode laser transmitter with a radio frequency (rf) subcarrier that is linearly frequency modulated (chirped amplitude modulation). The diode's output is collected and projected to form an illumination field in the downrange image area. The returned signal is focused onto an array of optoelectronic-mixing metal-semiconductor-metal detectors, where it is detected and mixed with a delayed replica of the laser modulation signal that modulates the responsivity of each detector. The output of each detector is an intermediate frequency (IF) signal resulting from the mixing process, whose frequency is proportional to the target range. This IF signal is continuously sampled over a period of the rf modulation. Following this, a signal processor calculates the discrete fast Fourier transform of the IF waveform in each pixel to establish the ranges and amplitudes of all scatterers.
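The per-pixel processing described above (an FFT of the IF waveform followed by a frequency-to-range conversion) can be sketched as follows; the chirp parameters, windowing, and single-peak picking are assumptions for illustration.

```python
import numpy as np

def pixel_range_fft(if_samples, sample_rate_hz, chirp_bw_hz, chirp_period_s):
    """Estimate the dominant scatterer range in one pixel from its sampled
    intermediate-frequency (IF) waveform.  The FFT peak gives the beat frequency;
    for a linear chirp of bandwidth B over period T, R = c * T * f_IF / (2 B).
    A Hann window is applied; zero padding and multi-peak handling are omitted."""
    c = 2.998e8
    n = len(if_samples)
    spectrum = np.abs(np.fft.rfft(if_samples * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate_hz)
    f_if = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC bin
    return c * chirp_period_s * f_if / (2.0 * chirp_bw_hz)

# Toy example: B = 200 MHz chirp over T = 1 ms, target at 30 m -> f_IF near 40 kHz
fs, T, B = 1.0e6, 1.0e-3, 200.0e6
t = np.arange(int(fs * T)) / fs
f_if_true = 2 * B * 30.0 / (2.998e8 * T)
print(pixel_range_fft(np.cos(2 * np.pi * f_if_true * t), fs, B, T))   # ~30 m
```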