The Habitable Worlds Observatory (HWO) is a future NASA flagship mission that will use a segmented telescope and coronagraphic instruments to discover and characterize exoplanets, including exoEarths – Earth-like planets orbiting other stars. HWO will require extraordinary optical stability, with wavefront drift performance measured in picometers. This paper explores how active control of the telescope optics, using metrology systems that include laser distance gauges, segment edge sensors, and picometer-precision actuators, can provide the needed telescope stability. Together with wavefront sensing and deformable mirrors in the coronagraph, this approach can control the entire coronagraphic beam train to stabilize the electric field in the coronagraph. The HWO Technology Assessment Group is developing three “Exploratory Analytic Cases” (EACs), conceptual designs for HWO that differ in some respects, to provide a basis for detailed analysis. This paper addresses EAC1, a deployed-aperture concept that draws on JWST heritage. EAC1 uses nineteen 1.8-meter hexagonal segments to form its off-axis Primary Mirror (PM), as sketched in Figure 1. EAC2 will use fewer, larger “keystone” segments in a non-deployed off-axis PM configuration, and EAC3 will be a larger, on-axis deployed telescope using smaller keystone-shaped segments.
The Alignment and Phasing System (APS) of the Thirty Meter Telescope (TMT) will use Shack-Hartmann-type measurements to determine the alignment of the telescope mirrors as well as the relative alignment and shapes of the segments of the primary mirror (M1). These measurements are required to be made with high accuracy in order for the telescope to produce diffraction-limited images. As TMT commissioning time will be limited, full performance must also be achieved as quickly as possible, and routine operations times need to be minimized. From the earliest stages of the design, the TMT APS team has therefore emphasized work that ensures that APS will work as well and as close to out of the box as possible and reduces the associated risks. APS is based on the Phasing Camera System (PCS) with more than 60 years of combined operation between the two telescopes of the W. M. Keck Observatory. In addition to the vast experience available from this heritage, there has been a great deal of effort on algorithm and software development, analytical studies and simulations, experiments, and prototyping. Here, we describe some of these efforts and explain why we are confident that this critical subsystem of TMT will achieve its goals.
We extend our previous demonstration of the first on-sky primary mirror segment closed-loop control on Keck using a vector-Zernike wavefront sensor (vZWFS), which improved the Strehl ratio on the NIRC2 science camera by up to 10 percentage points. Segment co-phasing errors contribute to Keck contrast limits, and correcting them will be necessary for the segmented Extremely Large Telescopes and future space missions. The goal of the post-AO vZWFS on Keck is to monitor and correct segment co-phasing errors in parallel with science observations. The ZWFS is ideal for measuring phase discontinuities and is one of the most sensitive WFSs, but has limited dynamic range. The Keck vZWFS consists of a metasurface mask that imposes two different phase shifts on orthogonal polarizations, which are split into two pupil images, extending its dynamic range. We report on the vZWFS closed-loop co-phasing performance and early work towards understanding the interactions between the AO system and segment phasing. We compare the AO performance when co-phasing by aligning segment edges, as is currently done at Keck, with co-phasing by aligning to the average phase over the segments, as is done by the vZWFS.
We propose an approach for coarse alignment of a segmented space telescope using science instrument images. The recommended steps go from large post-launch rigid-body misalignments to within the capture range of coarse phasing, where segment piston error is the predominant residual wavefront error. These steps include five data collection and analysis methods: metrology capture, segment capture and identification, segment translation, segment stacking, and fine alignment. Using a proposed architecture for the NASA Habitable Worlds Observatory (HWO), we describe the details of our recommended approach for each telescope alignment step. We then compare this recommended sequence to alternative alignment progressions used in existing segmented testbeds and telescopes in terms of the number of data collections required. This model-based demonstration establishes that the recommended coarse and fine alignment sequence performs more efficiently in time and resource cost, handing off to coarse and fine phasing activities further along the telescope commissioning process.
Current approaches for phasing of segmented space telescopes have required complex dedicated optics and mechanisms, such as Dispersed Hartmann sensors or grisms. These methods do not scale well as the number of segments increases. The broadband phasing approach used at the Keck Observatory does scale well and can work on space telescopes without the need for any additional hardware. We show that this method implemented as white light interferometry (WLI), using a standard imaging detector and filters, has a capture range limited only by the range of the segment actuators and can easily phase the mirrors to within the capture range of single wavelength phasing methods. An analysis of the Keck broadband phasing performance is presented and used to develop a formula for implementation of WLI on other segmented telescopes. As an example, a WLI implementation for the NASA Habitable Worlds Observatory telescope is developed and demonstrated via detailed wave-optics simulations. The implementation, performance and limitations of the proposed WLI method are discussed in detail in the paper.
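To illustrate the broadband-coherence principle that WLI exploits, the sketch below simulates a two-beam fringe-contrast envelope as a segment is scanned in piston and locates the contrast peak; the filter bandpass, scan range, and step size are illustrative assumptions, not the Keck or HWO values.

```python
import numpy as np

# Hedged sketch: a piston step between two segments introduces an optical path
# difference of 2*delta on reflection. Integrating two-beam interference over a
# filter bandpass gives a fringe-contrast envelope that peaks when the step is
# nulled; scanning the segment actuator and locating that peak recovers the
# piston over a capture range set by the scan, not by the wavelength.

lam0, dlam = 800e-9, 100e-9                   # assumed filter center and bandwidth
lams = np.linspace(lam0 - dlam/2, lam0 + dlam/2, 200)

def broadband_contrast(step):
    """Fringe contrast across an inter-segment edge for piston step 'step' [m]."""
    phase = 4*np.pi*step/lams                  # OPD = 2*step (reflection)
    return np.abs(np.mean(np.exp(1j*phase)))

true_piston = 1.3e-6                           # unknown step to be nulled [m]
offsets = np.linspace(-10e-6, 10e-6, 2001)     # commanded actuator scan
contrast = [broadband_contrast(true_piston + d) for d in offsets]
best = offsets[int(np.argmax(contrast))]       # offset giving maximum coherence
print(f"estimated piston: {-best*1e6:.2f} um (true {true_piston*1e6:.2f} um)")
```

The envelope width is roughly the coherence length of the filter (lam0**2/dlam, a few microns here), which is why a broadband scan can hand off cleanly to single-wavelength phasing.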
The Habitable Worlds Observatory will have uniquely stringent wavefront stability requirements, in the single-digit picometers for observations lasting days, to preserve coronagraph contrast for imaging Earth-like exoplanets. This need will be addressed using high-precision Wavefront Sensing and Control (WFSC) methods, including continuous picometer-precision metrology and control of the Optical Telescope Assembly (OTA). This paper reviews methods for initializing and maintaining the OTA wavefront, evolved from those used for the James Webb Space Telescope, but extended to much higher precision. It concludes by identifying performance targets for WFSC technology development, to help guide NASA technology investments.
Primary mirror segment shape correction via Warping Harness (WH) control adjustment is key to obtaining the required image performance of the Thirty Meter Telescope (TMT). We analyzed two separate experimental activities to better predict the segment WH performance. First, we took measurements of WH influence functions and Singular Value Decomposition (SVD) modes on a prototype TMT segment and compared these to model predictions. Second, we applied the TMT control algorithm on-sky at the Keck Observatory during their segment exchange and warping activities. We then used these measurements to improve our WH control simulations to include the observed effects. Altogether, the prototype segment measurements, on-sky TMT control algorithm measurements, and detailed simulation helped to better predict segment correction performance for TMT.
KEYWORDS: Systems modeling, Interfaces, Thirty Meter Telescope, Control systems, Databases, Systems engineering, Cameras, Mirrors, Model-based design, Astronomy
This paper presents a novel method for verifying interfaces and generating interface control documents (ICDs) from a system model in SysML™. In systems and software engineering, ICDs are key artifacts that specify the interface(s) to a system or subsystem, and are used to control the documentation of these interfaces. ICDs enable independent teams to develop connecting systems that use the specified interfaces. In the context of the Thirty Meter Telescope (TMT), interface control documents also act as contracts for delivered subsystems. The Alignment and Phasing system (APS) is one such subsystem. APS is required to implement a particular interface, and formulates requirements for the interfaces to be provided by other components of TMT that interface with APS. As the design of APS matures, these interfaces are frequently refined, making it necessary for related ICDs to be updated. In current systems engineering practice, ICDs are maintained manually. This manual maintenance can lead to a loss in integrity and accuracy of the documents over time, resulting in the documents no longer reflecting the actual state of the interfaces of a system. We show how a system model in SysML™ can be used to generate ICDs automatically. The method is demonstrated through application to interface control documents pertaining to APS. Specifically, we apply the method to the interface of APS to the primary mirror control system (M1CS) and of APS to the Telescope Control System (TCS). We evaluate the newly introduced method through application to two case studies.
We present an estimate of the optical performance of the Thirty Meter Telescope (TMT) after execution of the full telescope alignment plan. The TMT alignment is performed by the Global Metrology System (GMS) and the Alignment and Phasing System (APS). The GMS first measures the locations of the telescope optics and instruments as a function of elevation angle. These initial measurements will be used to adjust the optics positions and build initial elevation look-up tables. Then the telescope is aligned using starlight as the input for the APS at multiple elevation angles. APS measurements are used to refine the telescope alignment to build elevation and temperature dependent look-up tables. Due to the number of degrees of freedom in the telescope (over 10,000), the ability of the primary mirror to correct aberrations on other optics, the tight optical performance requirements and the multiple instrument locations, it is challenging to develop, test and validate these alignment procedures. In this paper, we consider several GMS and APS operational scenarios. We apply the alignment procedures to the model-generated TMT, which includes various quasi-static errors such as polishing errors, passive support errors, thermal and gravity deformations and installation position errors. Using an integrated optical model and Monte-Carlo framework, we evaluate the TMT's aligned states using optical performance metrics at multiple instrument and field of view locations. The optical performance metrics include the Normalized Point Source Sensitivity (PSSN), RMS wavefront error before and after Adaptive Optics (AO) correction, pupil position change, and plate scale distortion.
The narrowband segment phasing algorithm that was originally developed at Keck was replaced many years ago by a broadband algorithm that, although slower and less accurate than the former, has proved to be much more robust. A thorough investigation into the lack of robustness of the narrowband algorithm has now shown that this results from systematic errors (∼ 20 nm on average) that are wavelength-dependent. We show that the seemingly continuous distribution of these chromatic errors in fact results from (at least) two independent causes. The largest and most problematic effects are due to “plateaus” of unremoved material that were covered by supports during the ion beam figuring of three of the segments, but other smaller chromatic effects are also shown to be present and these are not yet understood. If the purely chromatic effects can be eliminated, we show that the intrinsic accuracy of the narrowband algorithm is about 6 nm (surface).
The Keck telescope segments were manufactured by stressed mirror polishing of large circular pieces of Zerodur that were then cut into hexagons and finished by Ion Beam Figuring (IBF). It has long been believed that this process results in segments with little or no edge effects. As a result, this same general approach is planned for segment manufacturing for the Thirty Meter Telescope (TMT) and the European Extremely Large Telescope (E-ELT). However, recent measurements at the Keck telescope suggest that at least some of the Keck segments have significant aberrations within 60 mm of the edge. These aberrations impact the telescope phasing and the overall telescope image quality. We present interferometric measurements of multiple Keck segments, characterizing the surface errors near the edges over spatial periods from ~5 cm down to ~1 mm. We show that the largest phasing and image quality effects are due to plateaus of unremoved material, left behind after IBF as a result of obscuration by the IBF supports. Apart from these plateaus, the edge quality is relatively good, though not as good as in the segment interiors. Some residual phasing and image quality effects remain, and these are not currently understood.
The narrowband phasing algorithm that was originally developed at Keck has largely been replaced by a broadband algorithm that, although it is slower and less accurate than the former, has proved to be much more robust. A systematic investigation into the lack of robustness of the narrowband algorithm has shown that it results from systematic errors (of order 20 nm) that are wavelength-dependent. These errors are not well-understood at present, but they do not appear to arise from instrumental effects in the Keck phasing cameras, or from the segment coatings. This leaves high spatial frequency aberrations or scattering within 60 mm of the segment edges as the most likely origin of the effect.
The Alignment and Phasing System (APS) is responsible for the optical alignment, via starlight, of the approximately 12,000 degrees of freedom of the primary, secondary and tertiary mirrors of the Thirty Meter Telescope (TMT). APS is based on the successful Phasing Camera System (PCS) used to align the Keck Telescopes. Since the successful APS conceptual design in 2007, work has concentrated on risk mitigation, use case generation, and alignment algorithm development and improvement. Much of the risk mitigation effort has centered around development and testing of prototype APS software which will replace the current PCS software used at Keck. We present an updated APS design and example use cases, and discuss the risk mitigation efforts in detail.
KEYWORDS: Systems engineering, Observatories, Computer aided design, Control systems, Data modeling, Systems modeling, Performance modeling, Databases, Interfaces, Thirty Meter Telescope
This paper provides an overview of the system design, architecture, and construction phase system engineering processes of the Thirty Meter Telescope project. We summarize the key challenges and our solutions for managing TMT systems engineering during the construction phase. We provide an overview of system budgets, requirements and interfaces, and the management thereof. The requirements engineering processes, including verification and plans for collection of technical data and testing during the assembly and integration phases, are described. We present configuration, change control and technical review processes, covering all aspects of the system design including performance models, requirements, and CAD databases.
We have developed a system model using the System Modeling Language (SysML) for the Alignment and Phasing System (APS) on the Thirty Meter Telescope (TMT). APS is a Shack-Hartmann wave-front sensor that will be used to measure the alignment and phasing of the primary mirror segments, and the alignment of the secondary and tertiary mirrors. The APS system model contains the flow-down of the Level 1 TMT requirements to APS (Level 2) requirements, and from there to the APS sub-systems (Level 3) requirements. The model also contains the operating modes and scenarios for various activities, such as maintenance alignment, post-segment exchange alignment, and calibration activities. The requirements flow-down is captured in SysML requirements diagrams, and we describe the process of maintaining the DOORS database as the single-source-of-truth for requirements, while using the SysML model to capture the logic and notes associated with the flow-down. We also use the system model to capture any needed communications from APS to other TMT systems, and between the APS sub-systems. The operations are modeled using SysML activity diagrams, and will be used to specify the APS interface documents. The modeling tool can simulate the top level activities to produce sequence diagrams, which contain all the communications between the system and subsystem needed for that activity. By adding time estimates for the lowest level APS activities, a robust estimate for the total time on-sky that APS requires to align and phase the telescope can be obtained. This estimate will be used to verify that the time APS requires on-sky meets the Level 1 TMT requirements.
We have developed an integrated optical model of the semi-static performance of the Thirty Meter Telescope. The model includes surface and rigid body errors of all telescope optics as well as a model of the Alignment and Phasing System Shack-Hartmann wavefront sensors and control algorithms. This integrated model allows for simulation of the correction of the telescope wavefront, including optical errors on the secondary and tertiary mirrors, using the primary mirror segment active degrees of freedom. This model provides the estimate of the predicted telescope performance for system engineering and error budget development. In this paper we present updated performance values for the TMT static optical errors in terms of Normalized Point Source Sensitivity and RMS wavefront error after Adaptive Optics correction. As an example of a system level trade, we present the results from an analysis optimizing the number of Shack-Hartmann lenslets per segment. We trade the number of lenslet rings over each primary mirror segment against the telescope performance metrics of PSSN and RMS wavefront error.
We quantify the accuracy of the Keck telescope segment surface figure measurements made on sky by the Phasing Camera System (PCS), a Shack-Hartmann wavefront sensor that uses long integration times to average over the effects of atmospheric turbulence. These measurements are used to determine the settings for warping harnesses that significantly reduce the segment surface errors. When a series of six measurements is performed on the same segment in rapid succession, the Root Mean Square (RMS) segment surface, as reconstructed by 2nd through 4th order Zernike polynomials, is determined with an accuracy of 6.0 ± 3.2 nm (error on the mean). However, when we compare measurements on the same segment separated by several hours, the inferred surface RMS accuracy is 9.0 ± 5.0 nm, or 50% larger. This suggests that there are systematic errors on the order of 7 nm that vary throughout the night. In this paper we investigate and quantify the potential causes of these systematic errors, which, together with statistical errors, constitute a fundamental limit for the performance of segment warping harnesses. Such measurements are currently the baseline warping harness inputs for the Thirty Meter Telescope and the European Extremely Large Telescope.
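A short sketch of the error bookkeeping implied by the numbers quoted above; the quadrature subtraction assumes the short-timescale statistical errors and the slowly varying systematic term are independent.

```python
import numpy as np

# Hedged sketch: if rapid repeats of the same segment measurement scatter by
# sigma_short, while repeats separated by hours scatter by sigma_long, the
# excess can be attributed to a slowly varying systematic component,
# sigma_sys = sqrt(sigma_long**2 - sigma_short**2). Values follow the text.

sigma_short = 6.0   # nm RMS, back-to-back repeats
sigma_long = 9.0    # nm RMS, repeats separated by several hours
sigma_sys = np.sqrt(sigma_long**2 - sigma_short**2)
print(f"implied systematic error ~ {sigma_sys:.1f} nm")   # about 7 nm
```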
The Laser Communication Relay Demonstration will feature a geostationary satellite communicating via optical links to multiple ground stations. The first ground station (GS-1) is the 1 m OCTL telescope at Table Mountain in California. The optical link will utilize pulse position modulation (PPM) and differential phase shift keying (DPSK) protocols. The DPSK link necessitates that adaptive optics (AO) be used to relay the incoming beam into the single-mode fiber that is the input of the modem. The GS-1 AO system will have two MEMS deformable mirrors to achieve the needed actuator density and stroke limit. The AO system will sense the aberrations with a Shack-Hartmann wavefront sensor using the light from the communication link’s 1.55 μm laser to close the loop. The system will operate day and night. The system’s software will be based on heritage software from the Palm 3000 AO system, reducing risk and cost. The AO system is being designed to work at r0 greater than 3.3 cm (measured at 500 nm and zenith) and at elevations greater than 20° above the horizon. In our worst-case operating conditions we expect to achieve Strehl ratios of over 70% (at 1.55 μm), which should couple 57% of the light into the single-mode DPSK fiber. This paper describes the conceptual design of the AO system and its predicted performance, and discusses some of the trades that were conducted during the design process.
KEYWORDS: Real-time computing, Actuators, Adaptive optics, Control systems design, Computing systems, Servomechanisms, Control systems, Cameras, Mirrors, Algorithm development
This paper reflects, from a computational perspective, on the experience gathered in designing and implementing real-time control of the PALM-3000 adaptive optics system currently in operation at the Palomar Observatory. We review the algorithms that serve as functional requirements driving the architecture developed, and describe key design issues and solutions that contributed to the system’s low compute latency. Additionally, we describe an implementation of dense matrix-vector multiplication for wavefront reconstruction that exceeds 95% of the maximum achievable bandwidth on an NVIDIA GeForce 8800 GTX GPU.
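As a rough illustration of the reconstruction step being accelerated (not the PALM-3000 implementation itself), the sketch below forms DM commands as a dense matrix-vector product of a reconstructor with the measured slope vector; the matrix dimensions are representative assumptions.

```python
import numpy as np

# Hedged sketch of the vector-matrix-multiply reconstruction that dominates AO
# real-time computation: commands = reconstructor @ slopes, repeated every
# frame. Sizes are representative of a PALM-3000-class system.

n_slopes = 2 * 64 * 64          # x/y slopes from a 64x64-subaperture WFS
n_actuators = 3388              # high-order DM actuators

rng = np.random.default_rng(0)
R = rng.standard_normal((n_actuators, n_slopes), dtype=np.float32)  # reconstructor
s = rng.standard_normal(n_slopes, dtype=np.float32)                 # slope vector

commands = R @ s                # the per-frame dense matrix-vector multiply

# Rough arithmetic-cost estimate: ~2*M*N flops per frame.
flops_per_frame = 2 * n_actuators * n_slopes
print(f"{flops_per_frame/1e6:.0f} Mflop per frame, "
      f"{flops_per_frame*2e3/1e9:.0f} Gflop/s at a 2 kHz frame rate")
```

Because every output element touches the entire slope vector, the operation is memory-bandwidth bound, which is why sustaining a large fraction of peak GPU bandwidth is the relevant figure of merit.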
The first of a new generation of high actuator density AO systems developed for large telescopes, PALM-3000 is optimized for high-contrast exoplanet science but will support operation with natural guide stars as faint as V ~ 18. PALM-3000 began commissioning in June 2011 on the Palomar 200" telescope and has to date over 60 nights of observing. The AO system consists of two Xinetics deformable mirrors, one with 66 by 66 actuators and another with 21 by 21 actuators, a Shack-Hartmann WFS with four pupil sampling modes (ranging from 64 to 8 samples across the pupil), and a full vector matrix multiply real-time system capable of running at 2 kHz frame rates. We present the details of the completed system and initial results. Operating at 2 kHz with 8.3 cm pupil sampling on-sky, we have achieved a K-band Strehl ratio as high as 84% in ~1.0 arcsecond visible seeing.
Modeling is an integral part of systems engineering. It is utilized in requirement validation, system verification, as well as for supporting design trade studies. Modeling highly complex systems poses particular challenges, including the definition and interpretation of system performance, and the combined evaluation of physical processes spanning a wide range of time frames. Our solution is based on statistical interpretation of system performance and a unique image quality metric developed by TMT. The Stochastic Framework and Point Source Sensitivity allow us to properly estimate and combine the optical effects of various disturbances and telescope imperfections.
The Thirty Meter Telescope (TMT) is a Ritchey-Chrétien optical telescope with a 30-meter diameter primary mirror made up of 492 hexagonal segments. Such a large and complex optical system requires detailed modeling of the optical performance during the design phase. An optical modeling computational framework has been developed to support activities related to wavefront & image performance prediction. The model includes effects related to mirror shape sensing & control, mirror alignment & phasing, M1 segment control, low order wavefront correction, adaptive optics simulation for high order wavefront correction, and high contrast imaging. Here we give an overview of this optical simulation framework, the modeling tools and algorithms that are used, and a set of sample analyses. These tools have been used in many aspects of the system design process from mirror specification to instrument & sensor design to algorithm development and beyond.
The segments in the Keck telescopes are routinely phased using a Shack-Hartmann wavefront sensor with subapertures that span adjacent segments. However, one potential limitation to the absolute accuracy of this technique is that it relies on a lenslet array (or a single lens plus a prism array) to form the subimages. These optics have the potential to introduce wavefront errors and stray reflections at the subaperture level that will bias the phasing measurement. We present laboratory data to quantify this effect, using measured errors from Keck and two other lenslet arrays. In addition, as part of the design of the Thirty Meter Telescope Alignment and Phasing System, we present a preliminary investigation of a lenslet-free approach that relies on Fresnel diffraction to form the subimages at the CCD. Such a technique has several advantages, including the elimination of lenslet aberrations.
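The sketch below illustrates the lenslet-free idea in one dimension: a subaperture field straddling a segment edge is simply allowed to propagate, and Fresnel diffraction forms a "subimage" whose asymmetry encodes the edge step. The geometry, wavelength, and 100 nm step are illustrative assumptions, not APS design values.

```python
import numpy as np

# Hedged 1-D sketch: angular-spectrum (Fresnel) propagation of a subaperture
# that contains a reflective piston step; no lenslet is involved, so no lenslet
# aberrations enter the measurement.

n, dx = 8192, 5e-6                    # samples and pitch [m] (40.96 mm window)
lam, z = 650e-9, 0.3                  # wavelength and propagation distance [m]
a, step = 0.5e-3, 100e-9              # subaperture half-width, piston step [m]

x = (np.arange(n) - n/2) * dx
aperture = np.abs(x) < a
opd = np.where(x > 0, 2*step, 0.0)    # reflection doubles the path difference
field = aperture * np.exp(2j*np.pi*opd/lam)

fx = np.fft.fftfreq(n, d=dx)
H = np.exp(-1j*np.pi*lam*z*fx**2)     # Fresnel transfer function
subimage = np.abs(np.fft.ifft(np.fft.fft(field) * H))**2

# The left/right asymmetry of the diffracted pattern is the phasing signal.
left = subimage[(x > -2*a) & (x < 0)].sum()
right = subimage[(x > 0) & (x < 2*a)].sum()
print(f"left/right flux asymmetry: {(left - right)/(left + right):+.3f}")
```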
KEYWORDS: Telescopes, Point spread functions, Vignetting, Telescope design, Distortion, Error analysis, Systems modeling, Optical transfer functions, Space telescopes, Thirty Meter Telescope
The Normalized Point Source Sensitivity (PSSN) has previously been defined and analyzed as an On-Axis seeing-limited telescope performance metric. In this paper, we expand the scope of the PSSN definition to include Off-Axis field of view (FoV) points and apply this generalized metric for performance evaluation of the Thirty Meter Telescope (TMT). We first propose various possible choices for the PSSN definition and select one as our baseline. We show that our baseline metric has useful properties, including the multiplicative feature even when considering Off-Axis FoV points, which has proven to be useful for optimizing the telescope error budget. Various TMT optical errors are considered for the performance evaluation, including segment alignment and phasing, segment surface figures, temperature, and gravity, whose On-Axis PSSN values have previously been published by our group.
We evaluate how well the performance of the Thirty Meter Telescope (TMT) can be maintained against thermally induced errors during a night of observation. We first demonstrate that using look-up-table style correction for TMT thermal errors is unlikely to meet the required optical performance specifications. Therefore, we primarily investigate the use of a Shack-Hartmann Wavefront Sensor (SH WFS) to sense and correct the low spatial frequency errors induced by the dynamic thermal environment. Given a basic SH WFS design, we position single or multiple sensors within the telescope field of view and assess telescope performance using the JPL optical ray tracing tool MACOS for wavefront simulation. Performance for each error source, wavefront sensing configuration, and control scheme is evaluated using wavefront error, plate scale, pupil motion, pointing error, and the Point Source Sensitivity (PSSN) as metrics. This study provides insight into optimizing the active optics control methodology for TMT in conjunction with the Alignment and Phasing System (APS) and primary mirror control system (M1CS).
The PALM-3000 upgrade to the Palomar Adaptive Optics system will deliver extreme adaptive optics correction to a suite of three infrared and visible instruments on the 5.1 meter Hale telescope. PALM-3000 uses a 3388-actuator tweeter and a 241-actuator woofer deformable mirror, a wavefront sensor with selectable pupil sampling, and an innovative wavefront control computer based on a cluster of 17 graphics processing units to correct wavefront aberrations at scales as fine as 8.1 cm at the telescope pupil using natural guide stars. Many components of the system, including the science instruments and a post-coronagraphic calibration wavefront sensor, have already been commissioned on the sky. Results from a laboratory testbed used to characterize the remaining new components and verify all interfaces are reported. Deployment to Palomar Observatory is planned for August 2010, with first light expected in early 2011.
We describe the lab characterization of the new 3,388-actuator deformable mirror (DM3388) produced by Xinetics, Inc. for the PALM-3000 adaptive optics (AO) system under development by the Jet Propulsion Laboratory and Caltech Optical Observatories. This square-grid, 66-by-66-actuator mirror has the largest number of actuators of any deformable mirror currently available and will enable high-contrast imaging for direct exoplanet imaging science at the Palomar 200-inch Hale Telescope. We present optical measurements of the powered and unpowered mirror surface, influence functions, linearity of the actuators, and creep of the actuators. We also quantify the effect of changes in humidity.
In this communication we address the problem of post-coronagraphic wavefront reconstruction. In high-contrast imaging applications it is crucial to estimate the wavefront after the coronagraph, as close as possible to the science camera, in order to minimize non-common path errors. However, closing the loop on such a measurement is a difficult exercise, since several low-order modes have been cancelled by the coronagraph, leading to ill-posed inversion problems. Moreover, sensing at the science detector is an intrusive method that disrupts the course of the observations. The Gemini Planet Imager (GPI) calibration system, based on a post-coronagraphic interferometer, provides an estimate of mid to high spatial frequency aberrations that alleviates these two issues. However, such a measurement has an intrinsic limitation related to the differential path errors between the two arms of the interferometer. In this paper we show how to devise wavefront reconstruction algorithms that account for these differential path errors. We identify two regimes, relative and absolute wavefront sensing, that depend on the magnitudes of the aberrations and the design of the coronagraph. We illustrate the performance of each regime. Finally, we present results obtained with laboratory data during the validation phase.
We report on the preliminary design of W.M. Keck Observatory's (WMKO's) next-generation adaptive optics (NGAO) facility. This facility is designed to address key science questions including understanding the formation and evolution of today's galaxies, measuring dark matter in our galaxy and beyond, testing the theory of general relativity in the Galactic Center, understanding the formation of planetary systems around nearby stars, and exploring the origins of our own solar system. The requirements derived from these science questions have resulted in NGAO being designed to have near diffraction-limited performance in the near-IR (K-Strehl ~ 80%) over narrow fields (< 30" diameter) with modest correction down to ~ 700 nm, high sky coverage, improved sensitivity and contrast, and improved photometric and astrometric accuracy. The resultant key design features include multi-laser tomography to measure the wavefront and correct for the cone effect, open loop AO-corrected near-IR tip-tilt sensors with MEMS deformable mirrors (DMs) for high sky coverage, a high order MEMS DM for the correction of atmospheric and telescope static errors to support high Strehls and high contrast companion sensitivity, point spread function (PSF) calibration to benefit quantitative astronomy, a cooled science path to reduce thermal background, and a high-efficiency science instrument providing imaging and integral field spectroscopy.
The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, future wavefront sensing and control technologies and algorithms for active optical systems. The testbed includes a source projector for a broadband point source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as fast steering mirrors, deformable mirrors, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as initial results from the testbed hardware integration and tests.
The primary mirror segment aberrations after shape corrections with warping harness have been identified as the single largest error term in the Thirty Meter Telescope (TMT) image quality error budget. In order to better understand the likely errors and how they will impact the telescope performance we have performed detailed simulations. We first generated unwarped primary mirror segment surface shapes that met TMT specifications. Then we used the predicted warping harness influence functions and a Shack-Hartmann wavefront sensor model to determine estimates for the 492 corrected segment surfaces that make up the TMT primary mirror. Surface and control parameters, as well as the number of subapertures, were varied to explore the parameter space. The corrected segment shapes were then passed to an optical TMT model built using the Jet Propulsion Laboratory (JPL) developed Modeling and Analysis for Controlled Optical Systems (MACOS) ray-trace simulator. The generated exit pupil wavefront error maps provided RMS wavefront error and image-plane characteristics like the Normalized Point Source Sensitivity (PSSN). The results have been used to optimize the segment shape correction and wavefront sensor designs as well as provide input to the TMT systems engineering error budgets.
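The sketch below illustrates the shape-correction fit at the heart of such simulations: a truncated-SVD least-squares solve for warping harness settings that best cancel a measured segment surface. The matrix sizes, influence functions, and noise level are placeholder assumptions, not TMT values.

```python
import numpy as np

# Hedged sketch: stack the warping harness influence functions as columns of A,
# then solve for the actuator commands that minimize the residual segment
# surface in a least-squares sense, truncating weak singular values.

rng = np.random.default_rng(1)
n_pix, n_act = 5000, 21                       # surface samples, harness actuators
A = rng.standard_normal((n_pix, n_act))       # influence functions (placeholder)
true_cmd = rng.standard_normal(n_act)
surface = 20.0 * (A @ true_cmd) + 5.0 * rng.standard_normal(n_pix)  # correctable + residual [nm]

U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-3 * s[0]                        # drop poorly sensed modes
pinv = (Vt.T[:, keep] / s[keep]) @ U[:, keep].T
commands = -pinv @ surface

residual = surface + A @ commands
print(f"segment surface RMS before/after fit: {surface.std():.1f} / {residual.std():.1f} nm")
```

The residual is set by the part of the surface outside the span of the influence functions (plus measurement noise), which is exactly why the post-warping segment figure remains a leading error-budget term.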
The PALM-3000 upgrade to the Palomar Adaptive Optics system on the 5.1 meter Hale telescope will deliver extreme adaptive optics correction in near-infrared wavelengths and diffraction-limited images in visible wavelengths. PALM-3000 will use a 3388-actuator tweeter and a 241-actuator woofer deformable mirror, a Shack-Hartmann wavefront sensor with selectable pupil sampling, and an innovative wavefront control computer based on a cluster of 17 graphics processing units to correct wavefront aberrations at scales as fine as 8.1 cm at the telescope pupil using natural guide stars. The system is currently undergoing integration and testing, with deployment at Palomar Observatory planned in early 2011. We present the detailed design of key aspects of the adaptive optics system, and the current status of the deformable mirror characterization, wavefront sensor performance, and testbed activities.
The primary, secondary and tertiary mirrors of the Thirty Meter Telescope (TMT), taken together, have approximately 12,000 degrees of freedom in optical alignment. The Alignment and Phasing System (APS) will use starlight and a variety of Shack-Hartmann based measurement techniques to position the segment pistons, tips, and tilts, segment figures, secondary rigid body motion, secondary figure and the tertiary figure to correctly align the TMT. We present a conceptual design of the APS including the requirements, alignment modes, predicted performance, software architecture, and an optical design.
An autonomous wavefront sensing and control software suite (APRC) has been developed as a method to calibrate the internal static errors in the Palomar Adaptive Optics system. An image-based wavefront sensing algorithm, Adaptive Modified Gerchberg-Saxton Phase Retrieval (MGS), provides wavefront error knowledge upon which actuator command voltages are calculated for iterative wavefront control corrections. This automated, precise calibration eliminates non-common path error to significantly reduce AO system internal error to the controllable limit of existing hardware, or can be commanded to prescribed polynomials to facilitate high contrast astronomy. System diagnostics may be performed through analysis of the wavefront result generated by the phase retrieval software.
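For orientation, the sketch below shows the core alternating-projection idea behind image-based phase retrieval. It is a plain Gerchberg-Saxton loop, not the MGS algorithm described above, which adds phase diversity (defocused images), weighting, and adaptive refinements; the aperture, aberration, and iteration count are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of classical Gerchberg-Saxton phase retrieval: given the known
# pupil amplitude and a "measured" focal-plane intensity, alternate between the
# two planes, enforcing the measured modulus in the focal plane and the pupil
# support in the pupil plane.

n = 128
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= 1.0

true_phase = 0.5 * (3*(X**2 + Y**2) - 2) * X * pupil        # coma-like aberration [rad]
psf_meas = np.abs(np.fft.fft2(pupil * np.exp(1j*true_phase)))**2

field = pupil.astype(complex)                               # flat initial guess
for _ in range(300):
    focal = np.fft.fft2(field)
    focal = np.sqrt(psf_meas) * np.exp(1j * np.angle(focal))  # keep measured modulus
    field = np.fft.ifft2(focal)
    field = pupil * np.exp(1j * np.angle(field))              # keep pupil support

model_psf = np.abs(np.fft.fft2(field))**2
mismatch = np.abs(model_psf - psf_meas).sum() / psf_meas.sum()
print(f"focal-plane intensity mismatch after retrieval: {mismatch:.2e}")
```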
We present a cost-effective scalable real-time wavefront control architecture based on off-the-shelf graphics processing units hosted in an ultra-low latency, high-bandwidth interconnect PC cluster environment composed of modules written in the component-oriented language nesC. We demonstrate that the architecture is capable of supporting the most computation- and memory-intensive wavefront reconstruction method (vector-matrix-multiply) at frame rates up to 2 kHz with latency under 250 μs for the PALM-3000 adaptive optics system, a state-of-the-art upgrade on the 5.1 meter Hale Telescope that consists of a 64x64 subaperture Shack-Hartmann wavefront sensor and a 3368 active actuator high order deformable mirror in series with a 349 actuator "woofer" DM. This architecture can easily scale up to support larger AO systems at higher rates and lower latency.
KEYWORDS: Telescopes, Optical transfer functions, Error analysis, Space telescopes, Point spread functions, Wavefronts, Systems modeling, Spatial frequencies, Computer simulations, Thirty Meter Telescope
We investigate a new metric, Normalized Point Source Sensitivity (PSSN), for characterizing the seeing limited performance of the Thirty Meter Telescope. As the PSSN metric is directly related to the photometric error of background limited observations, it truly represents the efficiency loss in telescope observing time. The PSSN metric properly accounts for the optical consequences of wavefront spatial frequency distributions due to different error sources, which makes it superior to traditional metrics such as the 80% encircled energy diameter. We analytically show that multiplication of individual PSSN values due to individual errors is a good approximation for the total PSSN when various errors are considered simultaneously. We also numerically confirm this feature for Zernike aberrations, as well as for the numerous error sources considered in the TMT error budget, using a ray optics simulator, Modeling and Analysis for Controlled Optical Systems. We also discuss other pertinent features of the PSSN including its relations to Zernike aberrations and RMS wavefront error.
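A minimal sketch of how a PSSN-like ratio can be evaluated, assuming the usual definition (integral of the squared PSF with the error present, normalized by the error-free atmosphere-plus-telescope case, computed here via Parseval from OTFs) and a Kolmogorov long-exposure atmospheric OTF. The grid size, r0, and the example aberration are illustrative, not TMT error-budget values.

```python
import numpy as np

# Hedged sketch: PSSN ~ sum(|OTF_error * OTF_atm|^2) / sum(|OTF_ideal * OTF_atm|^2),
# where the telescope OTF is the autocorrelation of the (aberrated) pupil and
# the atmosphere uses the standard long-exposure form exp(-3.44 (B/r0)^(5/3)).

n, D, r0, lam = 256, 30.0, 0.20, 500e-9
x = np.linspace(-D/2, D/2, n)
X, Y = np.meshgrid(x, x)
R = np.hypot(X, Y)
pupil = (R <= D/2).astype(float)

def telescope_otf(phase):
    """OTF as the autocorrelation of the complex pupil (zero-padded FFT)."""
    p = np.zeros((2*n, 2*n), complex)
    p[:n, :n] = pupil * np.exp(1j*phase)
    return np.fft.fftshift(np.fft.ifft2(np.abs(np.fft.fft2(p))**2))

# Atmospheric OTF sampled on the same baseline grid as the pupil autocorrelation.
db = D / n
b = (np.arange(2*n) - n) * db
BX, BY = np.meshgrid(b, b)
otf_atm = np.exp(-3.44 * (np.hypot(BX, BY) / r0)**(5/3))

error = (2*np.pi/lam) * 50e-9 * (2*(R/(D/2))**2 - 1) * pupil    # ~29 nm RMS of focus [rad]

otf_ref = telescope_otf(np.zeros_like(pupil)) * otf_atm
otf_err = telescope_otf(error) * otf_atm
pssn = np.sum(np.abs(otf_err)**2) / np.sum(np.abs(otf_ref)**2)
print(f"PSSN ~ {pssn:.4f}")
```

Because each error multiplies the OTF by its own factor, ratios of this kind combine approximately multiplicatively, which is the feature the abstract exploits for error budgeting.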
The out-of-plane degrees of freedom (piston, tip, and tilt) of each of the 492 segments in the Thirty Meter Telescope primary mirror will be actively controlled using three actuators per segment and two edge sensors along each intersegment gap. We address two important topics for this system: edge sensor design, and the correction of fabrication and installation errors. The primary mirror segments are passively constrained in the three lateral degrees of freedom. We evaluate the segment lateral motions due to the changing gravity vector and temperature, using site temperature and wind data, thermal modeling, and finite-element analysis. Sensor fabrication and installation errors combined with these lateral motions will induce errors in the sensor readings. We evaluate these errors for a capacitive sensor design as a function of dihedral angle sensitivity. We also describe operational scenarios for using the Alignment and Phasing System to correct the sensor readings for errors associated with fabrication and installation.
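A toy model of the sensing-and-correction logic discussed above: a one-dimensional, piston-only chain of segments whose edge sensors read relative heights, controlled by a minimum-norm least-squares solve. It is only meant to show how fixed sensor offsets from fabrication and installation propagate into the corrected mirror, which is why on-sky APS calibration of the desired sensor readings is needed; the real system controls piston, tip, and tilt with two sensors per inter-segment gap.

```python
import numpy as np

# Hedged toy model: sensor reading_i = p[i+1] - p[i] plus a fixed installation
# offset. Global piston is the null space of the sensor matrix and is unsensed.

n_seg = 10
A = np.zeros((n_seg - 1, n_seg))               # sensor matrix
for i in range(n_seg - 1):
    A[i, i], A[i, i + 1] = -1.0, 1.0

rng = np.random.default_rng(3)
pistons = rng.standard_normal(n_seg) * 100.0   # initial segment pistons [nm]
offsets = rng.standard_normal(n_seg - 1) * 5.0 # sensor installation offsets [nm]

readings = A @ pistons + offsets
correction = -np.linalg.pinv(A) @ readings     # minimum-norm piston update
after = pistons + correction

print(f"RMS piston before: {pistons.std():.1f} nm")
print(f"RMS piston after : {(after - after.mean()).std():.1f} nm")
```

After the loop closes, the remaining segment-to-segment errors equal the sensor offsets themselves, illustrating why those offsets must be calibrated out rather than simply driven to zero readings.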
We describe the work that has gone into taking the sodium Laser Guide Star (LGS) program on the Palomar AO system from a successful experiment to a facility instrument. In particular, we describe the operation of the system, the BTO (beam transfer optics) system which controls the path of the laser in the dome, the aircraft safety systems, and the optical systems which allow us to take advantage of the unique properties of the macro/micro pulse laser. In addition we present on-sky performance results that demonstrate K-band Strehl ratios of up to 48%.
Deployed as a multi-user shared facility on the 5.1 meter Hale Telescope at Palomar Observatory, the PALM-3000 high-order upgrade to the successful Palomar Adaptive Optics System will deliver extreme AO correction in the near-infrared, and diffraction-limited images down to visible wavelengths, using both natural and sodium laser guide stars. Wavefront control will be provided by two deformable mirrors, a 3368 active actuator tweeter and a 349 active actuator woofer, controlled at up to 3 kHz using an innovative wavefront processor based on a cluster of 17 graphics processing units. A Shack-Hartmann wavefront sensor with selectable pupil sampling will provide high-order wavefront sensing, while an infrared tip/tilt sensor and visible truth wavefront sensor will provide low-order LGS control. Four back-end instruments are planned at first light: the PHARO near-infrared camera/spectrograph, the SWIFT visible light integral field spectrograph, Project 1640, a near-infrared coronagraphic integral field spectrograph, and 888Cam, a high-resolution visible light imager.
We consider high-resolution optical modeling of the Thirty Meter Telescope for the purpose of error budget and instrumentation trades utilizing the Modeling and Analysis for Controlled Optical Systems tool. Using this ray-trace and diffraction model we have simulated the TMT optical errors related to multiple effects including segment alignment and phasing, segment surface figures, temperature, and gravity. We have then modeled the effects of each TMT optical error in terms of the Point Source Sensitivity (a multiplicative image plane metric) for a seeing limited case and an adaptive optics corrected case (for the NFIRAOS). This modeling provides the information necessary to rapidly conduct design trades with respect to the planned telescope instrumentation and to optimize the telescope error budget.
PALM-3000 is proposed to be the first visible-light sodium laser guide star astronomical adaptive optics system. Deployed as a multi-user shared facility on the 5.1 meter Hale Telescope at Palomar Mountain, this state-of-the-art upgrade to the successful Palomar Adaptive Optics System will have the unique capability to open the visible light spectrum to diffraction-limited scientific access from the ground, providing angular imaging resolution as fine as 16 milliarcsec with modest sky coverage fraction.
High-contrast imaging, particularly direct detection of extrasolar planets, is a major science driver for the next generation of extremely large telescopes such as the segmented Thirty Meter Telescope. This goal requires more than merely diffraction-limited imaging, but also attention to residual scattered light from wavefront errors and diffraction effects at the contrast level of 10⁻⁸-10⁻⁹. Using a wave-optics simulation of adaptive optics and a diffraction suppression system, we investigate diffraction from the segmentation geometry, inter-segment gaps, and obscuration by the secondary mirror and its supports. We find that the large obscurations pose a greater challenge than the much smaller segment gaps. In addition, the impact of wavefront errors from the primary mirror, including segment alignment and figure errors, is analyzed. Segment-to-segment reflectivity variations and residual segment figure error will be the dominant error contributors from the primary mirror. Strategies to mitigate these errors are discussed.
The next generation of adaptive optics (AO) systems, often referred to as extreme adaptive optics (ExAO), will use higher numbers of actuators to achieve wavefront correction levels below 100 nm, and so enable a host of new observations such as high-contrast coronagraphy. However, the number of potential coronagraph types is increasing rapidly, and selection of the most advantageous coronagraph is subject to many factors. Here it is pointed out that experiments in the ExAO regime can already be carried out with existing hardware, by using a well-corrected subaperture on an existing telescope. For example, by magnifying a 1.5 m diameter off-axis subaperture onto the AO system's deformable mirror (DM) on the Palomar Hale telescope, we have recently achieved stellar Strehl ratios as high as 92% to 94%, corresponding to wavefront errors of 85-100 nm. Using this approach, a wide variety of ExAO experiments can thus be carried out well before "next generation" ExAO systems are deployed on large telescopes. The potential experiments include infrared ExAO imaging and performance optimization, a comparison of coronagraphic approaches in the ExAO regime, visible wavelength AO, and predictive AO.
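The quoted Strehl ratios and wavefront errors are mutually consistent under the extended Maréchal approximation, assuming K-band (~2.2 μm) imaging; the check below is illustrative.

```python
import numpy as np

# Hedged check: Strehl ~ exp(-(2*pi*sigma/lambda)**2), with sigma the RMS
# wavefront error. Wavelength of 2.2 um (K band) is an assumption here.

lam = 2.2e-6
for sigma in (85e-9, 100e-9):
    strehl = np.exp(-(2*np.pi*sigma/lam)**2)
    print(f"sigma = {sigma*1e9:.0f} nm  ->  Strehl ~ {strehl:.2f}")   # ~0.94 and ~0.92
```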
In this paper, we provide an overview of the adaptive optics (AO) program for the Thirty Meter Telescope (TMT) project, including an update on requirements; the philosophical approach to developing an overall AO system architecture; the recently completed conceptual designs for facility and instrument AO systems; anticipated first light capabilities and upgrade options; and the hardware, software, and controls interfaces with the remainder of the observatory. Supporting work in AO component development, lab and field tests, and simulation and analysis is also discussed. Further detail on all of these subjects may be found in additional papers in this conference.
Direct detection of exoplanets from the ground may be feasible with the advent of extreme adaptive optics (ExAO) on large telescopes. A major hurdle to achieving high contrasts behind a star suppression system (10⁻⁸ hr⁻¹/²) at small angular separations is the "speckle noise" due to residual atmospheric and telescope-based quasi-static amplitude and phase errors at mid-spatial frequencies. We examine the potential of a post-coronagraphic, interferometric wavefront sensor to sense and adaptively correct just such errors. Pupil and focal plane sensors are considered and the merits and drawbacks of each scheme are outlined. It is not inconceivable to implement both schemes, or even a hybrid scheme, within a single instrument to significantly improve its scientific capabilities. This work was carried out in the context of the proposed Planet Formation Imager instrument for the Thirty Meter Telescope (TMT) project.
Direct detection of extrasolar Jovian planets is a major scientific motivation for the construction of future extremely large telescopes such as the Thirty Meter Telescope (TMT). Such detection will require dedicated high-contrast AO systems. Since the properties of Jovian planets and their parent stars vary enormously between different populations, the instrument must be designed to meet specific scientific needs rather than a simple metric such as maximum Strehl ratio. We present a design for such an instrument, the Planet Formation Imager (PFI) for TMT. It has four key science missions. The first is the study of newly formed planets on 5-10 AU scales in regions such as Taurus and Ophiuchus; this requires very small inner working distances that are only possible with a 30 m or larger telescope. The second is a robust census of extrasolar giant planets orbiting mature nearby stars. The third is detailed spectral characterization of the brightest extrasolar planets. The final targets are circumstellar dust disks, including zodiacal light analogs in the inner parts of other solar systems. To achieve these, PFI combines advanced wavefront sensors, high-order MEMS deformable mirrors, a coronagraph optimized for a finely segmented primary mirror, and an integral field spectrograph.
We have built and field-tested a multiple guide star tomograph with four Shack-Hartmann wavefront sensors. We predict the wavefront on the fourth sensor channel using wavefront information from the other three channels, based on synchronously recorded data. This system supports the design of wavefront sensors for future extremely large telescopes that will use multi-conjugate adaptive optics and multi-object adaptive optics. Different wavefront prediction algorithms are being tested with the data obtained. We describe the system, its current capabilities, and some preliminary results.
The Thirty Meter Telescope (TMT) is a collaborative project between the California Institute of Technology (CIT), the University of California (UC), the Association of Universities for Research in Astronomy (AURA), and the Association of Canadian Universities for Research in Astronomy (ACURA). The Alignment and Phasing System (APS) for the Thirty Meter Telescope will be a Shack-Hartmann type camera that will provide a variety of measurements for telescope alignment, including segment tip/tilt and piston, segment figure, secondary and tertiary figure, and overall primary/secondary/tertiary alignment. The APS will be modeled after the Phasing Camera System (PCS), which performed most, but not all, of these tasks for the Keck Telescopes. We describe the functions of the APS, including a novel supplemental approach to measuring and adjusting the segment figures, which treats the segment aberrations as global variables.
Direct detection of planets around nearby stars requires the development of high-contrast imaging techniques, because of their very different respective fluxes. We thus investigated the innovative coronagraphic approach based on the use of a four-quadrant phase mask (FQPM). Simulations showed that, combined with high-level wavefront correction on an unobscured off-axis section of a large telescope, this method allows high-contrast imaging very close to stars, with detection capability superior to that of a traditional coronagraph. A FQPM instrument was thus built to test the feasibility of near-neighbor observations with our new off-axis approach on a ground-based telescope. In June 2005, we deployed our instrument to the Palomar 200-inch telescope, using existing facilities as much as possible for rapid implementation. In these initial observations, using data processing techniques specific to FQPM coronagraphs, we reached extinction levels of the order of 200:1. Here we discuss our simulations and on-sky results obtained so far.
The James Webb Space Telescope (JWST) Coarse Phase Sensor utilizes Dispersed Hartmann Sensing (DHS) to measure the inter-segment piston errors of the primary mirror. The DHS technique was tested on the Keck Telescope. Two DHS optical components were built to mate with the Keck optical and mechanical interfaces. DHS images were acquired using 20 different primary mirror configurations. The mirror configurations consisted of random segment pistons applied to 18 of the 36 segments. The inter-segment piston errors ranged from phased (approximately 0 μm) to as large as ±25 μm. Two broadband exposures were taken for each primary mirror configuration: one for the DHS component situated at 0°, and one for the DHS component situated at 60°. Finally, a "closed-loop" DHS sensing and control experiment was performed. Sensing algorithms developed by both Adaptive Optics Associates (AOA) and the Jet Propulsion Laboratory (JPL) were applied to the collected DHS images. The inter-segment piston errors determined by the AOA and JPL algorithms were compared to the actual piston steps. The data clearly demonstrates that the DHS works quite well as an estimator of segment-to-segment piston errors using stellar sources.
Direct detection of planets around nearby stars requires the development of high-contrast imaging techniques, because of their very different respective fluxes. This led us to investigate the new coronagraphic approach based on the use of a four-quadrant phase mask (FQPM). Combined with high-level wavefront correction on an unobscured off-axis section of a large telescope, this method allows high-contrast imaging very close to stars. Calculations indicate that for a given ground-based on-axis telescope, use of such an off-axis coronagraph provides a near-neighbor detection capability superior to that of a traditional coronagraph utilizing the full telescope aperture. A near-infrared laboratory experiment was first used to test our FQPM devices, and a rejection of 2000:1 was achieved. We next built an FQPM instrument to test the feasibility of near-neighbor observations with our new off-axis approach on a ground-based telescope. In June 2005, we deployed our instrument to the Palomar 200-inch telescope, using existing facilities as much as possible for rapid implementation. In these initial observations, stars were rejected to about the 100:1 level. Here we discuss our laboratory and on-sky experiments, and the results obtained so far.
We describe the current performance of an adaptive optics testbed for free-space optical communication. This adaptive optics system allows for simulation of night- and day-time observing on a 1 meter telescope with a 97-actuator deformable mirror. In lab-generated seeing of 2.1 arcseconds (at 0.5 μm) the system achieves a Strehl ratio of 21% at 1.064 μm (210 nm RMS wavefront). Predictions of the system's performance based on real-time wavefront sensor telemetry data and analytical equations are shown to agree with the observed image performance. We present experimentally measured gains in communications performance of 2-4 dB in the received signal power when AO correction is applied in the presence of high background and turbulence, at an uncoded bit error rate of 0.1. The data source was a 100 Mbps on-off keyed signal detected with an IR-enhanced avalanche photodiode detector as the receiver.
Work is underway at the University of Chicago and Caltech Optical Observatories to implement a sodium laser guide star adaptive optics system for the 200 inch Hale telescope at Palomar Observatory. The Chicago sum frequency laser (CSFL) consists of two pulsed, diode-pumped, mode-locked Nd:YAG lasers operating at 1.064 micron and 1.32 micron wavelengths. Light from the two laser beams is mixed in a non-linear crystal to produce radiation centered at 589 nm with a spectral width of 1.0 GHz (FWHM) to match that of the sodium D2 line. Currently the 1.064 micron and 1.32 micron lasers produce 14 watts and 8 watts of TEM-00 power, respectively. The laser runs at a 500 Hz repetition rate with a 10% duty cycle. This pulse format is similar to that of the MIT Lincoln Laboratory laser and allows range gating of unwanted Rayleigh scatter down to a zenith angle of 60 degrees. The laser system will be kept in the Coude lab and will be projected up to a laser launch telescope (LLT) bore-sighted to the Hale telescope. The beam-transfer optics, which convey the laser beam from the Coude lab to the LLT, consist of motorized mirrors that are controlled in real time using quad-cell positioning systems. A central computer monitors the laser beam propagation up to the LLT, the interlocks and safety system status, and the laser status, and actively controls the motorized mirrors. We plan to install a wide-field visible camera (for high-flying aircraft) and a narrow field of view (FoV) IR camera (for low-flying aircraft) as part of our aircraft avoidance system.
As adaptive optics (AO) matures, it becomes possible to envision AO systems oriented towards specific important scientific goals rather than general-purpose systems. One such goal for the next decade is the direct imaging detection of extrasolar planets. An "extreme" adaptive optics (ExAO) system optimized for extrasolar planet detection will have very high actuator counts and rapid update rates - designed for observations of bright stars - and will require exquisite internal calibration at the nanometer level. In addition to extrasolar planet detection, such a system will be capable of characterizing dust disks around young or mature stars, outflows from evolved stars, and high Strehl ratio imaging even at visible wavelengths. The NSF Center for Adaptive Optics has carried out a detailed conceptual design study for such an instrument, dubbed the eXtreme Adaptive Optics Planet Imager or XAOPI. XAOPI is a 4096-actuator AO system, notionally for the Keck telescope, capable of achieving contrast ratios >10⁷ at angular separations of 0.2-1". ExAO system performance analysis is quite different from that of conventional AO systems - the spatial and temporal frequency content of wavefront error sources is as critical as their magnitude. We present here an overview of the XAOPI project, and an error budget highlighting the key areas determining achievable contrast. The most challenging requirement is for residual static errors to be less than 2 nm over the controlled range of spatial frequencies. If this can be achieved, direct imaging of extrasolar planets will be feasible within this decade.
KEYWORDS: Point spread functions, Diffraction, Adaptive optics, Photometry, Telescopes, Monte Carlo methods, Wavefronts, Computer simulations, Optical transfer functions, Modulation transfer functions
Strehl ratio is the most commonly used metric for adaptive optics (AO) performance. It is also the most misused metric. Every Strehl ratio measurement algorithm has subtle differences that result in different measured values. This creates problems when comparing different measurements of the same AO system and even more problems when trying to compare results from different systems. To determine how much the various algorithm differences actually impacted the measured values, we created a series of simulated point spread functions (PSFs). The simulated PSFs were then sent around to the various members of the project, who then measured the Strehl ratio. The measurements were done blindly, with no knowledge of the true Strehl ratio. We then compared the various measurements to the truth values. Each measurement cycle turned up effects that were further investigated in the next cycle. We present the results of our comparisons, showing the scatter in measured Strehl ratios and our best recommendations for computing an accurate Strehl ratio.
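One common measurement recipe, sketched below for a simulated PSF, normalizes the peak by the total flux and compares it to a diffraction-limited reference computed for the same pupil and pixel sampling. Real algorithms differ in background handling, peak interpolation, and sampling corrections, which is precisely the source of the scatter studied above; the aperture and aberration here are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a flux-normalized peak-ratio Strehl measurement, with an
# extended-Marechal estimate for comparison.

n = 256
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2) <= 1.0

def psf_from_phase(phase_rad):
    field = pupil * np.exp(1j * phase_rad)
    return np.abs(np.fft.fftshift(np.fft.fft2(field, s=(4*n, 4*n))))**2

perfect = psf_from_phase(np.zeros_like(X))
aberrated = psf_from_phase(0.6 * (2*X*Y) * pupil)          # ~0.25 rad RMS astigmatism

def strehl(psf, reference):
    return (psf.max() / psf.sum()) / (reference.max() / reference.sum())

print(f"measured Strehl ~ {strehl(aberrated, perfect):.2f}")
sigma = (0.6 * 2*X*Y)[pupil].std()                          # RMS phase over the pupil [rad]
print(f"Marechal estimate ~ {np.exp(-sigma**2):.2f}")
```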
The nascent field of planet detection has yielded a host of extra-solar planet detections. To date, these detections have been the result of indirect techniques: the planet is inferred by precisely measuring its effect on the host star. Direct observation of extra-solar planets remains a challenging yet compelling goal. In this vein, the Center for Adaptive Optics has proposed a ground-based, high-actuator-density extreme AO system (XAOPI) for a large (~10 m) telescope, whose ultimate goal is to directly detect a specific class of these objects: young and massive planets. Detailed system wavefront error budgets suggest that this system is a feasible, if ambitious, proposition. One key element in this error budget is the calibration and maintenance of the science camera wavefront with respect to the wavefront sensor, which currently has an allowable contribution of ~5 nanometers rms. This talk first summarizes the current status of calibration on existing ground-based AO systems, the magnitude of this effect in the system error budget, and current techniques for mitigation. Subsequently, we explore the nature of this calibration error term, its source in the non-common path between the science camera and wavefront sensor, and the effect of the temporal evolution of this non-commonality. Finally, we describe preliminary plans for sensing and controlling this error term. The sensing techniques include phase retrieval, phase contrast, and external metrology. To conclude, a calibration scenario that meets the stringent requirement for XAOPI will be discussed.
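Of the sensing techniques listed above, phase retrieval is the most algorithmic. As a rough illustration only (not the authors' implementation), a classic Gerchberg-Saxton iteration recovers a pupil-plane phase from a focal-plane intensity image and the known pupil shape:

```python
import numpy as np

def gerchberg_saxton(pupil_amp, focal_intensity, n_iter=200):
    """Recover a pupil-plane phase map from a focal-plane intensity image.

    pupil_amp       : 2-D array, known pupil amplitude (aperture transmission)
    focal_intensity : 2-D array of the same shape, measured focal-plane intensity
    Returns the estimated pupil-plane phase in radians.
    """
    focal_amp = np.sqrt(np.maximum(focal_intensity, 0.0))
    phase = np.zeros(pupil_amp.shape, dtype=float)

    for _ in range(n_iter):
        pupil_field = pupil_amp * np.exp(1j * phase)
        focal_field = np.fft.fftshift(np.fft.fft2(pupil_field))
        # Impose the measured focal-plane amplitude, keep the current phase estimate.
        focal_field = focal_amp * np.exp(1j * np.angle(focal_field))
        pupil_field = np.fft.ifft2(np.fft.ifftshift(focal_field))
        # Impose the known pupil amplitude, keep the updated phase.
        phase = np.angle(pupil_field)

    return phase
```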
High dynamic range coronagraphy targeted at discovering planets around nearby stars is often associated with monolithic, unobstructed aperture space telescopes. With the advent of extreme adaptive optics (ExAO) systems with thousands of sensing and correcting channels, the benefits of placing a near-infrared coronagraph on a large segmented mirror telescope become scientifically interesting. This is because increased aperture size produces a tremendous gain in achievable contrast at the same angular distance from a point source at Strehl ratios in excess of 90% (and at lower Strehl ratios on future giant telescopes such as the Thirty Meter Telescope). We outline some of the design issues facing such a coronagraph, and model a band-limited coronagraph on an aperture with a Keck-like pupil. We examine the purely diffractive challenges facing the eXtreme AO Planetary Imager (XAOPI) given the Keck pupil geometry, notably its inter-segment gap spacing of 6 mm.
Classical Lyot coronagraphs, with hard-edged occulting stops, are not efficient enough at suppressing diffracted light, given XAOPI's scientific goal of imaging a young Jupiter at a separation as close as 0.15 arcseconds (4λ/D at H band on Keck) from its parent star. With a 4000-channel ExAO system using an anti-aliased, spatially-filtered wavefront sensor planned for XAOPI, we wish to keep diffracted light due to the coronagraphic design at least as low as the noise floor set by AO system limitations. We study the band-limited Lyot coronagraph (BLC) as a baseline design instead of the classical design because of its efficient light suppression, as well as its analytical simplicity. We also develop ways of investigating coronagraphic mask fabrication tolerances by utilizing the BLC design's mathematical tractability.
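For concreteness, one widely used one-dimensional band-limited mask family is m(x) = 1 - sinc^2(εx), whose Fourier transform has strictly finite support, which is what makes the Lyot-plane analysis tractable. The sketch below uses this generic form with an illustrative bandwidth parameter; it is not necessarily the specific mask adopted for XAOPI.

```python
import numpy as np

def band_limited_mask(x_lambda_over_d, eps=0.2):
    """1-D band-limited occulter amplitude transmission, m(x) = 1 - sinc^2(eps * x).

    x_lambda_over_d : focal-plane coordinate in units of lambda/D
    eps             : bandwidth parameter; smaller eps widens the occulted region
    """
    # np.sinc(u) = sin(pi*u)/(pi*u), so the spectrum of sinc^2(eps*x) is a triangle
    # of half-width eps: the mask is band-limited by construction.
    return 1.0 - np.sinc(eps * x_lambda_over_d) ** 2

x = np.linspace(-20.0, 20.0, 1001)
t = band_limited_mask(x) ** 2      # intensity transmission
print(t[500], t[-1])               # ~0 on the star, ~1 far from it
```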
Dispersed Fringe Sensing (DFS) is an efficient and robust method for coarse phasing of segmented primary mirrors (from a quarter of a wavelength up to the depth of focus of a single segment, typically several tens of microns). Unlike phasing techniques currently used for ground-based segmented telescopes, DFS does not require the use of edge sensors to sense changes in the relative heights of adjacent segments; this makes it particularly well-suited to the phasing of space-borne segmented telescopes, such as the James Webb Space Telescope (JWST). In this work we validate DFS by using it to measure the pistons of the segments of one of the Keck telescopes; the results agree with those of the Shack-Hartmann based phasing scheme currently in use at Keck to within 2% over a range of initial piston errors of ±16 μm.
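The essence of DFS can be captured with a simple two-beam model: a piston step δ between adjacent segments modulates the dispersed intensity as I(λ) ∝ 1 + γ cos(4πδ/λ), so the fringe frequency in wavenumber encodes the piston. The sketch below fits that fringe frequency by brute-force correlation over trial pistons; all names and numbers are illustrative, and the operational JWST/Keck DFS algorithms are considerably more elaborate.

```python
import numpy as np

def dfs_piston_estimate(wavelengths, intensity):
    """Estimate a segment piston step from a dispersed fringe (simplified two-beam model).

    wavelengths : 1-D array of wavelengths (m) along the dispersion direction
    intensity   : 1-D array of fringe intensity at those wavelengths
    Returns the trial piston (m) whose fringe best matches the data, up to sign.
    """
    signal = intensity - intensity.mean()
    trial_delta = np.linspace(0.25e-6, 50e-6, 5000)          # quarter-wave to tens of microns
    # Fringe phase is 4*pi*delta/lambda; correlate against cos and sin so an
    # unknown phase offset of the fringe does not matter.
    phase = 4.0 * np.pi * np.outer(trial_delta, 1.0 / wavelengths)
    power = (np.cos(phase) @ signal) ** 2 + (np.sin(phase) @ signal) ** 2
    return trial_delta[np.argmax(power)]

# Synthetic check: a 12-micron piston step observed over a 1.0-1.3 micron band.
wl = np.linspace(1.0e-6, 1.3e-6, 400)
fringe = 1.0 + 0.8 * np.cos(4.0 * np.pi * 12e-6 / wl)
print(dfs_piston_estimate(wl, fringe))    # ~1.2e-05 m
```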
Ground based adaptive optics is a potentially powerful technique for direct imaging detection of extrasolar planets. Turbulence in the Earth's atmosphere imposes some fundamental limits, but the large size of ground-based telescopes compared to spacecraft can work to mitigate this. We are carrying out a design study for a dedicated ultra-high-contrast system, the eXtreme Adaptive Optics Planet Imager (XAOPI), which could be deployed on an 8-10 m telescope in 2007. With a 4096-actuator MEMS deformable mirror it should achieve Strehl >0.9 in the near-IR. Using an innovative spatially filtered wavefront sensor, the system will be optimized to control scattered light over a large radius and suppress artifacts caused by static errors. We predict that it will achieve contrast levels of 10^7-10^8 at angular separations of 0.2-0.8" around a large sample of stars (R<7-10), sufficient to detect Jupiter-like planets through their near-IR emission over a wide range of ages and masses. We are constructing a high-contrast AO testbed to verify key concepts of our system, and present preliminary results here, showing an RMS wavefront error of <1.3 nm with a flat mirror.
The California Extremely Large Telescope (CELT) project has recently completed a 12-month conceptual design phase that has investigated major technology challenges in a number of Observatory subsystems, including adaptive optics (AO). The goal of this effort was not to adopt one or more specific AO architectures. Rather, it was to investigate the feasibility of adaptive optics correction of a 30-meter diameter telescope and to suggest realistic cost ceilings for various adaptive optics capabilities. We present here the key design issues uncovered during conceptual design and present two non-exclusive "baseline" adaptive optics concepts that are expected to be further developed during the following preliminary design phase. Further analysis, detailed engineering trade studies, and certain laboratory and telescope experiments must be performed, and key component technology prototypes demonstrated, prior to adopting one or more adaptive optics systems architectures for realization.
Adaptive optics systems with Shack-Hartmann wavefront sensors require reconstruction of the atmospheric phase error from subaperture slope measurements, with every sensor in the array being used in the computation of each actuator command. This fully populated reconstruction matrix can result in a significant computational burden for adaptive optics systems with large numbers of actuators. A method for generating sparse wavefront reconstruction matrices for adaptive optics is proposed. The method exploits the relevance of nearby subaperture slope measurements for control of an individual actuator, and relies upon the limited extent of the influence function for a zonal deformable mirror. Relying only on nearby sensor information can significantly reduce the calculation time for wavefront reconstruction. In addition, a hierarchic controller is proposed to recover some of the global wavefront information. The performance of these sparse wavefront reconstruction matrices was evaluated in simulation, and tested on the Palomar Adaptive Optics System. This paper presents some initial results from the simulations and experiments.
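A minimal illustration of the localization idea (not the actual PALAO implementation): start from a dense least-squares reconstructor and zero every entry that couples an actuator to a subaperture beyond a chosen neighborhood radius, leaving a sparse matrix whose multiply cost scales with the number of retained entries rather than with the full actuator-times-slope product. All names and the slope ordering are illustrative assumptions.

```python
import numpy as np
from scipy import sparse

def sparsify_reconstructor(R, act_xy, subap_xy, radius):
    """Zero reconstructor entries coupling actuators to distant subapertures.

    R        : dense reconstructor, shape (n_act, n_slopes), slopes ordered [x..., y...]
    act_xy   : (n_act, 2) actuator positions
    subap_xy : (n_subap, 2) subaperture positions, with n_slopes = 2 * n_subap
    radius   : keep only subapertures within this distance of each actuator
    """
    # Pairwise actuator-to-subaperture distances.
    d = np.linalg.norm(act_xy[:, None, :] - subap_xy[None, :, :], axis=2)
    keep = d <= radius                                # (n_act, n_subap)
    mask = np.concatenate([keep, keep], axis=1)       # same mask for x and y slopes
    return sparse.csr_matrix(np.where(mask, R, 0.0))

# Usage: commands = R_sparse @ slopes, now a sparse matrix-vector product.
```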
Adaptive optics (AO) systems currently under investigation will require at least a two orders of magnitude increase in the number of actuators, which in turn translates to effectively a 10^4 increase in compute latency. Since the performance of an AO system invariably improves as the compute latency decreases, it is important to study how today's computer systems will scale to address this expected increase in actuator count. This paper answers this question by characterizing the performance of a single deformable mirror (DM) Shack-Hartmann natural guide star AO system implemented on the present-generation digital signal processor (DSP) TMS320C6701 from Texas Instruments. We derive the compute latency of such a system in terms of a few basic parameters, such as the number of DM actuators, the number of data channels used to read out the camera pixels, the number of DSPs, the available memory bandwidth, as well as the inter-processor communication (IPC) bandwidth and the pixel transfer rate. We show how the results would scale for future systems that utilize multiple DMs and guide stars. We demonstrate that the principal performance bottleneck of such a system is the available memory bandwidth of the processors and, to a lesser extent, the IPC bandwidth. This paper concludes with suggestions for mitigating this bottleneck.
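The scaling argument can be made concrete with a back-of-the-envelope model in which the reconstruction, a matrix-vector multiply, is memory-bandwidth-bound: each control cycle must stream the full reconstructor matrix from memory, so the multiply time is roughly the matrix size divided by the aggregate memory bandwidth. The function and all numbers below are illustrative assumptions, not figures from the paper.

```python
def mvm_latency_ms(n_act, n_subap, n_proc, mem_bw_gbs, bytes_per_word=4):
    """Rough per-frame reconstruction latency for a bandwidth-bound matrix-vector multiply.

    n_act      : number of DM actuators (reconstructor rows)
    n_subap    : number of subapertures; slopes = 2 * n_subap (reconstructor columns)
    n_proc     : number of processors sharing the work
    mem_bw_gbs : memory bandwidth per processor, in GB/s
    """
    matrix_bytes = n_act * 2 * n_subap * bytes_per_word
    seconds = matrix_bytes / (n_proc * mem_bw_gbs * 1e9)
    return 1e3 * seconds

# Illustrative comparison: a few-hundred-actuator system vs a few-thousand-actuator system.
print(mvm_latency_ms(349, 312, n_proc=8, mem_bw_gbs=0.5))     # ~0.2 ms
print(mvm_latency_ms(4096, 3600, n_proc=8, mem_bw_gbs=0.5))   # ~30 ms: cost grows roughly as N^2
```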
We present resolved images of the occultation of a binary star by Titan, recorded with the Palomar Observatory adaptive optics system on 20 December 2001 UT. These constitute the first resolved observations of a stellar occultation by a small body, and demonstrate several unique capabilities of diffraction-limited imaging systems for the study of planetary atmospheres. Two refracted stellar images are visible on Titan's limb throughout both events, displaying scintillations due to local density variations. Precise relative astrometry of the refracted stellar images with respect to the unocculted component of the binary allows us to directly measure their altitude in Titan's atmosphere. Their changing positions also lead to a simple demonstration of the finite oblateness of surfaces of constant pressure in Titan's mid-latitude stratosphere, consistent with the only previous measurement of Titan's zonal wind field.
Ultra-high contrast imaging with giant segmented mirror telescopes will involve light levels of order 10^-6 times that of the central diffraction spike or less. At these levels it is important to quantify accurately a variety of diffraction effects, including segmentation geometry, intersegment gaps, obscuration by the secondary mirror and its supports, and segment alignment and figure errors, among others. We describe an accurate method for performing such calculations and present preliminary results in the context of the California Extremely Large Telescope.
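The core of such calculations is a Fourier transform of a finely sampled complex pupil. The sketch below is deliberately simplified: it models segmentation gaps as a rectangular grid of opaque lines across a circular aperture rather than a true hexagonal tiling, and omits the secondary obscuration and segment aberrations; every parameter is an illustrative assumption.

```python
import numpy as np

n = 1024          # grid size in samples
d_pupil = 512     # pupil diameter in samples
gap = 2           # gap width in samples
pitch = 64        # "segment" pitch in samples (rectangular stand-in for hex segments)

y, x = np.indices((n, n)) - n // 2
pupil = (np.hypot(x, y) <= d_pupil / 2).astype(float)

# Simplified segmentation: opaque grid lines every `pitch` samples.
pupil[((x % pitch) < gap) | ((y % pitch) < gap)] = 0.0

# Far-field PSF, normalized to its peak.
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
psf /= psf.max()

# Satellite peaks from the gap grid appear n/pitch pixels from the core and set
# part of the diffraction-limited contrast floor for high-contrast imaging.
print(psf[n // 2, n // 2], psf[n // 2, n // 2 + n // pitch])
```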
We propose thin silicon wafers as the building blocks of highly segmented space telescope primary mirrors. Using embedded MEMS actuators under high-bandwidth control, this technology can achieve diffraction-limited image quality in the 3-300 micron wavelength range. The use of silicon wafers as cryogenic mirror segments is carried forward by considering a point design for a future FAIR-class NASA ORIGINS mission.
We recognize four major economic factors that justify a massive paradigm shift in the fabrication of ultralightweight space telescopes:
The precise process control and repeatability of silicon wafer manufacturing dramatically reduces the huge labor investment in mirror figuring experienced with Hubble Space Telescope.
Once developed, the incremental cost of additional space telescopes based upon proven silicon manufacturing techniques can be very small. We estimate the marginal cost of a 30m mirror when deploying a constellation can be as low as $36 million (Year 2002 dollars).
Federal R&D funding in the area of microelectromechanical devices and advanced 3-D silicon processing is certain to have far greater economic return than similar investments in other technologies, such as optical membrane technology.
The $300B per year silicon processing industry will continue to drive increased MEMS functionality, higher product yields, and lower cost. These advances will continue for decades.
The intention here is to present the case for the economic advantage of silicon as a highly functional optical substrate that can be fabricated using unparalleled industry experience with precision process control. We maintain that many architectures superior to the ASSiST concept presented here are possible, and hope that this effort prompts future thinking about the silicon wafer telescope paradigm.
We have developed and tested extensively three different methods for phasing the primary mirror segments of the Keck telescopes. Two of these, referred to as the broadband and narrowband algorithms respectively, are physical optics generalizations of the Shack-Hartmann technique. The third, Phase Discontinuity Sensing, is a physical optics generalization of curvature sensing. We evaluate and compare experimental results with these techniques with regard to capture range (as large as 30 micrometers ), run-to-run variation (as small as 6 nm), execution time (as short as twenty minutes), systematic errors, ease of implementation, and other factors, in the context of the Keck telescopes and also of future very large ground-based telescopes.
We discuss conceptual design issues for a 1600-actuator tweeter mirror/multiconjugate AO upgrade to the 349-actuator Palomar Adaptive Optics System (PALAO). Based upon 42 × 42 actuator Photonex deformable mirror technology developed by Xinetics, Inc., this upgrade would enable unique science at visible wavelengths and deliver unprecedented near-infrared Strehl ratios for modestly bright (mV = 9) guide stars. When used in conjunction with the existing 349-actuator Xinetics, Inc. deformable mirror, a series of pressing issues regarding the practical utility of multiconjugate adaptive correction for extremely large telescopes could be addressed. By utilizing a low-noise (EEV39) wavefront sensor camera developed by SciMeasure Analytical Systems, Inc., this system would provide an on-axis K-band Strehl ratio of >95%, improving scientific throughput and enabling the detection and spectroscopy of unresolved companions in an unprecedented contrast space around nearby stars.
We describe the current performance of the Palomar 200-inch (5 m) adaptive optics system, which in December of 1998 achieved its first high-order (241-actuator) lock on a natural guide star. In the K band (2.2 micrometers), the system has achieved Strehl ratios as high as 50% in the presence of 1.0 arcsecond seeing (measured at 0.5 micrometers). Predictions of the system's performance based on the analysis of real-time wavefront sensor telemetry data, and an analysis based on a fitted Kolmogorov atmospheric model, are both shown to agree with the observed science image performance. Performance predictions for various seeing conditions are presented, and an analysis of the error budget is used to show which subsystems limit the performance of the AO system under various atmospheric conditions.
SciMeasure, in collaboration with Emory University and the Jet Propulsion Laboratory (JPL), has developed an extremely versatile CCD controller for use in adaptive optics, optical interferometry, and other applications requiring high-speed readout rates and/or low read noise. The overall architecture of this controller system is discussed and its performance using both EEV CCD39 and MIT/LL CCID-19 detectors is presented. Initially developed for adaptive optics applications, this controller is used in the Palomar Adaptive Optics program (PALAO), the AO system developed by JPL for the 200-inch Hale telescope at Palomar Mountain. An overview of the PALAO system is discussed and diffraction-limited science results are shown. Recently modified under NASA SBIR Phase II funding for use in the Space Interferometry Mission testbeds, this controller is currently in use on the Micro-Arcsecond Metrology testbed at JPL. Details of a new vacuum-compatible remote CCD enclosure and specialized readout sequence programming are also presented.
The essential benefit of adaptive optics is delivering a telescope point-spread function (PSF) limited by diffraction rather than by atmospheric turbulence. In practice, achieving diffraction-limited PSF diameters is relatively routine for modern high-order systems, at least within a restricted isoplanatic patch containing the guide star. The lower-intensity wings of the PSF, though, are often highly complex in their structure and subject to variability over short time scales. Spurious bright knots can occur among the secondary Airy maxima, and the 'waffle-mode' artifact may be problematic with a broad class of adaptive optics approaches. Good temporal stability of the adaptive-optics PSF is generally highly desirable if the full advantage of that spectacular PSF is to be realized; it is absolutely critical for many specific high resolution programs that can only be attempted with adaptive optics. In the course of commissioning the high-order adaptive optics system built at JPL for the Palomar 200-inch telescope, we have investigated PSF stability in the field under a variety of conditions. We discuss here our experimental findings at Palomar, and their implications for some key scientific programs.
We present a detailed observational study of the capabilities of the Palomar Adaptive Optics System and the PHARO near-infrared camera in coronagraphic mode. The camera provides two different focal plane occulting masks consisting of completely opaque circular disks of diameter 0.433 arcsec and 0.965 arcsec, both within the cryogenic dewar. In addition, three different pupil plane apodizing masks (a.k.a. Lyot masks) are provided which downsize the beam. The six different combinations of Lyot mask and focal plane mask provide different levels of suppression of the point spread function of a bright star centered on the focal plane mask. We obtained images of the bright nearby star Gliese 614 with all six configurations in the K-band filter. Herein, we provide an analysis of the dynamic range achievable with these configurations. The dynamic range (the ratio of the primary star intensity to the intensity of the faintest point source detectable in the images) is a complicated function not only of the angular separation of the primary star and companion, but also of the azimuthal angle, because of the complex point spread function of the primary star, which is also wavelength dependent. However, beyond 2.5 arcseconds from the star, regardless of the wavelength of the observation, the detection limit of a companion is simply the limiting magnitude of the image, as determined by the sensitivity of the PHARO camera. Within that radius, the dynamic range is at least 8 magnitudes at the 5σ level and as high as 12 in a one-second exposure. This represents a substantial gain over similar techniques without adaptive optics, which are generally limited to radii beyond two arcsec. We provide a quantitative discussion and recommendation for the optimal configuration, along with a detailed comparison with recent theoretical predictions of AO coronagraphic performance.
KEYWORDS: Stars, Adaptive optics, K band, Signal to noise ratio, Point spread functions, Monte Carlo methods, Photometry, Imaging systems, Wavefronts, Cameras
Images corrected with adaptive optics benefit from an increase in the amount of flux contained within the diffraction-limited core. The degree of this correction is measured by the Strehl ratio, equal to the ratio of the maximum observed intensity to the maximum theoretical intensity. Natural guide star adaptive optics systems are limited by the need for a guide star of adequate magnitude within suitable proximity to the science target. Thus, the above-described benefit can only be obtained for objects over a fraction of the total sky. Two nights of imaging the central region of the open star cluster NGC 6871 with the Palomar Adaptive Optics System have supplied measurements of the Strehl ratio for numerous stars within the field. These measurements were used to calculate K-band isoplanatic angles of 39 arcseconds (UT 1999 May 31) and 50 arcseconds (UT 1999 August 1). These isoplanatic angles are compared to those derived from Kolmogorov atmospheric theory, and their implications for adaptive optics systems are discussed.
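The step from per-star Strehl measurements to an isoplanatic angle can be illustrated with the standard anisoplanatic scaling S(θ) = S0 exp[-(θ/θ0)^(5/3)]: fitting this curve to Strehl ratio versus angular offset from the guide star yields θ0. The sketch below uses synthetic data and illustrative numbers; the paper's actual fitting procedure may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def strehl_vs_offset(theta_arcsec, s0, theta0):
    """Anisoplanatic Strehl model: S(theta) = S0 * exp(-(theta/theta0)**(5/3))."""
    return s0 * np.exp(-(theta_arcsec / theta0) ** (5.0 / 3.0))

# Synthetic stand-in for per-star Strehl measurements across the field.
offsets = np.array([2.0, 8.0, 15.0, 25.0, 35.0, 45.0])   # arcsec from the guide star
strehls = strehl_vs_offset(offsets, 0.5, 40.0) * (1.0 + 0.03 * np.random.randn(offsets.size))

(s0_fit, theta0_fit), _ = curve_fit(strehl_vs_offset, offsets, strehls, p0=[0.5, 30.0])
print(f"fitted isoplanatic angle: {theta0_fit:.1f} arcsec")
```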
Herbig-Haro objects are bright optical emission-line sources associated with tightly collimated jets ejected from pre-main-sequence stars. Only a few hundred are known. In optical images, they appear to be dense knots of material at the outer ends of the jets, and often exhibit streaming wake morphologies suggestive of bow shocks. Their optical spectra show characteristics of high-velocity shocks, with line widths typically 100 km/s. HH objects often occur in pairs, consistent with the bipolar morphology of outflows from YSOs; when radio maps of NH3 are made, high-density central regions consistent with collimating disks are seen. HH objects also often appear in a series along a jet, presumably where the jet undergoes a particularly energetic interaction with the ambient medium. Adaptively corrected near-infrared studies of HH objects can reveal much about their workings at fine spatial scales. Narrow-band NIR filters sensitive to transitions of molecular hydrogen and other selected species are excellent tracers of shock excitation, and many HH objects have been observed to show complex structure in these lines down to the arcsecond level. By pushing to higher spatial resolution with adaptive optics, much more detailed information about the nature of the shock fronts may be obtained. In this paper we describe our first observations of HH objects with the AO system on the Palomar 200-inch telescope.
During its first year of shared-risk observations, the PALAO/PHARO adaptive optics system has been employed to obtain near-infrared R ≈ 1000 spectra of solar system targets at spectroscopic slit widths of 0.5 and 0.1 arcsec, with corresponding spatial resolution along the slit as fine as 0.08 arcsec. Phenomena undergoing initial investigation include condensate formation in the atmospheres of Neptune and the Saturnian moon Titan. We present the results of this AO spectroscopy campaign and discuss AO-specific considerations in the reduction and interpretation of these data.
Because of limitations in the alignment process resulting from the telescope active control system and from atmospheric turbulence, the segments of the Keck Telescope primary mirror are never perfectly aligned. We describe a scheme for classifying these misalignments in terms of modes of the primary mirror. We describe these modes in terms of noise propagation, symmetry (specifically their resemblance to Zernike polynomials), and how discontinuous they are. Modal spectra of typical mirror misalignments are presented. The edge discontinuities which result from these misalignments are significantly smaller than what one would expect from random, uncorrelated tip/tilt errors of the same size, a fact which may have important implications for adaptive optics systems on segmented mirror telescopes.
We describe a method for phasing segmented optics which makes use of a novel variation of the established technique of curvature sensing. In traditional curvature sensing, the difference between inside-of-focus and outside-of-focus images provides a direct measure of the curvature of a relatively smooth wavefront. We illustrate how this approach can be extended to enable one to measure and correct the discontinuous wavefronts associated with segmented mirrors. A detailed algorithm, based not on curvature measurement but on correlation of the "difference image" with theoretical images or templates, is presented. In a series of tests of this "Phase Discontinuity Sensing" or PDS algorithm at the Keck 1 Telescope, at a wavelength of 3.3 microns, the RMS piston error (averaged over the 36 primary mirror segments) was repeatedly reduced from about 230 nm to 40 nm or less. Furthermore the PDS phasing solution was shown to be consistent with our previous "phasing camera" results (to within 66 nm RMS), providing strong independent confirmation of this earlier approach.
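The correlation step at the heart of PDS can be sketched as follows: for each segment edge, compare the measured difference image against a library of theoretical difference images computed for a grid of trial pistons, and take the piston that maximizes the normalized cross-correlation. The function names and the template generator below are placeholders; in practice the templates come from a physical-optics model of the defocused segment edge.

```python
import numpy as np

def normalized_correlation(a, b):
    """Zero-mean, unit-norm cross-correlation of two images of the same shape."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def pds_piston_estimate(difference_image, trial_pistons, make_template):
    """Pick the trial piston whose template best matches the measured difference image.

    difference_image : 2-D array, inside-focus minus outside-focus image of one edge
    trial_pistons    : 1-D array of candidate piston values (m)
    make_template    : callable, make_template(piston) -> 2-D theoretical difference image
                       (placeholder for the physical-optics model used in practice)
    """
    scores = [normalized_correlation(difference_image, make_template(p))
              for p in trial_pistons]
    return trial_pistons[int(np.argmax(scores))]
```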
Jon Magnuson, Mitchell Troy, Mark Gibney, Kenneth Krall, Jon Tindall, Bradley Flanders, Michael Kovacich, David McIntyre, William Lutjens, Nielson Schulenburg
KEYWORDS: Sensors, Detection and tracking algorithms, Algorithm development, Data processing, Satellites, Missiles, Software development, Data centers, Surveillance, Lead
The Midcourse Space Experiment program will launch a satellite with several optical surveillance sensors onboard that will observe targets launched separately in dedicated and cooperative target programs. The satellite is scheduled to be launched in 1994 and the targets will be observed in several missions over the ensuing eighteen months. The Early Midcourse Target Experiments Team is developing ground based software that will process data to collect target signature phenomenology and demonstrate key surveillance system functions of the IR sensors during the early midcourse phase of a ballistic missile trajectory. Satellite sensor data will be transmitted to the ground and hosted at the Early Midcourse Data Analysis Center (EMDAC). The Early Midcourse Data Reduction and Analysis Workstation (EMDRAW) is a testbed for the algorithm chain of software modules which process the data from end to end, from Time Dependent Processing through object detection and tracking to discrimination. This paper will present the EMDRAW testbed and the baseline algorithm chain.