1. INTRODUCTION

GRAVITY [1] is an interferometer under development for the Very Large Telescope Interferometer (VLTI) that will combine the light of four telescopes in the near-infrared, resulting in images with a resolution of about 2 milli-arcseconds. Additionally, it is designed to provide astrometry with a precision of 10 micro-arcseconds. Its main science goal is to detect motions close to the event horizon of the Galactic Center, but it is also well suited to study the kinematics of the broad-line region of active galactic nuclei, disks around young stellar objects, evolved stars, and even asteroids in our solar system. The entire observing sequence is controlled by the INstrument Software (INS), which is presented in two papers in these proceedings. Paper I focuses on the hardware aspects [2]; here we give an overview of the software implementation with an emphasis on the high-level aspects.

2. INSTRUMENT SOFTWARE: OVERVIEW

The instrument software of GRAVITY is a distributed application in the framework of the ESO VLT software† that runs on a number of Linux workstations and real-time computers. Most computers run one or more “environments”: dedicated applications in charge of specific control tasks. For example, the “workstation environment” (on the instrument workstation) includes processes for observing control, while the various “ICS environments” (on the LCUs close to the hardware) most importantly run a server process that forwards device commands to the proper hardware. The environments also run a number of helper processes and a messaging system to command the various control processes. The state of devices (e.g. a closed shutter) as well as transient data (e.g. a pixel-to-visibility matrix, P2VM) are stored in an online database whose contents can be “scanned” between environments to share the data.
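The environment/online-database pattern described above can be illustrated with a minimal sketch. The class, function, and database point names below are ours for illustration only; the real CCS online database API of the VLT software is different:

```python
# Minimal sketch of the "online database" pattern: each environment keeps
# a key/value store of device states and transient data, and selected
# points can be "scanned" from one environment to another.
# All names here are illustrative, not the real CCS API.

class OnlineDatabase:
    """Per-environment store for device states and transient data."""
    def __init__(self):
        self._attributes = {}

    def write(self, point, value):
        self._attributes[point] = value

    def read(self, point):
        return self._attributes[point]

def scan(source, target, points):
    """Copy selected database points between environments,
    mimicking the scanning of values described in the text."""
    for point in points:
        target.write(point, source.read(point))

# An ICS environment records a shutter state; the workstation
# environment picks it up via scanning.
ics_db = OnlineDatabase()
ws_db = OnlineDatabase()
ics_db.write(":Appl_data:GRAVITY:ins:shutter1:state", "CLOSED")
scan(ics_db, ws_db, [":Appl_data:GRAVITY:ins:shutter1:state"])
print(ws_db.read(":Appl_data:GRAVITY:ins:shutter1:state"))  # CLOSED
```

In the real system the scanning is a service of the VLT software rather than an explicit call in instrument code; the sketch only shows the data flow.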
On the highest level and during observations, the instrument is controlled through observing templates which command the GRAVITY Observation Software (OS), a server process that runs on the instrument workstation and controls its subsystems: the Instrument Control Software (ICS) and the Detector Control Software (DCS). It also controls real-time applications running under Tools for Advanced Control (TAC), as well as the Interferometer Supervisory System (ISS), which distributes these commands to all telescopes and the VLTI infrastructure.

2.1 The Observation Software (OS) and the VLTI interface

The Observation Software (Fig. 1) and other high-level tasks are implemented as server processes and run physically on the “instrument workstation”, a standard Linux server that is connected via Ethernet to the instrument network. The machine currently runs three “environments”:
The main processes that are running on the instrument workstation and have been developed specifically for use in GRAVITY are:
Apart from these processes that have been created specifically for GRAVITY, there is also a large number of processes that belong to the standard VLT software environment. These processes have been adapted (e.g. by overloading functions that are empty by default) for our use. Some of the relevant high-level processes are:
2.2 The Instrument Control Software (ICS) and real-time applications using TAC

The Instrument Control Software consists of a server process running on the instrument workstation that evaluates commands and forwards them to the correct Local Control Unit (LCU). The LCUs run a real-time operating system (VxWorks) and provide the interface to the actual hardware (see Fig. 3). GRAVITY comprises two ICS LCUs with various attached devices such as motors, lamps, shutters, … [2]. GRAVITY further makes use of five additional LCUs that run Tools for Advanced Control (TAC), a framework created for real-time control processes such as the fast control loops of GRAVITY (metrology, fringe tracker, tip/tilt/piston control, and differential delay line control). The communication between these LCUs (and the additional non-realtime Kalman workstation) is displayed in Fig. 4. Communication between the TAC LCUs is handled through a fast fiber link, the Reflective Memory Network (RMN).

2.3 The Detector Control Software (DCS)

Detector control for the three infrared detectors of GRAVITY (two HAWAII-2RG and one SELEX) uses the standard ESO NGC architecture: the data reception task runs on a Linux computer and transmits the data frame by frame via Ethernet to the instrument workstation, where the data are displayed (in the real-time displays), evaluated (e.g. by the acquisition camera control process), and eventually saved to disk (see Fig. 5). The workstation connected to the SELEX detector (which is used for fast fringe tracking) has an additional direct link to the real-time fringe tracker TAC LCUs in order to process the raw data with as little latency as possible. The data are then copied to the fast RMN network by the TAC LCU and stored to disk using a dedicated data recorder facility, the RMN recorder [7], which is configured to filter and store data in the formats required by GRAVITY.
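The per-frame processing chain on the instrument workstation (display, evaluate, save) can be sketched as follows; all function and field names are ours for illustration and are not part of the NGC software:

```python
# Illustrative sketch of the frame flow in Sect. 2.3: the data reception
# task hands each detector frame to the workstation, which displays it,
# evaluates it, and eventually saves it. Names are hypothetical.

def display(frame):
    pass  # stand-in for the real-time display

def evaluate(frame):
    # e.g. the acquisition camera control process deriving a statistic
    return {"frame": frame["id"],
            "mean": sum(frame["pixels"]) / len(frame["pixels"])}

def save(frame, results, store):
    store.append((frame["id"], results))

def on_frame(frame, store):
    """Per-frame callback on the instrument workstation."""
    display(frame)
    results = evaluate(frame)
    save(frame, results, store)

store = []
for i in range(3):  # three frames arriving over Ethernet
    on_frame({"id": i, "pixels": [1, 2, 3, 4]}, store)
print(store[0])  # (0, {'frame': 0, 'mean': 2.5})
```

The real NGC data path is considerably more involved (shared memory, frame headers, multiple readout modes); the sketch only conveys the display/evaluate/save split described above.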
The exposures of all detectors are synchronized using a time signal provided by ESO via TIM devices (see below).

3. A GRAVITY OBSERVATION AND THE DATA PRODUCT

A GRAVITY observation essentially consists of two templates: an acquisition template and an observing template. Additionally, various calibration templates may be needed, depending on the actual science observation. The acquisition sequence for the off-axis AO and off-axis fringe tracking case is a staggered process that is parallelized as much as possible. It starts with the preset of the telescopes, domes, and delay lines and the start of telescope guiding (e.g. the delay lines start to move at the same time as the domes and telescopes). After telescope guiding has started, the star separators within the telescopes are set up to relay one beam to the adaptive optics (AO) train and the other towards the VLTI delay lines. This includes the interactive confirmation of the two fields for all four telescopes. After the AO loop is closed and the science field is properly centered on the star separator, the fringe tracking object is selected on the GRAVITY-internal acquisition camera for all four beams. The fringe tracking star is then moved onto the fiber that feeds the fringe tracking arm of GRAVITY, and the light in that arm is optimized using the internal tip-tilt mirror. Once fringes are found using the internal piston actuator (offloading large piston values to the main delay lines), the science object is confirmed interactively for all four beams and the light in the science arm is optimized using the fiber positioners. After centering the science fringes using a group delay estimator, GRAVITY is ready for observations. At this point, the observing template sets up the exposures by requesting exposure identification numbers (expoIds) for each detector. The setup also includes sending a common start time to the “TIM” devices.
When this time is reached, the TIM devices start sending trigger signals to the detectors at the defined time intervals. This way, the precise time of each sub-integration can be reconstructed from the start time, the integration time, and the exposure number. At the end of an exposure, the OS triggers the production of header files from its subsystems and writes the “archivation reference file”, which specifies which files will be merged to form the final FITS file with all extensions. In the standard ESO software this is done separately for each detector. For GRAVITY, however, the requirement was to merge the individual exposures and headers from all subsystems into one FITS file, in order to ease the archiving process and simply to have all relevant information conveniently in a single file. For this, we modified the observation software such that file merging is deactivated by default, so that no partly merged files are created. At the end of the exposure, “arf” files are written depending on the current instrument configuration, and the archiver process is called from within the template to merge the files. To ensure fast disk I/O, our instrument workstation is equipped with solid-state drives, on which the merging is performed within a few seconds for the maximum file size of 2 GiB. The merged FITS file consists of three binary tables provided by the VLTI subsystem, two image cubes (from the acquisition camera and science spectrometer detectors), as well as four binary tables recorded by the RMN recorder. Two of the RMN tables are provided by the fringe tracking application, which is distributed over two LCUs; the other two come from the fiber differential delay line controller and the metrology controller (see Fig. 6).
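The reconstruction of sub-integration times from the common start time, the integration time, and the frame number can be sketched as follows (a minimal illustration; the variable names and numeric values are ours, not the instrument's):

```python
# Each detector frame is triggered by the TIM device at a fixed interval
# after the common start time, so its timestamp is fully determined by
# three numbers. Values below are purely illustrative.

def frame_time(start_time_s, dit_s, frame_number):
    """Time of sub-integration `frame_number` (0-based), given the common
    TIM start time and the integration time (interval) per frame."""
    return start_time_s + frame_number * dit_s

start = 1000.0   # common start time sent to the TIM devices [s]
dit = 0.5        # integration time per frame [s]
print(frame_time(start, dit, 0))   # 1000.0
print(frame_time(start, dit, 10))  # 1005.0
```

This is why no per-frame timestamps need to be transported with the pixel data: the header keywords carrying the start time and interval suffice.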
4. AUXILIARY SYSTEMS FOR THE GRAVITY INSTRUMENT DEVELOPMENT

The GRAVITY instrument software produces technical log files containing both “FITS logs” (device temperatures, pressures, fan speeds, …) and messages from the processes running on the various computers. Depending on the instrument and testing operations, these log files amount to a few MiB up to a few GiB per day. We developed a log parsing and querying tool to easily extract and plot time sequences of relevant values, such as the output power of lasers, the temperature of a component, or the amount of cooling water used. The tool consists of two parts: the back end parses the raw log file for relevant entries, formats them as SQL statements, and inserts these into an SQLite database; the front end consists of a set of PHP scripts that query the database and plot the resulting values using the versatile open-source plotting library jpgraph‡. The web interface and an example plot are shown in Fig. 7.

REFERENCES

[1] Eisenhauer, F., et al.,
“GRAVITY: Getting ready for Paranal,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[2] Ott, T., Wieprecht, E., Burtscher, L., Kok, Y., Yazici, S., et al.,
“The GRAVITY instrument software / hardware related aspects,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[3] Anugu, N., et al.,
“The GRAVITY/VLTI acquisition camera software,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[4] Pozna, E., Zins, G., Santin, P., and Beard, S.,
“A common framework for the observation software of astronomical instruments at ESO,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[5] Pozna, E.,
“Evolution of the top level control software of astronomical instruments at ESO,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[6] Pozna, E., Duc, T. P., Abuter, R., Ramirez, A., Merand, A., Mueller, A., Frahm, R., Schmid, C., and Morel, S.,
“Generic control software connecting astronomical instruments to the reflective memory data recording system of VLTI - bossvlti,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
[7] Abuter, R., Popovic, D., Pozna, E., Sahlmann, J., and Eisenhauer, F.,
“The VLTI real-time reflective memory data streaming and recording system,”
in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series,
Notes

[1] See http://www.eso.org/projects/vlt/sw-dev/wwwdoc/VLT2011/dockitVLTCore.html for technical details on the ESO technologies mentioned in this article.