Spaceborne synthetic aperture radar (SAR) systems evolve continuously, driven by technological progress and increasing user demands. The System Performance Group covers end-to-end SAR expertise, from system design and development to in-orbit operation, validation, and utilization of the delivered SAR data products. Our strength lies in combining system knowledge of SAR theory, simulation, and analysis, gained from more than ten years of experience with ongoing satellite missions, with expertise in the development and operational deployment of novel algorithms for the generation of higher-level scientific products. Major activities and projects are presented in the following.
SAR data quality is influenced by several system parameters, such as sensor orbit and attitude, antenna patterns, transmit bandwidth, and pulse repetition frequency (PRF). The optimization of such parameters by simulation represents the first step in the design of a proper SAR acquisition. Our group specializes in the optimization of the acquisition parameters as well as in the definition of effective strategies to achieve a high quality of the delivered products. Once the operational mission has started, the received SAR raw data is analyzed to detect possible saturation, noise, or channel imbalance effects, while the quality of the focused SAR images is further assessed by evaluating specific parameters, such as scene size, resolution, signal-to-noise ratio, and range/azimuth ambiguities. The obtained results serve as a basis for further refining the acquisition strategy. An example is presented in Figure 1, which depicts the suppression of ambiguities in TerraSAR-X quad polarization products by PRF optimization. For the TerraSAR-X and TanDEM-X satellite SAR missions, we are responsible for the definition and characterization of all SAR modes and products, up to the Level 1b SAR image quality specification.
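One classical constraint behind such a PRF optimization is the acquisition timing: the echo window of the imaged swath must not overlap a transmit event or the nadir return. The following sketch screens candidate PRFs against these two conditions; all numerical values (pulse length, slant ranges, orbit height, guard time, and the helper name `echo_window_clear`) are illustrative assumptions, not TerraSAR-X parameters.

```python
# Sketch: screening candidate PRFs so that the echo window from the imaged
# swath avoids both the transmit events and the nadir return.
# All numbers below are illustrative assumptions, not mission parameters.

C = 299_792_458.0  # speed of light [m/s]

def echo_window_clear(prf, tau_p, r_near, r_far, h_orbit, guard=2e-6):
    """Return True if echoes from slant ranges [r_near, r_far] avoid the
    transmit events (pulse length tau_p [s]) and the nadir echo."""
    t_near = 2.0 * r_near / C            # two-way delay to near range [s]
    t_far = 2.0 * r_far / C + tau_p      # end of the echo window [s]
    t_nadir = 2.0 * h_orbit / C          # two-way delay of the nadir echo [s]
    pri = 1.0 / prf                      # pulse repetition interval [s]
    rank = int(t_near // pri)            # pulses simultaneously in flight
    w0, w1 = t_near - rank * pri, t_far - rank * pri  # window within one PRI
    if w1 >= pri - (tau_p + guard) or w0 <= tau_p + guard:
        return False                     # overlaps a transmit event
    tn = t_nadir % pri                   # nadir return folded into the PRI
    if w0 - guard <= tn <= w1 + guard:
        return False                     # overlaps the nadir echo
    return True

# Scan a PRF interval and keep the valid candidates
valid = [prf for prf in range(3000, 6500, 10)
         if echo_window_clear(prf, tau_p=50e-6, r_near=600e3,
                              r_far=620e3, h_orbit=514e3)]
```

In an operational design this timing check is only one criterion; the chosen PRF must additionally balance range and azimuth ambiguity levels, as in the quad polarization example of Figure 1.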
We also concentrate on the definition and implementation of new algorithms to support ongoing and future SAR missions, such as the investigation of fully polarimetric data and the development of innovative methods for on-board data reduction. Indeed, present and next-generation spaceborne SAR missions demand increasingly large volumes of on-board data, which translates into stronger requirements on on-board memory and downlink capacity already at mission design level. In this scenario, SAR raw data quantization is of primary importance: the data rate employed for raw data digitization defines the amount of data to be stored and transmitted to the ground but, at the same time, directly affects the performance of the SAR products. Figure 2 shows, on the left-hand side, the radar backscatter map of the urban area of Mexico City acquired by TanDEM-X and, on the right-hand side, the phase error map of the same area due to raw data quantization. The phase error was obtained by comparing the interferogram generated from the same raw data digitized with a block adaptive quantizer (BAQ) at 2 bits/sample with the one generated from the uncompressed data. Higher phase errors occur in areas of lower backscatter, and vice versa. Research activities in the group aim at investigating novel methods for efficient data reduction in the context of future missions, such as multi-channel SAR or staggered SAR systems.
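The principle of the block adaptive quantizer mentioned above can be illustrated in a few lines: each block of raw samples is normalized by its estimated standard deviation and quantized with a fixed low-rate quantizer. The sketch below uses the standard 2-bit Lloyd-Max levels for a unit-variance Gaussian source; the block size, the toy Gaussian "raw data", and the function name `baq_2bit` are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a block adaptive quantizer (BAQ) at 2 bits/sample.
# The levels are the standard 2-bit Lloyd-Max values for a zero-mean,
# unit-variance Gaussian source; block size and data are illustrative.
THRESHOLDS = np.array([-0.9816, 0.0, 0.9816])        # decision boundaries
LEVELS = np.array([-1.510, -0.4528, 0.4528, 1.510])  # reconstruction values

def baq_2bit(raw, block=128):
    """Quantize raw samples block-wise: normalize each block by its
    estimated std, map samples to 2-bit codes, then reconstruct."""
    out = np.empty_like(raw, dtype=float)
    for i in range(0, raw.size, block):
        chunk = raw[i:i + block]
        sigma = chunk.std() + 1e-12                         # per-block scale
        codes = np.searchsorted(THRESHOLDS, chunk / sigma)  # codes 0..3
        out[i:i + block] = LEVELS[codes] * sigma            # de-quantize
    return out

rng = np.random.default_rng(0)
raw = rng.normal(0.0, 5.0, 4096)             # toy Gaussian "raw data"
rec = baq_2bit(raw)
snr_db = 10 * np.log10(raw.var() / np.mean((raw - rec) ** 2))
# For Gaussian data, a 2-bit Lloyd-Max quantizer yields roughly 9.3 dB SQNR
```

The residual quantization noise propagates into the interferometric phase, which is exactly the effect visualized in the phase error map of Figure 2.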
Excellent system performance and advanced data processing are the necessary basis for deriving high-quality products by means of SAR interferometry. Since 2010 the TerraSAR-X and TanDEM-X satellites have been flying in a close-orbit formation, acting as a single-pass interferometer. Such a configuration allows for the derivation of high-resolution digital elevation models (DEMs) by exploiting the phase difference between a co-registered pair of bistatic SAR images. The complex correlation coefficient of this data pair is the interferometric coherence; it quantifies the amount of noise affecting the interferogram and, therefore, the final DEM quality (the higher the coherence, the better the final DEM performance). The expected coherence is predicted from the combination of several decorrelation sources, caused by the limited signal-to-noise ratio, quantization noise, ambiguities, range and azimuth shifts, and temporal and volume decorrelation. We designed a global strategy for regularly acquiring data over dedicated test sites (so-called Long Term System Monitoring, LTSM) and for investigating the interferometric coherence behavior with respect to the acquisition geometry and the land-cover type, as presented in Figure 3 (a). This valuable information has been exploited to support the development of an optimized acquisition strategy within the TanDEM-X mission, leading to a significant overall performance improvement by optimizing the acquisition geometry, e.g., over forested areas and sandy deserts (Figure 3 (b) and (c)).
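The coherence budget described above is commonly modeled as a product of independent decorrelation factors, and the resulting coherence sets the interferometric phase noise. The sketch below follows that multiplicative model together with a Cramér-Rao-type approximation of the phase standard deviation; every numerical input and the function names are illustrative placeholders, not TanDEM-X specification values.

```python
import math

# Sketch of a coherence budget: the predicted total coherence is modeled
# as the product of independent decorrelation sources, and the resulting
# interferometric phase noise follows from coherence and look number.
# All inputs are illustrative assumptions.

def total_coherence(snr_db, gamma_quant, gamma_amb, gamma_geom, gamma_temp_vol):
    """Multiply the individual decorrelation contributions."""
    gamma_snr = 1.0 / (1.0 + 10 ** (-snr_db / 10.0))  # thermal-noise term
    return gamma_snr * gamma_quant * gamma_amb * gamma_geom * gamma_temp_vol

def phase_std(gamma, n_looks):
    """Cramer-Rao-type approximation of the interferometric phase standard
    deviation [rad] for coherence gamma and n_looks independent looks."""
    return math.sqrt((1.0 - gamma ** 2) / (2.0 * n_looks * gamma ** 2))

gamma = total_coherence(snr_db=12.0, gamma_quant=0.97, gamma_amb=0.98,
                        gamma_geom=0.99, gamma_temp_vol=0.95)
sigma_phi = phase_std(gamma, n_looks=16)
```

Such a forward model is what allows the expected DEM performance to be predicted before acquisition and compared against the coherence actually measured over the LTSM test sites.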
Furthermore, a large-scale outlook on the final DEM performance is an essential feature for understanding possible systematic effects at local as well as global scale, and for defining a reliable performance prediction and modelling. For each TanDEM-X bistatic scene, the operational TanDEM-X processor (ITP) produces several quicklook images as by-products of the interferometric processing chain, depicting quantities such as amplitude and interferometric coherence at a ground pixel spacing of 50 x 50 meters. This unique global dataset represents the ideal starting point for the generation of global geo-maps and for the investigation of remote sensing related topics, aiming at delivering useful data to the international scientific community. For this purpose, we have combined quicklook images of several parameters, such as the SAR amplitude, the coherence, and the relative height error, which represents the random noise contribution affecting the estimated height and is a driving parameter for the definition of interferometric mission requirements. An example is presented in Figure 4, which depicts a global mosaic of the relative height error of the TanDEM-X DEM, derived from the coherence. The absolute height error, in turn, includes all low-frequency DEM inaccuracies (due to, e.g., baseline or residual calibration errors). Figure 5 shows the global map of the vertical absolute height error at 90% confidence level of each TanDEM-X DEM tile (1°x1° in latitude/longitude).
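The step from coherence to a relative height error, as used for the mosaic of Figure 4, can be sketched as follows: the phase noise implied by the coherence is scaled by the height of ambiguity, and a 90% point is obtained under a Gaussian assumption. The height of ambiguity, look number, and function names below are illustrative assumptions, not the values of the operational processor.

```python
import math

# Sketch: deriving a relative height error figure from the interferometric
# coherence. Height of ambiguity and look number are illustrative.

def phase_std(gamma, n_looks):
    """Approximate interferometric phase standard deviation [rad]."""
    return math.sqrt((1.0 - gamma ** 2) / (2.0 * n_looks * gamma ** 2))

def relative_height_error_90(gamma, hoa, n_looks):
    """90% point of the relative height error [m] from coherence gamma and
    height of ambiguity hoa [m], assuming Gaussian phase noise."""
    sigma_h = hoa / (2.0 * math.pi) * phase_std(gamma, n_looks)
    return 1.645 * sigma_h  # 90% two-sided point of a zero-mean Gaussian

err90 = relative_height_error_90(gamma=0.8, hoa=45.0, n_looks=16)
```

Applying such a mapping pixel by pixel to the coherence quicklooks is what turns the global coherence dataset into a global height error map; the absolute height error of Figure 5 additionally accounts for low-frequency contributions that this random-noise term does not capture.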
In this framework, we are currently focusing on the development of new algorithms for the retrieval of geophysical quantities from SAR images, concentrating in particular on the analysis of multi-temporal data, on machine-learning- and deep-learning-based classification approaches, and on multi-sensor data fusion. In the field of the biosphere, one example is the TanDEM-X Forest/Non-Forest Map, a new global binary layer at 50 x 50 meters resolution, which is depicted in Figure 6. This product represents a highly valuable starting point for monitoring deforestation at large scale. Another example is the investigation of multi-temporal Sentinel-1 stacks for forest monitoring purposes, combining interferometric modeling and machine learning classification algorithms. In the field of the cryosphere, further examples are the classification of the Greenland ice sheet snow facies by means of interferometric TanDEM-X data and the estimation of the X-band penetration depth, as shown in Figure 7.
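The physical idea behind a forest/non-forest layer of this kind can be illustrated with a toy example: vegetation introduces volume decorrelation, so pixels whose volume coherence falls below a threshold are plausible forest candidates. The threshold, the system decorrelation value, and the synthetic coherence map below are illustrative assumptions only; the operational product relies on a considerably more refined classification.

```python
import numpy as np

# Toy sketch in the spirit of a coherence-based forest/non-forest layer:
# remove the known system decorrelation from the total coherence, then
# threshold the remaining volume coherence. All values are illustrative.

def forest_mask(gamma_total, gamma_system, threshold=0.8):
    """Estimate the volume coherence by dividing out the system
    decorrelation, then threshold it into a binary forest layer."""
    gamma_vol = np.clip(gamma_total / gamma_system, 0.0, 1.0)
    return gamma_vol < threshold     # True = forest candidate

rng = np.random.default_rng(1)
# synthetic coherence map: bare ground ~0.9, forested patch ~0.5
gamma_total = np.full((64, 64), 0.9) + rng.normal(0, 0.02, (64, 64))
gamma_total[16:48, 16:48] = 0.5 + rng.normal(0, 0.05, (32, 32))
mask = forest_mask(gamma_total, gamma_system=0.95)
forest_fraction = mask.mean()        # fraction of pixels labeled forest
```

In practice, such a simple threshold is replaced by classifiers that also exploit acquisition geometry, multi-temporal behavior, and machine learning, as described above.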