The task of operational software processors within the ground segment of an earth observation mission is to generate predefined products quickly and continuously from the incoming raw data. Depending on the application, data products of different levels are produced: calibrated products (level 1), products comprising geophysical information (level 2), and value-added products (VA). Each processing step is an integral part of the processor.
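The product-level chain described above can be pictured as a sequence of processing stages, each consuming the output of the previous one. The following is a minimal sketch under that assumption; the function names, calibration factor, and retrieval formula are purely illustrative placeholders, not an actual processor implementation.

```python
from dataclasses import dataclass

@dataclass
class Product:
    level: str
    data: list

def calibrate(raw: Product) -> Product:
    # Level-1 step: apply a (placeholder) radiometric calibration.
    return Product("L1", [2.0 * x for x in raw.data])

def retrieve_geophysical(l1: Product) -> Product:
    # Level-2 step: derive a (placeholder) geophysical parameter.
    return Product("L2", [x + 1.0 for x in l1.data])

def add_value(l2: Product) -> Product:
    # VA step: a (placeholder) value-added aggregation.
    return Product("VA", [sum(l2.data) / len(l2.data)])

# Each stage is an integral part of the processor chain: L0 -> L1 -> L2 -> VA.
raw = Product("L0", [1.0, 2.0, 3.0])
va = add_value(retrieve_geophysical(calibrate(raw)))
print(va.level, va.data)  # VA [5.0]
```

The point of the sketch is only the structure: each level is produced by a dedicated, self-contained step, so individual steps can be exchanged or updated without touching the rest of the chain.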
Processors are embedded in a configuration-controlled environment and work largely without interactive intervention. This requirement challenges the developers of the processor software: processors are never off-the-shelf products, but rather the customized result of a series of successive development steps. The fundamental precondition is the availability of algorithms, which have to be adapted to the ground segment specifications. For level-1 processors such algorithms usually evolve from ground calibration processes; for level-2 or VA processors the starting point is scientific code for the retrieval of geophysical parameters.
In these cases the algorithm, which is often quite complex, has to be optimized to deliver results of the required precision with minimal use of resources (time, personnel). This is what distinguishes operational processors from purely scientific software systems. Ideally, new calibration or retrieval methods are first tested in a scientific environment and, if they prove successful there, are then incorporated into operational systems.
Operational processors are used to generate data in near-real-time (NRT) as well as in an off-line mode that delivers the highest possible precision. For test and development purposes, prototype versions evolve at each development stage, supporting interactive control and, if required, detailed investigation of individual processing steps. The Algorithm Theoretical Baseline Document (ATBD) describes in detail each algorithm included in a processor.
Over the course of a mission, processors and baseline algorithms are continuously updated, reflecting improvements in instrument calibration and in the ability to extract predefined and new geophysical parameters from the data. It is not unusual to reprocess the entire data volume of a mission from time to time using updated, state-of-the-art algorithms.
The Earth Observation Center (EOC) develops both operational and prototype processors covering the entire earth observation portfolio, including: