

Qualitative impact of the MRTCAL frequency-dependent calibration

Differences between MIRA- and MRTCAL-calibrated spectra are expected as a consequence of the different calibration bandwidths. When MIRA calibrates FTS200 spectra, a single set of calibration parameters is derived and applied per 1.35GHz natural hardware unit. In contrast, MRTCAL derives and applies the calibration in steps of 20MHz (the current default setting for the automatic online data processing; this value can be customized by the user).
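In other words, the MRTCAL default splits each 1.35GHz FTS unit into roughly 68 calibration chunks instead of a single one. The following Python fragment is only a minimal sketch of that bookkeeping, not MRTCAL code; the names FTS_UNIT_HZ, MRTCAL_STEP_HZ and chunk_edges are hypothetical, and the handling of the last, partial chunk is an assumption.

    import numpy as np

    FTS_UNIT_HZ = 1.35e9     # natural hardware unit of one FTS200 sub-band
    MRTCAL_STEP_HZ = 20e6    # default MRTCAL calibration step

    def chunk_edges(bandwidth_hz, step_hz):
        """Frequency edges (relative to the unit start) of the calibration chunks."""
        n_chunks = int(np.ceil(bandwidth_hz / step_hz))   # assumed: last chunk may be narrower
        return np.linspace(0.0, bandwidth_hz, n_chunks + 1)

    # MIRA-like scheme: one chunk spanning the whole hardware unit.
    print(len(chunk_edges(FTS_UNIT_HZ, FTS_UNIT_HZ)) - 1)      # -> 1
    # MRTCAL default: about 68 chunks of 20MHz inside the same unit.
    print(len(chunk_edges(FTS_UNIT_HZ, MRTCAL_STEP_HZ)) - 1)    # -> 68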

The MRTCAL default is intended to improve the quality of the baseline, as can be seen in Figs. 1 to 4. The average spectrum computed from the same 15-minute On-The-Fly scan is displayed in each of the four figures. The only difference between the four spectra is the way MRTCAL calibrates the raw data.

  1. In Fig. 1, the calibration parameters are derived and applied per natural hardware unit (i.e., every 1.35GHz as defined by the FTS units). This gives a staircase look to the spectra.
  2. In Fig. 2, the calibration parameters are still derived every 1.35GHz, but they are linearly interpolated before being applied. The staircase look is greatly reduced.
  3. In Fig. 3, the calibration parameters are derived and applied every 20MHz. No linear interpolation is done before application. The staircase look and the baseline oscillations disappear. The atmospheric line around 110.8GHz now appears in absorption, as it should, because the reference position was located about 1 degree away from the OTF observations.
  4. Finally, in Fig. 4, the calibration parameters are derived every 20MHz and linearly interpolated before being applied. The improvement with respect to the previous solution is real even though it is not obvious in this plot.
MRTCAL uses the fourth solution by default (sketched below) and thus delivers better-behaved baselines. The MRTCAL default is also intended to improve the line calibration accuracy in regions where the calibration parameters vary quickly as a function of frequency, as shown in Sect. 5.
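To make the difference between the application schemes concrete, here is a minimal Python sketch, not MRTCAL code, of piecewise-constant versus linearly interpolated calibration factors. The array names, the chunk layout and the numerical values are hypothetical placeholders, and the derivation of the factors from the calibration scans is not shown.

    import numpy as np

    n_chan = 1024                    # channels in one hardware unit (hypothetical)
    chunk_size = 64                  # channels per calibration chunk (hypothetical)
    n_chunks = n_chan // chunk_size
    rng = np.random.default_rng(0)
    raw = rng.normal(1.0, 0.01, n_chan)                  # fake raw spectrum (counts)
    cal_per_chunk = np.linspace(95.0, 105.0, n_chunks)   # fake factors (K per count)

    # Schemes of Figs. 1 and 3: one constant factor per chunk (staircase look).
    staircase = raw * np.repeat(cal_per_chunk, chunk_size)

    # Schemes of Figs. 2 and 4: factors interpolated linearly between chunk
    # centers before being applied channel by channel.
    chunk_centers = (np.arange(n_chunks) + 0.5) * chunk_size
    smooth = raw * np.interp(np.arange(n_chan), chunk_centers, cal_per_chunk)

The smaller the calibration step and the smoother the applied factors, the less the chunk boundaries imprint themselves on the baseline, which is the behaviour illustrated by Figs. 1 to 4.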

Figure 1: Spectra averaged over a 15-minute On-The-Fly scan observed with a combination of E090 and FTS200 under average summer weather ($\sim 7$mm of precipitable water vapor). The simultaneously observed LSB and USB spectra are shown in the bottom panel, and the top panel zooms on a 5GHz window near the 3mm band edge. The calibration parameters are here derived and applied by MRTCAL per natural hardware unit (1.35GHz bandwidth for each FTS unit, i.e., the same as the MIRA calibration scheme).
Figure 2: Same as Fig. 1 except that the calibration parameters are now linearly interpolated before being applied.
Figure 3: Spectra averaged over a 15-minute On-The-Fly scan observed with a combination of E090 and FTS200 under average summer weather ($\sim 7$mm of precipitable water vapor). The simultaneously observed LSB and USB spectra are shown in the bottom panel, and the top panel zooms on a 5GHz window near the 3mm band edge. The calibration parameters are here derived and applied by MRTCAL every 20MHz.
Figure 4: Same as Fig. 3 except that the calibration parameters are now linearly interpolated before being applied. This spectrum is the default product delivered by MRTCAL.

