Summary of the Atmospheres Panel Meeting

--Mark Schoeberl (schom@zephyr.gsfc.nasa.gov), Code 910, NASA Goddard Space Flight Center

The Atmospheres Panel met October 19, 1998, during the EOS IWG meeting in Durham, NH. The current Panel Chair, Daniel Jacob, was unable to attend, so a previous Panel Chair, Mark Schoeberl, conducted the meeting. The Panel discussed an assortment of issues, which are summarized below.

NPOESS--The converged National Polar-orbiting Operational Environmental Satellite System (NPOESS) is scheduled to take over many of the measurements now being made by the AM and PM platforms. However, the NPOESS requirements are primarily operational, and it isn't clear that the data will be of high enough quality for the needs of the research community. The Panel expressed frustration that the NPOESS requirements were defined without clear input from the research community at large. As an example, Gary Rottman noted that the solar UV measurements proposed for NPOESS were totally inadequate for any kind of monitoring. One Panel member responded that the UV monitoring requirements emerged as a compromise on the total solar irradiance measurements. This example illustrates how some of the NPOESS requirements came about and why they may now be inadequate for the research community. Since many NASA researchers will depend heavily on NPOESS data after the EOS AM, PM, and CHEM missions, the Panel urged NASA HQ to provide more proactive input into the NPOESS instrument selection and definition process.

EOS-2--The Panel was confused by the EOS-2 process presented by Pierre Morel. It wasn't clear from the presentation where we are in the definition of the EOS-2 process: even though Morel showed STEP 1 charts, he kept referring to STEP 2. It also wasn't clear what will happen beyond STEP 2 or what kind of community input will be possible.

Validation of EOS instruments--There was some discussion of a stronger coupling between the EOS instrument calibration/validation programs and the Research and Analysis (R&A) Program. For example, the upcoming SAGE III Ozone Loss and Validation Experiment combines validation of SAGE III with a major R&A mission, a combination that benefits both. The Panel discussed how the calibration of several instruments could be aligned with science questions. For example, a field mission exploring aerosol properties might combine MODIS, MISR, and TOMS data along with aircraft lidars and in situ aerosol measurements. Chuck Kolb raised the issue of whether the proper instrumentation for the validation missions was being developed and whether NASA aircraft resources would be sufficient for such missions. The Panel agreed that there is a need for clear communication between the EOS validation efforts and the R&A Program.

Data Assimilation--The issue of data assimilation was raised by the Panel. The CERES team has noted that the Data Assimilation Office (DAO) GEOS-2 surface temperature and moisture profile data are not of as high quality as the European Centre for Medium-Range Weather Forecasts (ECMWF) data, and the team has asked NASA HQ whether it could obtain ECMWF data for processing CERES data. The TES team has likewise noted that ECMWF surface pressure would be better for its algorithm. This has raised questions about whether the DAO development cycle can meet the needs of the EOS investigators. The Panel further noted that ECMWF data would not be adequate for investigations needing stratospheric information, so using ECMWF data cannot be a universal solution. The Panel Chairman explained that the DAO has had computer processing issues and is also evaluating a different general circulation model for the core system. The Panel expressed confidence that the DAO probably can meet EOS needs downstream, but individual investigators also need to use the best meteorological data products available to generate research-quality data.

PI Processing--A number of PIs complained that the PI Processing option under EOSDIS is being exercised in a confusing way. In two cases, the Instrument PIs were given only 24 hours to estimate how much it would cost to set up their own data processing systems. When they responded, EOSDIS replied that DIS could do it cheaper. However, when the PIs asked what assumptions had gone into the DIS calculation, they could not get a clear answer. They were therefore concerned that DIS had not performed the cost calculation the same way the PIs had and perhaps had left out some crucial steps. The Panel recommends that DIS and the PIs work together to iterate the computing requirements.