WORLD CLIMATE RESEARCH PROGRAMME



WOCE UPPER OCEAN THERMAL DATA ASSEMBLY CENTRES COORDINATION GROUP
Report of the Fourth Meeting
(UOT/DAC-IV)

CSIRO Marine Laboratories,
Hobart, Australia
5-6 April 1993

WOCE Report No. 106/93

July 1993, WOCE International Project Office, Wormley

WOCE is a component of the World Climate Research Programme (WCRP), which was established by WMO and ICSU, and is carried out in association with IOC and SCOR. The scientific planning and development of WOCE is under the guidance of the JSC Scientific Steering Group for WOCE, assisted by the WOCE International Project Office. JSC is the main body of WMO-ICSU-IOC formulating overall WCRP scientific concepts.


Contents


1. Introduction

2. Review of Procedures at participating Centres

2.1 Atlantic Ocean DAC
2.2 Indian Ocean DAC
2.3 Pacific Ocean DAC
2.4 GTSPP
2.5 Discussion

3. Quality Control of Real-time Data

4. The Global Subsurface Data Centre

5. Retaining or Deleting Data

6. Quality Control Flags

6.1 Comparison of Flags
6.2 MEDS Analysis of AOML Flags
6.3 Standardization of Processes and Flags

7. The 1990 Data Set

7.1 Status at Regional Centres
7.2 Timely Submission of Delayed Mode Data

8. Feedback

8.1 Operators Handbook
8.2 Network Monitoring

9. Date of Next Meeting

10. List of Action Items


Appendix A: Agenda
Appendix B: Participants
Appendix C: Examples of Screen Displays from AOML Quality Control Procedures
Appendix D: CSIRO QC Procedures: Extracts from the "Cookbook"
Appendix E: The BMRC Objective Analysis System
Appendix F: Quality Assurance at the Pacific DAC
Appendix G: GTSPP Activity Report
Appendix H: Brief Overview on the Implementation of the Global Subsurface DAC
Appendix I: MEDS Analysis of Atlantic DAC Flags

BIBLIOGRAPHIC CITATION

WOCE INTERNATIONAL PROJECT OFFICE 1993 Report of the fourth meeting of the WOCE Upper Ocean Thermal Data Assembly Centres Coordination Group, UOT/DAC-IV, CSIRO Marine Laboratories, Hobart, Australia.

WOCE International Project Office, WOCE Report 106/93, 59pp.

1. INTRODUCTION

The meeting was opened by the Chairperson, P. Holliday, and the Group was welcomed to Tasmania and the Marine Laboratories by Dr Angus McEwan, Chief of the CSIRO Division of Oceanography. Representatives from each of the Regional Centres and from the supporting national GTSPP Centres attended the meeting. The representative from the Global Centre was unable to attend.
(See Appendix B for list of participants).

2. REVIEW OF PROCEDURES AT PARTICIPATING CENTRES

2.1 Atlantic Ocean DAC

R. Molinari described the AOML quality control procedures, which at present are applied to the real-time data but are planned to be applied also to the delayed mode (full resolution) datasets. Examples of profiles which have failed various tests, and typical screens viewed by the operator, were shown to the meeting; some are included in Appendix C. Data are received monthly from US-NODC via the SPAN network and consist of real-time data received by the GTSPP at MEDS. The data have undergone preliminary non-scientific quality control at MEDS, and those flags are received by AOML and compared with the flags resulting from the scientific quality control.

The quality control procedures presently employed by AOML consist of:

- a duplicate profile test,

- a histogram test in which profiles are compared to climatological means and standard deviations to identify possible outliers,

- a waterfall plot test in which profiles from a particular cruise are checked for profile to profile consistency,

- a location and speed test to see if profiles within a cruise are consistent,

- a vertical section test in which temperature sections are plotted and inspected,

- a mapping test in which monthly distribution of SST, temperature at 150 m and the average temperature of the upper 400 m are examined for "bull's-eyes".

In general the AOML procedures can be characterized as more subjective than objective. There are no automatic flags established by any of these tests. Profiles that have been identified as suspicious are examined individually and flagged as appropriate (after all tests have been performed).
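The histogram test can be sketched in a few lines. This is a minimal illustration, not AOML's code; the function name, the level-by-level layout and the 3-sigma threshold are assumptions for the example only.

```python
# Sketch of a histogram (sigma) test: each observed temperature is
# compared to a climatological mean and standard deviation at the same
# depth level, and values far from the mean are marked as possible
# outliers for an operator to inspect.

def sigma_test(profile, clim_mean, clim_std, n_sigma=3.0):
    """Return indices of temperature/depth pairs that look anomalous.

    profile, clim_mean and clim_std are equal-length lists of
    temperatures (deg C) at matching depth levels.
    """
    suspect = []
    for i, (obs, mean, std) in enumerate(zip(profile, clim_mean, clim_std)):
        if std > 0 and abs(obs - mean) > n_sigma * std:
            suspect.append(i)
    return suspect

# A spurious spike at the third level stands out against climatology:
print(sigma_test([25.1, 24.8, 30.0, 22.3],
                 [25.0, 24.5, 23.5, 22.0],
                 [0.5, 0.5, 0.5, 0.5]))   # → [2]
```

In keeping with the AOML approach, such a test would only mark profiles for individual examination; it would not set flags automatically.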

2.2 Indian Ocean DAC

G. Meyers summarized the three components within Australia which combine to form the Indian Ocean DAC: the CSIRO Division of Oceanography, the Bureau of Meteorology Research Centre (BMRC) and the Australian Oceanographic Data Centre (AODC). Meyers noted that the aim of the group is to produce a quality controlled delayed mode dataset that other WOCE PIs can pick up and use without the need for further quality control. The CSIRO and BMRC group will only process delayed mode data, because of a lack of resources and because, strictly speaking, the WOCE requirement is for the full resolution delayed mode data, not the radio data in real time. Real-time data are screened by BMRC and AODC; however, they do not use the procedures developed for the Indian Ocean. See Section 3 for more information on real-time data processing.

R. Bailey described the Indian Ocean DAC procedures and a new QC system being developed at CSIRO. The system allows a data quality expert to carry out simultaneously the three QC steps recommended for UOT/DACs: comparison to climatology, objective analysis including buddy-checking, and a consistency check of structure along cruise tracks. The system is fully interactive, and the header information and profile data are checked by the expert on a voyage by voyage basis. Some features of the system are:

- Plot: raw data, header information, profiles, cruise track

- Check: position, time and speed of vessel; calibration of probe temperatures; common malfunctions; regional oceanographic features; profile to profile consistency; repeat profiles of unusual features

- Compare: against climatology and/or statistical analysis

- Edit data: remove start up transients; flag common malfunctions, flag oceanographic features, class data by depth (Class 0 - 4)

- Archive data: several archived sets with varying degrees of flagged data removed; raw data archived

- Products, maps: observed regional oceanographic features fed back into system adding to intelligence for checking stage.

The meeting was given a demonstration of the interactive system that has been developed. Further details and examples from the system are given in Appendix D.
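The position, time and speed check lends itself to a simple sketch: compute the great-circle distance between consecutive drops on a cruise and flag any leg that implies an impossible ship speed. The 25-knot limit and the data layout below are illustrative assumptions, not CSIRO's actual criteria.

```python
import math

def implied_speed_knots(lat1, lon1, t1, lat2, lon2, t2):
    """Great-circle (haversine) distance between two drops, in nautical
    miles, divided by the elapsed time in hours (times in seconds)."""
    R_NM = 3440.065  # mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist_nm = 2 * R_NM * math.asin(math.sqrt(a))
    hours = (t2 - t1) / 3600.0
    return float('inf') if hours <= 0 else dist_nm / hours

def speed_check(drops, max_knots=25.0):
    """drops: list of (lat, lon, time_in_seconds) in cruise order.
    Returns indices of drops implying an implausible ship speed,
    suggesting a mis-reported position or time."""
    flagged = []
    for i in range(1, len(drops)):
        la1, lo1, t1 = drops[i - 1]
        la2, lo2, t2 = drops[i]
        if implied_speed_knots(la1, lo1, t1, la2, lo2, t2) > max_knots:
            flagged.append(i)
    return flagged
```

For example, two drops one degree of latitude apart (about 60 nm) recorded within an hour imply a speed of about 60 knots, and the second drop would be flagged for inspection.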

A comprehensive "cookbook" for quality control of XBTs has recently been completed by CSIRO. The document represents the extensive expertise accumulated by the quality control operators for the area in which their data are collected. It is the information needed by the experts to carry out the scientific quality control, to distinguish between oceanographic features and instrument malfunctions. The meeting applauded the efforts of the CSIRO group to document their expertise and recognized the usefulness of such a document to the scientific community and to data centres. The other two regional centres agreed to produce a similar document for their particular basins. [Action Item 1]

N. Smith described the BMRC objective analysis system, which is an integral part of the delayed mode QC procedures used by the DAC. The BMRC system checks the consistency of neighbouring XBT data using totally objective and automatic criteria based on optimal interpolation. A full description is included in Appendix E. The data come directly from the Melbourne GTS and from previously edited data sets prepared by US-NODC and PMEL (TAO data). The system completes a duplicate check (eliminating duplicates), a cruise track check and a crude check against a 5 x 5 degree (latitude x longitude) climatology. There is no subjective inspection of the profiles. The objective data inspection has three phases:

- Pre-check of the deviation of each observation from the first guess against the total variance (this amounts to a sigma test); a 4-sigma rule is usually used

- In super-observation formation, each member is cross-validated against the expected super-observation value. The expected super-observation "error" is used as the basis for the QC criterion (usually 3 x E)

- All observations are cross-validated against the expected final analysis, using the estimated error in the analysis as the guide (usually 3 x E).

Points are awarded at each failure and when a certain number is reached the datum is removed from subsequent processing. Data may be removed after failure at one level if required.
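The points-awarded scheme can be sketched as follows. The two-point rejection threshold and the boolean encoding of the three phases are assumptions for illustration; the report does not state BMRC's actual point tally.

```python
def bmrc_style_screening(checks, reject_points=2):
    """checks: booleans for the phases of objective inspection
    (pre-check, super-observation cross-validation, analysis
    cross-validation), True meaning the observation FAILED that phase.
    One point is awarded per failure; once the total reaches
    reject_points the datum is removed from subsequent processing.
    Returns True if the datum is retained."""
    points = 0
    for failed in checks:
        if failed:
            points += 1
        if points >= reject_points:
            return False  # datum removed
    return True  # datum retained

# Failing a single phase is survivable; failing two is not:
print(bmrc_style_screening([True, False, False]))  # → True
print(bmrc_style_screening([True, True, False]))   # → False
```

Setting reject_points to 1 corresponds to the option, mentioned in the text, of removing data after failure at a single level.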

The Group discussed briefly whether the BMRC flags should be included in the WOCE dataset returned to NODC. Because of the entirely different approach to quality control to the Regional Centres it was felt that this would not be appropriate. It was agreed that the BMRC flags should be supplied to MEDS to aid in track monitoring and data quality intercomparisons (see Section 3).

2.3 Pacific Ocean DAC

W. White described the procedures used at the JEDA Centre; a full description is given in Appendix F . Unlike the other Regional Centres, this DAC does not examine each individual profile. Instead, after the data have undergone duplicate tests, the first step of the procedure marks suspect profiles which fail the temperature anomaly test. In this test anomalous temperatures are compared with gridded historical subsurface temperature anomaly distributions; temperature outliers are marked as suspect. A local bimonthly climatology has been developed based on historical data for this procedure.

The second step is interactive; the suspect profiles are examined by an expert. The profiles are displayed with the climatology and the expert makes a decision about the validity of the profiled temperatures based on neighbouring profiles and the climatology. Contoured grids of bimonthly anomalous Tav (0/400 m), anomalous Tav (0/400 m) error fields, and anomalous SST fields are examined. The expert looks for "bull's-eyes", large horizontal gradients aligned with VOS routes and contours which seem to reveal thermal structure at variance with previous oceanographic knowledge. These mapping procedures also produce objective estimates of interpolation error which are subjectively examined by a scientist for inconsistencies in the resulting patterns. Flags are set by the expert at 11 selected levels in each profile and a single flag given to the profile as a whole.

At the meeting of the Coordination Group in Tallahassee in February 1990 (WOCE Report No 45/90) it was agreed that the Regional Centres would examine each individual profile and flag each temperature-depth pair. The meeting expressed concern that the Pacific DAC was not at present following these guide-lines and recommended that they should do so as soon as possible. [Action Item 2]

2.4 GTSPP

R. Keeley outlined the activity of the GTSPP since January 1990; the full report is in Appendix G. The expansion of the project to include a total of 7 member states has led to an increase in the number of observations received each month through the GTS. The GTSPP has also attracted more data to the project, in particular declassified navy XBT data. Analyses of the real-time data flow have allowed centres receiving real-time GTS data to discover and fix data routing problems. The GTSPP is preparing to review and modify the quality control procedures in the light of experience gained, and to specify modifications and new tests required to manage data received in delayed mode.

2.5 Discussion

The discussions following the presentations focused on the details of the procedures used at individual centres and the effects these may be having on the levels of flags being assigned to the data. More details on the implications regarding the flags can be found in Section 6. Meanwhile some important recommendations regarding standardization were reached.

The Group felt that it was an appropriate time to write a formal definition of the scientific quality control they are seeking to implement. A good deal of experience has been accumulated particularly in the last 2 years and it is clearer now what is appropriate.

"Scientific Quality Control for WOCE XBT data:

Quality control of XBT profiles requires distinguishing between real vertical structure (such as internal waves, interleaving water masses and other small-scale processes) and the features in a profile that are due to malfunctions of the instrument, which sometimes resemble real features. We are developing a procedure to do this which relies on assessment of a profile in comparison to the best representation of the oceanographic state determined from neighbouring and historical data. The goal is to assess each new temperature/depth pair as "good", "probably good", "probably bad" and "bad" in a way that is as repeatable as possible, taking into account that a data quality expert will make subjective decisions based on pattern recognition of horizontal and vertical structure. To this end, it is necessary to develop a global catalogue of the vertical features (real and erroneous) that regularly appear in profiles and a procedure to incorporate the statistical information derived from objective mapping of the temperature field. The quality control procedures will reflect as far as possible an overall scientific judgment based on objective assessments and subjective evaluation of all information relevant to a particular measurement profile."

The next issue concerned the climatology used in histogram or sigma tests at all centres; the Group wanted to ensure that a standard climatology is used, without losing valuable local knowledge. Instead of adopting a single climatology such as the one developed by the JEDA Centre, all centres will use the Levitus climatologies as the first test, supplemented by a second test with their own local climatology based on the developing time series. The Levitus climatologies will be replaced by Levitus II when they are published. The local climatologies used must be well documented so end users of the data sets know what limits were used. [Action Item 3]
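The agreed two-stage test might look like this at a single depth level. The sigma limits shown are placeholders: the report fixes only the order of the tests (Levitus first, then the local climatology) and requires that whatever limits a centre uses be documented.

```python
def two_stage_climatology_test(obs, levitus, local,
                               n_sigma_global=4.0, n_sigma_local=3.0):
    """obs: observed temperature at one level; levitus and local are
    (mean, std) pairs from the two climatologies. Returns 'pass', or
    the name of the first test the value failed."""
    mean, std = levitus
    if abs(obs - mean) > n_sigma_global * std:
        return 'levitus'
    mean, std = local
    if abs(obs - mean) > n_sigma_local * std:
        return 'local'
    return 'pass'

# A value consistent with the broad Levitus envelope can still fail the
# tighter, locally derived limits:
print(two_stage_climatology_test(26.0, (24.0, 1.0), (24.5, 0.4)))  # → local
```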

The Group was concerned that there was a certain amount of duplication of work, particularly regarding the elimination of duplicates from the datasets. GTSPP carry out duplicate checks before passing the data to NODC and each Regional Centre also has their own duplicate check; this may not be necessary. At present, for example, AOML finds approximately 10-15 duplicates out of 1000 profiles per month. This 1-2% of duplicates missed by GTSPP is an acceptable rate and Regional Centres which do not merge the GTSPP datasets with additional datasets do not need to re-do a duplicate test.
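A basic duplicate check of the kind run at each centre can be sketched by keying profiles on call sign, time and rounded position. The key fields and the 0.01-degree position tolerance are illustrative assumptions; operational checks typically use fuzzier matching than exact key equality.

```python
def find_duplicates(profiles, pos_tol=0.01):
    """profiles: dicts with 'callsign', 'time', 'lat' and 'lon'.
    Profiles sharing call sign, time and position (binned to pos_tol
    degrees) are reported as duplicates of an earlier profile."""
    seen = {}
    dupes = []
    for i, p in enumerate(profiles):
        key = (p['callsign'], p['time'],
               round(p['lat'] / pos_tol),   # integer position bins
               round(p['lon'] / pos_tol))
        if key in seen:
            dupes.append(i)  # index of the later, duplicate profile
        else:
            seen[key] = i
    return dupes
```

One limitation of simple binning is that two reports of the same drop near a bin edge can fall into different bins; this is one reason residual duplicates (the 1-2% noted above) slip through any single check.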

The issue of standardization of procedures amongst the participating centres has been discussed at previous meetings. Up until now it has been felt that standardization was premature because the system was still developing. The Group has now decided that the principle of standardization of procedures is desirable and moves to achieve that should begin as soon as possible. Details of how this will be achieved are described in Section 6.3.

3. QUALITY CONTROL OF REAL-TIME DATA

The requirement for WOCE is generally for the full resolution delayed mode data, but any real-time data which are not replaced by delayed mode data are also included in the WOCE dataset and quality controlled with the delayed mode data. The real-time data need to be examined soon after they are collected in order to identify vessels making consistent errors and to monitor the status of the network. AOML produce monthly analyses from the real-time data which are used precisely for this purpose and the IGOSS Operations Coordinator, using that information, has had some success with contacting badly performing platforms to alert them of the errors. The JEDA Centre has also been developing summaries of track and platform performance from the real-time data which provide further useful information.

With this in mind, the Group questioned the need for educated scientific judgment during quality control of real-time data prior to the quality control of the entire dataset in delayed mode. The required information as described above is essentially a managerial tool and it is suggested that it could be provided by a data centre such as MEDS. R. Keeley indicated that the GTSPP could provide this kind of information from its own quality control processes. The Group recommended that GTSPP take over the function of providing ship and track information from the real-time data from 1 January 1994, when quality control of real-time data at the Regional Centres will cease. The remainder of 1993 will be used as a spin-up period for GTSPP; the Regional Centres, BMRC and AODC (for the Indian Ocean) will forward their flags and analyses to GTSPP during that period for comparison. GTSPP will forward the resulting analyses to the IPO and the IGOSS Operations Coordinator for action. [Action Item 4]

4. THE GLOBAL SUBSURFACE DATA CENTRE

The Group reviewed the document submitted to the meeting by J-P. Rebert who was unfortunately unable to attend. The document can be seen in full in Appendix H. The Group was pleased to note that the Global Centre will be operational in June 1993. The issue of deleting or reducing data raised by the document began a lengthy discussion within the group. Rebert notes that the amount of information retained in the database must be reasonable in size if the timeliness requested by end users is to be maintained, and mentions reducing data to achieve that end. The document also mentions the common practice within the community of deleting "bad" data.

The issue which relates directly to the Global Centre concerns the amount of data and metadata which is to be archived at the Centre. The Group stressed the importance of the history file which will include the vital information on original values of data (particularly header information) if there have been alterations, and information codes explaining why particular classes of flags were chosen (see Section 6.3). The history code is amended each time a profile passes through a quality control procedure. WOCE requirements are for full resolution data and it is these data which the Regional Centres will concentrate their efforts on, and these data which must be archived at Brest and ultimately in the World Data Centres.

Therefore, it is not acceptable for metadata or profile data to be modified or reduced once those data have been processed by the Regional Centres. The Group recognized the logistical problems of maintaining a database of this size but urge the Global Centre to reconsider the policy outlined in the document. The AODC have experienced similar difficulties with large amounts of data using the ORACLE data base management system, and are willing to discuss with the Global Centre the ways in which they overcame those difficulties.

Between the meeting and the writing of this report, the Global Centre has assured the Group that appropriate action will be taken to ensure all the information provided by the Regional Centres is properly archived without any loss.

5. RETAINING OR DELETING DATA

A discussion developed on a general policy regarding the deletion of so-called "bad" data. Some, if not most, operators delete a certain amount of their data before they are passed to national data centres and so to the UOT/DAC system. For example, totally spurious profiles and bad "tails" after spikes may be deleted; many remove minor spikes and interpolate the data in their place, leaving no indication there was a spike at all, even though some scientists believe data below spikes are likely to be incorrect. It is common to delete real repeat profiles too. The Group considered whether this practice is desirable and concluded that it was not. Outliers may be real, spikes may distort following data, data believed to be incorrect may in fact be correct, etc. The Group agreed the most appropriate course of action was to flag dubious data in profiles and to retain original values in the history file if information is altered. This should be the case at all stages of processing the data, from the originator through to the final archive. The Group developed a policy on maintaining the integrity of XBT data:

The WOCE UOT Data Assembly Centres recommend as a general principle that reduction or removal of information from ocean thermal data should be avoided. The originators of data should be encouraged to submit all data at the original resolution, as far as is practical, regardless of quality. Header or profile information that is modified must be recorded in history metadata so that the original information can be retrieved. [Action Item 5]
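The history requirement of the policy amounts to never overwriting a value without recording the original alongside the reason for the change. A minimal sketch, assuming a simple dictionary record layout; the actual exchange and history formats are defined by GTSPP, not here.

```python
import datetime

def amend_field(record, field, new_value, reason, operator):
    """Change a header or profile field while recording the original
    value in the record's history metadata, so the original can always
    be retrieved and the change reversed."""
    record.setdefault('history', []).append({
        'field': field,
        'original': record.get(field),
        'new': new_value,
        'reason': reason,
        'operator': operator,
        'date': datetime.date.today().isoformat(),
    })
    record[field] = new_value
    return record

# Correcting a header position while preserving the reported value:
record = {'lat': -42.5}
amend_field(record, 'lat', -24.5, 'transposed digits in reported latitude', 'QC expert')
```

Each pass through a quality control procedure appends to the same history list, matching the requirement that the history record be amended at every stage of processing.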

6. QUALITY CONTROL FLAGS

6.1 Comparison of Flags

As previously described, each Regional Centre has developed different subjective systems for the scientific quality control. One way to determine the effect of the differing systems on the classes of flags assigned to the data is to compare flags produced by the Regional Centres for a common dataset. Two months of real-time data have been processed by the JEDA Centre and AOML for this purpose (Atlantic, September - October 1992) and the flags submitted to Brest via NODC. The outcome of the comparison is awaited by the Group and particularly by NODC who do not wish to receive and integrate flagged data if problems remain. The Group recognized that a comparison based on real time data will not be a good intercomparison of the different centres because many of the difficult features in XBT profiles appear only in the full resolution (i.e. 0.6 m) form of the data, and they get filtered out in a BATHY message. However, the Group urged the Global Centre to complete this analysis as soon as possible. [Action Item 6]

The Group noted that now that the Indian Ocean DAC has begun to operate, a comparison of flags from all three Regional Centres is desirable. It was agreed that two months of delayed mode data from the Indian Ocean from 1990 will be quality controlled by each Centre and the resulting flags compared. [Action Item 7]

It came to light during the meeting that the approximate percentage of profiles given class 3 or 4 by the JEDA Centre (30%) is considerably higher than the percentage at CSIRO and AOML (10%). The Group agreed it would be an interesting exercise to compare statistics on numbers of profiles given each class of flag and why. This would provide general guidance as to whether the centres were performing along the same lines, and act as a precursor to the more detailed comparison outlined above. [Action Item 8]
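The statistics exercise proposed here is straightforward once flags are assigned. A sketch, assuming integer flag classes 1-4 with one overall class per profile:

```python
from collections import Counter

def flag_class_percentages(flags):
    """flags: one integer flag class per profile. Returns the
    percentage of profiles assigned each class."""
    counts = Counter(flags)
    total = len(flags)
    return {cls: 100.0 * n / total for cls, n in sorted(counts.items())}

# A centre flagging 3 of 10 profiles as class 3 or 4:
print(flag_class_percentages([1, 1, 1, 1, 1, 1, 1, 3, 3, 4]))
# → {1: 70.0, 3: 20.0, 4: 10.0}
```

Comparing such tables across centres (here, roughly 30% versus 10% in classes 3 and 4) is exactly the kind of coarse consistency check the Group proposes as a precursor to the detailed profile-by-profile comparison.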

6.2 MEDS Analysis of AOML Flags

MEDS has analyzed flags produced by AOML by comparing them to their own flags. The full description of the analysis is in Appendix I. Although the flags were produced by AOML some time ago and their system has developed further since then, the comparison is still useful. The major difference between AOML and MEDS flags was in the flags set on the profiles. MEDS would set an automatic flag of class 2 if the profile failed the Levitus climatology, but often AOML would reset the flag to class 1. The next biggest difference is with features which have not failed the objective tests but which AOML classes as doubtful. A specific example is when there is an inversion in the lower part of the profile. AOML may flag this as class 3 after comparing to profiles from the same cruise individually and in temperature sections. MEDS have software which allows them to display historical profiles from the same one degree square in the same month or one month either side. With the benefit of these historical records, the operator may agree with the class 3 or set a class 1. The Group noted the usefulness of this ability to call up historical records. AOML are developing a similar capability. Other examples are provided in Appendix I.

The Group was interested to note that MEDS have two kinds of automatic flags which may be set during their procedure; class 2 if the profile fails the Levitus climatology and class 3 if a BATHY has three data points in a row at the same value (noting that BATHYs usually consist of inflexion points and this would indicate a problem). The Group felt that it would be an improvement if the system alerted the operator to such failures rather than automatically assigning a doubtful flag, and allowed the operator to make the decision. [Action Item 9]
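The MEDS constant-value test can be sketched as a scan for a run of identical temperatures. The run length of three follows the text; the function name and list layout are illustrative, and (per the Group's recommendation) the result would be presented to an operator rather than used to set a flag automatically.

```python
def has_constant_run(temps, run_length=3):
    """Flag a BATHY profile containing run_length consecutive identical
    temperatures. BATHY messages usually report inflexion points, so a
    run of equal values suggests an instrument or transmission problem."""
    run = 1
    for prev, cur in zip(temps, temps[1:]):
        run = run + 1 if cur == prev else 1
        if run >= run_length:
            return True
    return False

print(has_constant_run([20.1, 19.5, 19.5, 19.5, 18.0]))  # → True
print(has_constant_run([20.1, 19.5, 19.5, 18.0]))        # → False
```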

6.3 Standardization of Processes and Flags

As mentioned in Section 2.5 the standardization of processes is now considered desirable by the Group. The Group was impressed by the Australian interactive system and the expertise used to operate it. One of the important features of the system is the ability to include, in the history group, feature codes to identify oceanographic features and instrument malfunctions, and in doing so, giving the reason for the class of flag assigned. The Group noted the usefulness of having this information and agreed to adopt the principle that the history record contains codes to explain why each flag was chosen.

It was discussed whether the Australian system could be used by all the Regional Centres to ensure standardization, and the Group recommended that a system with the ability to include malfunction codes in the history group (for example, the Australian one) be used at all the Regional Centres. The JEDA Centre and AOML estimated having similar systems operating by the end of 1993, either by adopting the Australian system or modifying their own. [Action Item 10]

CSIRO has developed an extensive list of feature codes which is documented, with examples, in the QC Cookbook. Their flags, which include those for real features, capture the thought processes of the data quality expert as he/she works through the inevitably subjective decisions that must be made. In the future it is hoped that the process will be less subjective and that the process becomes more repeatable with experience; flagging real features is part of what is needed to move in that direction. It was also noted that the detailed flags might be useful in the future as artificial intelligence is applied to develop the expert system. AOML and the JEDA Centre expressed concern at the extra time and work including the detailed flags (particularly those for real features) would require and noted that they would at first concentrate on the common instrument malfunctions.

7. THE 1990 DATASET

7.1 Status at Regional Centres

AOML were ready to begin processing the 1990 delayed mode data set immediately but were holding off until a further 7000 profiles (globally) received at NODC could be transferred to them. The JEDA Centre have completed the processing using the system they have developed (see Section 2.3), but not to the level required by WOCE. However, with the new principles described in Section 6.3 which have been adopted by the Group, there will be a further delay before the 1990 data set is processed, while the new systems are implemented. The Group discussed whether it would be better to process and archive the 1990 data set now and use the new systems with the 1991 set. It was decided that since at present only a small portion of the 1990 set is delayed mode (approximately 30%) it would be more appropriate to wait a few months and apply the new system to the 1990 data. All Centres agreed the 1990 data set would be processed by the end of 1993. [Action Item 11]

7.2 Timely Submission of Delayed Mode Data

Delayed mode data are taking sometimes up to several years to reach NODC (Figure 1). The IOC/IODE guide-lines, which WOCE has adopted, state that delayed mode data should be submitted to the data centres within 1 year of collection. This clearly is not happening with the majority of the XBT data and the worry is that the DACs will be processing a largely incomplete data set and that re-analysis will have to be considered not long after the initial analysis, and perhaps even more than once.



Figure 1. Real time and delayed mode data making up the GTSPP data base

PIs and operators who are slow to submit delayed mode data can be identified if they have submitted real-time data. These operators must be contacted and encouraged to submit their full resolution data in a timely fashion. NODC will identify the ships for which real-time data, but not delayed mode data, have been submitted for 1990 and 1991. The IPO and NODC will identify and contact the groups or individuals who operate these ships. [Action Item 12]

8. FEEDBACK

8.1 Operators Handbook

The Group felt it was possible that many operators had problems, or did not behave exactly as WOCE or TOGA would like them to, because although individual organizations may provide their ships with documentation, there was no single document describing how a WOCE/TOGA operator should operate. It was agreed that such a document would be a practical aid to managing the system. The document should help operators recognize problems with profiles and list specifics about data submission, and essentially would bring operators and scientists closer together. The Group recommend that a WOCE/TOGA XBT Operators Handbook be written. [Action Item 13]

8.2 Network Monitoring

It is important to know from the real-time data soon after it is collected whether the TWI lines are being sampled sufficiently for the WOCE and TOGA requirements. Previous IGOSS reports have attempted to supply this information on a monthly basis, but have not been successful because of the difficulties some operators have in obtaining that information on a monthly basis. The IGOSS Ship-of-Opportunity Programme Meeting (23-26 March 1993, Hobart) designed two new reports which separate real-time and delayed mode information. The real-time report will consist of the number of individual messages by call sign received at each GTS centre and will be produced every month. The delayed mode report will be produced six-monthly and will include information on each TWI line; for each call sign, the ship name, number of good profiles, number of BATHYs transmitted and number of sections completed. The latter report will provide the delayed mode monitoring information WOCE requires if every participating group contributes.

C. Noe supplied the meeting with copies of the NOAA/NOS summaries of SEAS data for 1992 and all real-time data for 1992 (NODC data set). The semi-automated system assigns the data to TWI lines and the Group noted the usefulness of the system and the information it provides. The Group recommended NOS consider the possibility of using the system to analyze further datasets such as the 1990 delayed mode set.

The Group identified a gap in the proposed reporting; that of a summary of activity on each line on a monthly basis. NODC will provide this information from the GTSPP dataset on a monthly basis. The report will be combined with the ship performance report by MEDS and distributed each month.

9. DATE OF NEXT MEETING

The Group noted the benefits of having the next coordination meeting back-to-back with the TOGA/WOCE XBT/XCTD Programme Planning Committee (TWXXPPC) and the next GTSPP meeting, possibly in Spring 1994. The Group agreed it would be beneficial to hold the meeting at one of the operational centres. R. Keeley indicated the next GTSPP meeting is being planned for the autumn of 1993.

10. LIST OF ACTION ITEMS

1. Pacific and Atlantic Regional Centres to produce a "cookbook" documenting their regional expertise.

2. Pacific DAC to modify procedure to include examination of each individual profile not just suspect profiles, and to flag each temperature-depth pair.

3. All Regional Centres to adopt Levitus climatologies in histogram/sigma test, followed by similar test using locally accumulated climatology.

4. IPO to convey to the WOCE SSG the recommendation that quality control of real-time data cease at the Regional Centres at the start of 1994 from when the GTSPP will provide the necessary ship and track performance information.

5. IPO to convey to the WOCE SSG the recommendation regarding maintaining the integrity of XBT data.

6. Global Centre to complete comparison of the JEDA Centre and AOML flags.

7. Two months of 1990 delayed mode data for the Indian Ocean processed by all three Regional Centres to be subjected to flag comparison analysis (MEDS).

8. All Regional Centres to provide the Group with statistics illustrating overall percentage of profiles given each flag class and which tests are commonly failed.

9. GTSPP to consider modifying their quality control procedure to eliminate some automatic flagging and install a warning facility for the operator should a profile fail an objective test.

10. Atlantic and Pacific DACs to develop quality control systems which enable codes identifying malfunction features to be included in the history group, thus explaining the flags.

11. All Regional Centres to complete the processing of the 1990 dataset by the end of 1993.

12. IPO and NODC to identify and contact ships and operators slow to submit delayed mode data, with a view to increasing the amount of delayed mode data submitted to NODC within the recommended 1 year period.

13. TWXXPPC to consider drawing up a WOCE/TOGA XBT Operators Handbook (IPO to coordinate).