Monday 29 November 2010

Brockmann Consult / University of Leicester

Gary Corlett (UoL), Norman Fomferra, Martin Böttcher, Ralf Quast, Marco Zühlke (BC)
Telecon with Chris Merchant near end of their discussions on 26 November



GC and BC staff met at BC from 24 – 26 Nov in order to lay down the system requirements for the SST Multi-sensor Match-up System (MMS) and to sketch a rough development plan for its first version, to be ready by the end of Jan 2011.

On day 1, GC gave an overview of the SST cal/val work and the role of the MMS in the SST_cci project. Various related projects were discussed, especially the (A)ATSR L2P Project and GHRSST, with respect to their match-up databases.

For the subsequent discussions on the MMS requirements, the team identified the possible MMS users and the various datasets the system creates, and agreed on the following definitions of terms:

·       MD – Single-sensor match-up dataset, usually a NetCDF file comprising single-sensor match-up records. Each record pairs a single reference observation (in situ; in high-latitude cases a "dummy in-situ" with an invalid SST value) with a single satellite observation. MDs are ingested by the MMS and are versioned and archived for full backward traceability.

·       MMD – Multi-sensor match-up dataset comprising multi-sensor match-up records. Each record is a coincidence of a single in-situ (or dummy) observation with multiple satellite observations and auxiliary data. MMDs are extracted from the MMS as NetCDF files in a well-specified format.

·       MMS database – The database that stores the in-situ data, the searchable fields of all ingested MDs, and all detected coincidences (in-situ-to-satellite and satellite-to-satellite match-ups).
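The record structures defined above can be sketched as follows. This is a minimal illustration only: the field and sensor names are assumptions for the example, not the agreed MMD format, and a real MMD record would carry sub-scenes and many auxiliary fields rather than a flat list.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class SensorRecord:
    """One satellite observation contributing to a match-up (illustrative fields)."""
    sensor: str                     # e.g. "AATSR" or "SEVIRI" (example names)
    time: float                     # observation time, seconds since some epoch
    brightness_temps: List[float]   # per-channel BTs for the extracted sub-scene

@dataclass
class MultiSensorMatchup:
    """One MMD record: a single in-situ (or dummy) reference plus coincident sensors."""
    reference_sst: Optional[float]  # None models the invalid SST of a "dummy in-situ"
    lat: float
    lon: float
    time: float
    sensors: Dict[str, SensorRecord] = field(default_factory=dict)
```

A single-sensor MD record corresponds to the case where `sensors` holds exactly one entry; the multi-sensor MMD record allows several.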

Day 2 was fully dedicated to gathering system requirements, which were recorded in the SST_cci Toolbox wiki at https://github.com/bcdev/sst-cci-toolbox/. GC will integrate these into the Product Validation Plan document.

IR SST algorithm improvements take place from project kick-off+6 onwards, so a first version of the data set deliverable (DBT2) to support these improvements will be made available to algorithm developers. This will be the Algorithm Testing MMD, an extract from the MMS database, which shall be ready by end of Jan 2011. It will contain prescribed "test" match-ups (for algorithm development) and "training" match-ups (for fair algorithm assessment). It will exclude some match-ups designated as "reference" match-ups, which will form the Reference MMD later in the project.

The first version of the MMS will have a simple command-line interface for ingestion and extraction of data and for database inspection. As part of the ingestion process, the MMS will compute coincidences from MD input files. In addition to the coincidence criteria in which there is always an ATSR observation, it was agreed that triplets of Metop + SEVIRI + PMW should also be collected. During collocation, the MMS will assign additional satellite and auxiliary data, including interpolated NWP values, aerosol and sea-ice data. For each match-up, a 24-h in-situ history (±12 h around the satellite observation time) will also be included. Investigations will be made into letting the MMS directly call RTTOV-10 (simulation model) and ingest the RTTOV-10 output automatically back into the MMS database.
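The coincidence computation and the ±12 h in-situ history extraction described above can be sketched as below. The time and distance window sizes are placeholders for illustration, not the coincidence criteria agreed for the MMS, and the brute-force pairing stands in for whatever indexed search the database will provide.

```python
from datetime import timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def find_coincidences(in_situ, satellite, max_hours=3.0, max_km=10.0):
    """Pair each in-situ record with satellite records inside a time/space window.
    Records are dicts with 'time' (datetime), 'lat', 'lon'; window sizes are
    illustrative defaults, not the agreed MMS criteria."""
    matches = []
    for ref in in_situ:
        for sat in satellite:
            hours = abs((sat["time"] - ref["time"]).total_seconds()) / 3600.0
            if (hours <= max_hours and
                    haversine_km(ref["lat"], ref["lon"], sat["lat"], sat["lon"]) <= max_km):
                matches.append((ref, sat))
    return matches

def in_situ_history(records, sat_time, window_hours=12):
    """All in-situ records within +/- window_hours of the satellite observation time."""
    lo = sat_time - timedelta(hours=window_hours)
    hi = sat_time + timedelta(hours=window_hours)
    return [r for r in records if lo <= r["time"] <= hi]
```

The same window logic would apply to the agreed Metop + SEVIRI + PMW triplets, with one satellite stream acting as the reference in place of the in-situ record.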

There are still unknowns regarding the physical deployment of the MMS to the ECDF (the Edinburgh computing cluster). In the next few days, BC will compile the external system requirements (e.g. DBMS, web server, ssh, Java JDK) and send them to Mark Filipiak and Owen Embury. A telecon will be held on Dec 13, 10:30 (UK time) dedicated to addressing the MMS/ECDF requirements.

Day 3 was used to review the requirements and develop a detailed action list in order to meet the MMD delivery at KO+6, by end of Jan 2011. The immediate actions for Dec 2010 are:

BC: Send external system requirements to Mark and Owen
BC: Update BEAM to also read NOAA AVHRR GAC
BC: Provide existing (A)ATSR childgen tool to Gary
GC: Describe data items to be used for the first month of data
BC: Develop flexible database schema which lets us add fields at any time
BC: Develop ingestion function to convert MD files and compute coincidences
BC: Develop extraction function to generate an MMD from a given query
GC: Provide BC with one month of data (all different types)
GC: Convert MD files to monthly, reformat old SEVIRI to NetCDF
GC: Update PVP document with MMS requirements
CM: Provide BC with the RTTOV-10 configuration
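One common way to meet the "add fields at any time" requirement in the action list above is to keep a generic per-match-up field table alongside the core table (an entity-attribute-value layout). A minimal sqlite3 sketch follows; the table and field names are illustrative, not the actual MMS schema, and the production system would use the cluster's DBMS rather than SQLite.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE matchup (
        id   INTEGER PRIMARY KEY,
        lat  REAL NOT NULL,
        lon  REAL NOT NULL,
        time TEXT NOT NULL          -- ISO 8601 observation time
    );
    -- Generic field table: new per-match-up fields (NWP values, aerosol,
    -- sea ice, ...) can be added at any time without altering the schema.
    CREATE TABLE matchup_field (
        matchup_id INTEGER REFERENCES matchup(id),
        name       TEXT NOT NULL,
        value      REAL,
        PRIMARY KEY (matchup_id, name)
    );
""")
conn.execute("INSERT INTO matchup (id, lat, lon, time) "
             "VALUES (1, 56.2, -3.1, '2010-06-01T12:00:00Z')")
# A field nobody anticipated at schema-design time, added later:
conn.execute("INSERT INTO matchup_field VALUES (1, 'nwp.sea_ice_fraction', 0.0)")
row = conn.execute("SELECT value FROM matchup_field "
                   "WHERE matchup_id = 1 AND name = 'nwp.sea_ice_fraction'").fetchone()
```

The trade-off is that queries over such fields join through the side table, so frequently searched fields may still deserve promotion to real columns.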

CM discussed the progress with GC and the BC staff in a telecon during day 3. 

DONM by telecon, Dec 13, 10:30 (UK time), dedicated to addressing the MMS/ECDF requirements

Wednesday 17 November 2010

Met Office Telecon 5

Me, Nick Rayner, Simon Good by Skype
17 Nov 2010

NR has received feedback from ESA (Craig Donlon) on the User Requirements Document. Several helpful changes to the URD have been requested, and will be addressed by the end of November, at which point it will be approved and published on the website.

The next stage is to use the URD to inform the Product Specification Document. We reviewed the WP description, discussed this, and there are essentially two classes of approach. 1. Define some objective criteria to apply to the URD data to create specifications. 2. Work from the PSD implicit in ESA's SoW and our proposal, and look for discrepancies with what users have asked for. NR and SG propose to do both approaches and see how the outcomes compare, which I think is an excellent approach. In both cases, the results then have to be appraised for technical feasibility (can specified products be created in principle?) and resource feasibility (can we do it within the plan for the SST CCI?).

SG aims to draft the PSD for team review by the end of November, for a telecon discussion early in December. Interactive discussions will be solicited from users, planned for early January. Issue A will be submitted to ESA at end of January.

CD has sought advice on climate user requirements prior to attending a GCOS session in January. We agreed the CCI approach to this would be structured as follows: 1. NR and SG will complete (and expand if necessary) the template for requirements circulated by ESA as a basis for summarising our advice, by 3 December, and pass to me for review and agreement. 2. I will add any commentary appropriate from Science Leader's perspective in my submission to CD.

Plans for the January and July project meetings were discussed.

DONM Fri 10 Dec 10.00

Tuesday 16 November 2010

Leicester Telecon 3

Chris Merchant, Mark Filipiak and (by telephone) Gary Corlett

GC reported that the Data Access Requirements Document is essentially assembled, comments having been received back, including the addition of data of relevance to high latitudes recommended by Scandinavian colleagues.

The outline of the Product Validation Plan has been sent to parties who are contributing parts of that plan.

Regarding the multi-sensor match-up data (MMD), all the required inputs from Meteo France are collected and complete up to September 2010. The creation of the ATSR matches to different in situ data sources is almost completed, including the additional data sets from Scandinavian colleagues. The tools to associate matches with ECMWF fields will need to work on ERA Interim data at 1.5 deg resolution (currently available) in the first instance, and therefore will need to be coded flexibly to deal with ECMWF data of higher resolution later. GC is going to Germany to work through the MMD coding plan with Norman Fomferra and the Brockmann Consult team next week. CM and GC agreed that, for the passive microwave sensors, we would simply use the L2P version throughout, since we need to know the L2P product confidence flags for comparison with the new product confidence algorithm that will be developed within the project.
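The resolution-flexibility requirement for the ECMWF association tools can be met by parameterising the grid spacing rather than hard-coding 1.5 deg. A minimal bilinear-interpolation sketch under that assumption (the grid layout and argument names are illustrative, not the actual tool's interface):

```python
def bilinear(field, lat0, lon0, step, lat, lon):
    """Bilinear interpolation on a regular lat/lon grid.
    field[i][j] holds the value at (lat0 + i*step, lon0 + j*step);
    step is the grid spacing in degrees (e.g. 1.5 for ERA Interim now,
    finer for later ECMWF data), so the same code serves any resolution."""
    fi = (lat - lat0) / step          # fractional row index
    fj = (lon - lon0) / step          # fractional column index
    i, j = int(fi), int(fj)           # cell corner
    di, dj = fi - i, fj - j           # offsets within the cell, in [0, 1)
    return ((1 - di) * (1 - dj) * field[i][j]
            + (1 - di) * dj       * field[i][j + 1]
            + di       * (1 - dj) * field[i + 1][j]
            + di       * dj       * field[i + 1][j + 1])
```

Time interpolation between 6-hourly ERA Interim analyses would wrap the same idea in one more dimension.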

Regarding data set collection, the sourcing of AVHRR GAC is as yet unresolved. Supply from CLASS via Ken Casey would take >10 months. The CLASS archive is unable to ship it directly on tape because of resource limitations at their end. The ECMWF copy requires special access, and GC will arrange a speed test to UoL.

The question of why the SCOPE-CM BTs are significantly different from expectations is also pending. But it does seem that the initial estimates of the availability of SCOPE-CM data were too optimistic.

Decisions:
MF will establish what GAC CLASS could provide by end Jan 2011, raising possibility of disk instead of tape.
MF will start downloading 2010 GAC data from CLASS web ordering and ftp for NOAA-15,16,17,18,19 and Metop-2, which alone is estimated to require 2 months.
GC will arrange a speed test for downloading GAC from ECMWF to UoL.
MF will document SCOPE-GAC comparisons further.

DONM: Telecon with GC at Brockmann Consult Friday 26th Nov at 10 UTC.

Wednesday 10 November 2010

NASA SST Science Team

Seattle, 8-10 November.
Chris Merchant representing SST CCI.

A real scientific buzz at this NASA SST meeting, with lots of good progress reported, particularly in understanding SST variability across the full range of time scales and space scales (literally from O(1 s, 1 cm) to O(year, global)), and thinking through how that affects our statements about SST and its uncertainty in remotely sensed products.

I (CM) presented the principal concepts underlying the approach in the SST CCI to the long-term climate data record (CDR). Generated quite a lot of questions and discussion. Generally, the idea of tying up the ATSR and AVHRR time series via radiance cross-calibration was welcomed as an innovative approach. It is different to the US route, and this is seen as helpful. AVHRR folks (Ken Casey, Bob Evans, Sasha Ignatov) seem keen to interact with us during the SST CCI, which is excellent.

I tried to communicate some of the subtleties about assessing CDR quality using the range of measures including formal independence of the information content as well as the traditional accuracy, resolution, etc. I presented the opportunities for involvement, including the algorithm selection process via the round robin. Sasha is interested in the Algorithm Selection exercise, and we discussed the contents necessary for his group to participate efficiently using their new "hybrid" algorithm.

Off-line discussions with Ken about transferring the AVHRR archive across the Atlantic have sparked some download experiments using alternative routes, but none so far have given us the data rate we need (by a factor of three or four too slow). Still need to pursue all possibilities.

Carol-Anne Clayson and Alexander Soloviev showed diurnal warming modelling results that explain the 5 pm kink in the average diurnal cycle in SST-skin: as solar insolation reduces there is a transition in the depth communicating with the atmosphere, accompanied by a period of turbulent kinetic energy increasing temporarily over a greater depth. Thereafter, the thermal inertia being cooled is greater, and so the rate of cooling diminishes to the rate it sustains during the night. The diurnal cycle shape seen in their idealized (constant wind speed) model runs is essentially identical to the global average diurnal cycle shape Mark Filipiak gets from his analysis of Pierre Le Borgne's SEVIRI observations, including the peak time and the time of this TKE-related transition. They also had different explanations for the observation that the >4 K diurnal events all seem to happen in the mid-latitudes, not in the tropics. Carol-Anne finds a role for the extended length of day during mid-latitude summer, while Alexander reckons they happen when the air-sea temperature difference is more positive than typically seen in the tropics. The latter idea certainly fits with the tendency of the >4 K event distribution to be nearer land masses, and could easily be tested by getting the NWP data for the events.

Some discussion from Dudley Chelton and Dick Reynolds about how sampling limits the true information in L4 analyses at higher spatial frequencies, this being relevant to the satellite-only OSTIA analysis we will generate in the SST CCI. Seems to me, the key thing is to construct proper, dynamic uncertainty estimates, including covariance effects, rather than to smooth the data, which is Dudley's recommendation. Although, for those interested in SST gradients rather than absolutes, they did show that under-sampling can introduce big problems (spurious gradients particularly from the patchiness of IR data).

Lots of enthusiasm for coming to next year's GHRSST meeting that we are hosting in Edinburgh -- should be a great event.

Wednesday 3 November 2010

Leicester Telecon 2

Gary Corlett and Chris Merchant by telephone
2 Nov 2010

GC has circulated the draft Data Access Requirements Document round the project team, for comments and completion of some missing/difficult-to-source information about data sets. Replies are solicited for 10 Nov, the Met Office input being key. CM asked GC specifically to request also the DMI and met.no colleagues to review the sea ice / high latitude contents of the data being sought.

Key problem/uncertainty is the best source for AVHRR. First look shows >1 K differences in retrieved SST when using SCOPE-CM BTs compared to NOAA GAC, and also scan-line striping. Use of the cloud information seemed feasible: there are obvious failures to detect cloud in the SCOPE-CM data that are absent from NOAA GAC, but the reverse is also true. It was agreed that a telecon between Leicester (GC), Edinburgh (Mark Filipiak) and the SCOPE CM contact is needed to address the following:

1. Why the BTs are different by >1 K in clear sky areas in the mean.
2. Why there is striping relative to GAC, and if this striping is reasonably known to be zero-mean.
3. Realistic time scale for obtaining the full SCOPE CM data set.
4. Where SCOPE CM source their NOAA GAC input.

CM has commented on GC's Product Validation Plan Table of Contents, and it was agreed to stick to the contractual contents, except that a section introducing the Multi-sensor Match-up Dataset is essential for the document to make sense.

Argo matches with AATSR and further SEVIRI matches from Meteo France have been obtained at Leicester since last discussion. Access to the Edinburgh cluster is problematic so far -- GC and Mark F will work on this.

Regarding aerosol data, agreed solution is as follows. GC has found http://www.temis.nl/airpollution/absaai/, which offers daily GOME-2 Absorbing Aerosol Index in TOMS-like format. We will use TOMS, GOME-1, TOMS, OMI & GOME-2 to cover as much as we can from 01/08/1991 to 31/12/2010 (long-term record) and then GOME-2 (if available) for 01/10/2011 to 31/03/2012 (short-term demonstration). Overlap where possible between sensors (up to one year) to do some basic AAI intercomparisons with TOMS as well. This means there will be roughly 7 months of data missing from 24/11/1994 (the last METEOR 3 file, although the data starts to degrade a few weeks prior to this date) to 01/07/1995 (the start of GOME-1, although there are a few odd days of missing GOME-1 data). For the Pinatubo aerosol, we'll use the SAGE II data monthly means for the period 01/08/1991 to at least the end of 1994 to look for stratospheric aerosol correlations during the Mount Pinatubo eruption.

CM will attend the NASA SST meeting, and will use the opportunity to discuss AVHRR data sourcing. CM will not attend the next AATSR QWG (no funding, and it is not in the UK), but in any case the AATSR SAG meeting the following week looks very pertinent and will cover some of the same ground.

DONM Tue 16 Nov, 10.00