Friday 17 December 2010

Leicester Telecon 4

Chris, Mark Filipiak and Jon Mittaz (Maryland / NOAA) in Edinburgh
Gary Corlett in Leicester
10.30 - noon 17 December 2010

Data Access Requirements Document

GC and CM agreed that (i) the priority is data access requirements relevant to the envisaged system, over the additional data that may be useful within the project as a whole, and (ii) GC should ask for comments on the draft in its present form from the ESA technical officer. All major information is now included, but some of the more obscure information is not yet available.

Product Validation Plan

GC had suggested to Nick Rayner and Simon Good that their metrics for assessing the degree to which envisaged products meet user requirements should be used within the climate assessment plan. CM agrees this is an excellent idea. GC's work on PVP development has been strongly supported by Jacob Hoyer regarding high-latitude aspects. The algorithm selection plan is in final drafting at Edinburgh. GC will draw together the document for presentation and discussion at our upcoming project meeting in early January (PM2).

Multi-sensor match-up dataset

Marco Zühlke and Martin Boettcher (BC) are writing data readers at present. The timescale for availability of first outputs within the project is probably mid February. An important task is to identify the list of queryable fields that will be held in a database to link the data and enable querying and extraction. This needs to be developed during PM2.

AVHRR GAC

Progress has been made on trying different routes to obtain the complete archive. Andy Harris has been able to stage 2010 at NOAA in such a way as to allow a much more efficient ftp transfer, so that data should be here by the start of January.

For MMD extraction, approach at BC is to adapt existing java HRPT readers. Some discussion about this and how to accommodate (i) the need for the MMD to be consistent with the future processor (which will be in Fortran, and based on readers adapted by JM), and (ii) the fact that JM's understanding of calibration of each AVHRR is still under development within his NCDC CDR project, and will continue to be so for the next couple of years.

We agreed that the MMD should hold, over the extracted segment
1. smoothed infrared calibration information (blackbody PRT x 4, BB counts, space counts)
2. all channel counts
3. onboard time
4. satellite and solar angle information
5. visible calibration information
6. the original and a modified solar contamination flag

More comments on some of these:

1. It is normal to smooth the calibration data, not least to minimise the impact of PRT digitisation errors. Different groups use different intervals. JM uses current line +/- 27 lines in a boxcar average. Some smoothing is necessary and helpful, and we have no reason to think JM's approach is inappropriate, so adopt that. To avoid needing to hold calibration data beyond the ends of the extracted image segment, it is proposed to smooth with this boxcar on reading, and hold the smoothed values in the MMD. End-of-orbit-file effects should be small since there is significant overlap between orbit files (for a match near the start or end of an orbit file, one can more or less always create the smoothed calibration data from one or other of the overlapping orbit files).
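To make the smoothing step concrete, here is a minimal sketch of the +/- 27-line boxcar average, assuming the calibration quantities are held as one value per scan line in a NumPy array (function and variable names are illustrative, not taken from any project code):

    import numpy as np

    def boxcar_smooth_calibration(cal, half_width=27):
        """Boxcar-average a per-scan-line calibration quantity (e.g. a PRT
        temperature, blackbody counts or space counts), using the current
        line +/- half_width lines. Lines near the ends of the array are
        averaged over the available lines only; in practice the overlap
        between orbit files means these edge cases can usually be avoided."""
        cal = np.asarray(cal, dtype=float)
        n = cal.size
        smoothed = np.empty(n)
        for i in range(n):
            lo = max(0, i - half_width)
            hi = min(n, i + half_width + 1)
            smoothed[i] = cal[lo:hi].mean()
        return smoothed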

3. The onboard time may not be correct in the files for older satellites. This can affect the navigation significantly. Correction files exist, e.g. from Pathfinder; these need to be formally sourced and documented in the DARD.
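The mechanics of applying such a correction are simple; the sketch below only illustrates the idea, and the sign convention and the actual source of the clock-error values are assumptions to be settled when the correction files are documented in the DARD:

    from datetime import datetime, timedelta

    def correct_onboard_time(scanline_times, clock_error_seconds):
        """Apply a spacecraft clock correction to onboard scan-line times
        before geolocation. The sign convention used here (subtracting the
        error) is illustrative only."""
        offset = timedelta(seconds=clock_error_seconds)
        return [t - offset for t in scanline_times]

    # Example: shift a scan-line time by a hypothetical 0.5 s clock error
    print(correct_onboard_time([datetime(1995, 3, 1, 12, 0, 0)], 0.5))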

5. External visible calibration information may be available. We propose to stick with the "Andy Heidinger" approach implemented within CLAVR-X.

6. JM has found that the flag for solar contamination of calibration data sometimes misses some bad data at either end of the events. We don't have the time/resource to implement his approach of detecting this and interpolating the calibration across the events, so we need to flag this potential problem simply. The solution is to have a second flag to indicate lines "near" a line where the product flag is set. JM will advise us on the appropriate number of lines to correspond to "near".
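A minimal sketch of what such a secondary flag could look like, assuming the product flag is available as a boolean value per scan line (the "near" distance used in the example is a placeholder, pending JM's advice):

    import numpy as np

    def near_contamination_flag(solar_flag, n_near):
        """Set a secondary flag on every scan line within n_near lines of a
        line where the solar-contamination flag is set."""
        solar_flag = np.asarray(solar_flag, dtype=bool)
        near_flag = np.zeros_like(solar_flag)
        for i in np.flatnonzero(solar_flag):
            lo = max(0, i - n_near)
            hi = min(solar_flag.size, i + n_near + 1)
            near_flag[lo:hi] = True
        return near_flag

    # Example: lines 10-12 flagged; "near" taken as 5 lines (illustrative)
    flag = np.zeros(30, dtype=bool)
    flag[10:13] = True
    print(np.flatnonzero(near_contamination_flag(flag, 5)))  # lines 5..17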

The Fortran code that generates the BTs, flagging and geolocation to be used as standard for the project algorithm development and the RRDP will be applied to the above extractions. These outputs will be added to the MMD using the facility for doing so. The RRDP will not include the raw data, in order to avoid confusion in algorithm comparisons from any groups choosing to use alternative calibrations.

Other

JM explained the NCDC CDR programme (http://www.ncdc.noaa.gov/sds/sds-documents.html). Ultimately, an alternative archive of more uniform GAC will be created, using JM's IR calibration and the other corrections mentioned above.

Next discussions will be at PM2.

Happy Christmas from Chris to everyone in the team!

POST-MEETING NOTE

Further analysis by JM showed a change in calibration of 0.4 K between Feb 2010 and May 2010 for NOAA 16 AVHRR, which is also a period of substantial change in instrument temperature. While this may be an extreme case, it does call into question the 3 month rolling window for bias correction that one might have expected to be reasonable (based on the Pathfinder approach).

Friday 10 December 2010

Met Office Telecon 6

Me, Nick Rayner, Simon Good by skype
10 Dec 2010


Purpose was to discuss the Product Specification -- process and the actual document. SG and NR had (yesterday) arranged a net meeting with the wider team, and today we went over the outcomes. 


NR and SG have been building the specifications up from the URD they previously developed, which we all agree is a coherent and traceable approach. The only issue is where that "working" should be reported, since it does not usually appear in a Product Spec Doc, nor is it requested in the ESA statement of work. However, it is clear it needs to be recorded and available. The option favoured is, as well as the formal deliverable which will be a fairly conventional PSD, to use SG's text so far to make a technical note to give the PSD context.


We went through the actual specifications for the SST in the products. The main discrepancy between user requirements and our original proposal relates to sub-daily variations in SST. Within the baseline proposal, we can treat diurnal variations by adjusting to a reference time that gives a good estimate of the daily mean SST. Since that reference time is close to the time of observation of our most stable reference source, this is a powerful approach. The requirement users have for SST at sub-daily synoptic times (00, 06, 12, 18 UTC) could be accommodated at L4 (gap-filled "analyses" of SST) if the relevant option proposal is funded, but, for pure resource reasons, not otherwise.
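For clarity, the kind of adjustment meant here can be sketched as below, assuming some diurnal-cycle model D(t) that gives the departure of SST from the daily mean at local time t; the cosine model and its amplitude are purely illustrative, not the project's diurnal model:

    import numpy as np

    def adjust_to_reference_time(sst_obs, t_obs, t_ref, diurnal_model):
        """Adjust an SST observed at local time t_obs (hours) to the value
        expected at reference local time t_ref, by removing the modelled
        diurnal departure at t_obs and adding it back at t_ref. If t_ref is
        chosen so that D(t_ref) ~ 0, the result approximates the daily mean."""
        return sst_obs - diurnal_model(t_obs) + diurnal_model(t_ref)

    # Purely illustrative model: 0.2 K amplitude, peaking mid-afternoon
    demo_model = lambda t: 0.2 * np.cos(2 * np.pi * (t - 15.0) / 24.0)
    print(adjust_to_reference_time(300.15, t_obs=21.5, t_ref=10.5,
                                   diurnal_model=demo_model))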


CM reported clarifications from C Donlon regarding format requirements (compatibility with CMIP5 and GDS2.0). Victoria Bennett will play a co-ordinating role for data at the ESA Harwell office, and will be able to advise on these issues, discovery metadata, harmonisation across the CCI, etc. VB is now alerted to the timescale on which SST CCI is running. The WP from our point of view can be shifted by a few weeks, not more. VB thinks the CMIP5/GDS2.0 compatibility issue is only relevant at levels L3 and L4.


No further teleconference before PM2 at Met Office.

Monday 29 November 2010

Brockmann Consult / University of Leicester

Gary Corlett (UoL), Norman Fomferra, Martin Böttcher, Ralf Quast, Marco Zühlke (BC)
Telecon with Chris Merchant near end of their discussions on 26 November



GC and BC staff collocated at BC from 24 – 26 Nov in order to lay down the system requirements for the SST Multi-sensor Match-up System (MMS) and to sketch a rough development plan for its first version to be ready by the end of Jan 2011.

On day 1, GC gave an overview of the SST cal/val work and the role of the MMS for the SST_cci project. Various related projects were discussed, especially the (A)ATSR L2P Project and GHRSST with respect to their match-up databases.

For subsequent discussions on the MMS requirements, the team identified the possible MMS users and the various datasets it creates, and agreed on the following definitions of terms:

·       MD – Single-sensor match-up dataset. It is usually a NetCDF file comprising single-sensor match-up records. Each record comprises a single reference (in situ, or in high-latitude cases a “dummy in situ” with an invalid SST value) matched with a single satellite. MDs are ingested by the MMS and are versioned and archived for full backward traceability.

·       MMD – Multi-sensor match-up dataset comprising multi-sensor match-up records. Each record is a coincidence of a single in situ (or dummy) with multiple satellites and auxiliary data. MMDs are extracted from the MMS as NetCDF files in a well-specified format.

·       MMS database – The database that stores the in-situ data, the searchable fields of all ingested MDs and all detected coincidences (match-ups of in-situ-satellite and satellite-satellite).
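To make the MMD terminology concrete, a minimal sketch of what one multi-sensor match-up record might hold is given below; the field names and values are illustrative only, and the real layout will be the well-specified NetCDF format mentioned above:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MatchupRecord:
        """One multi-sensor match-up: a single in situ (or dummy) reference
        associated with extracts from several satellites plus auxiliary data.
        Field names are illustrative, not the agreed MMD variable names."""
        matchup_id: int
        time: str                    # reference observation time (UTC)
        lat: float
        lon: float
        insitu_sst: Optional[float]  # None for high-latitude "dummy" references
        satellite_obs: dict = field(default_factory=dict)  # sensor -> extracted segment
        auxiliary: dict = field(default_factory=dict)       # NWP, aerosol, sea ice, ...

    record = MatchupRecord(
        matchup_id=1, time="2009-06-15T12:34:00Z", lat=55.2, lon=-12.7,
        insitu_sst=287.9,
        satellite_obs={"AATSR": {"bt_11um": 286.4}, "AVHRR_NOAA18": {"bt_11um": 286.1}},
        auxiliary={"nwp_total_column_wv": 21.3, "sea_ice_fraction": 0.0},
    )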

Day 2 was fully dedicated to the gathering of system requirements, which have been written down in an SST_cci Toolbox Wiki at https://github.com/bcdev/sst-cci-toolbox/. GC will integrate these into the Product Validation Plan document.

IR SST algorithm improvements are taking place from project kick-off+6 onwards, so a first version of the data set (DBT2) deliverable to support these improvements will be made available to algorithm developers. This will be the Algorithm Testing MMD – an extract from the MMS database. This MMD shall be ready by end of Jan 2011. It will contain prescribed "test" match-ups (for algorithm development) and "training" match-ups (for fair algorithm assessment). It will exclude some match-ups that are designated as "reference" match-ups, which will be used as the Reference MMD  later in the project. 

The first version of the MMS will have a simple command-line interface for ingestion and extraction of data and for database inspection. As part of the ingestion process, the MMS will compute coincidences from MD input files. In addition to the coincidence criteria where there is always an ATSR observation, it has been agreed that triplets of Metop+SEVIRI+PMW should also be collected. During collocation, the MMS will assign additional satellite and auxiliary data including interpolated NWP values, aerosol and sea-ice data. For each match-up, a 24 h in situ history (+/-12 h from the satellite observation time) will also be included. Investigations will be made to let the MMS directly call RTTOV-10 (simulation model) and ingest the RTTOV-10 output automatically back into the MMS database.
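The coincidence computation during ingestion amounts to a space/time window test between observations, along the lines of the sketch below; the 3 h and 10 km thresholds are placeholders, since the actual criteria (and the rule that an ATSR observation, or a Metop+SEVIRI+PMW triplet, must be present) are set by the project rather than here:

    from datetime import datetime, timedelta
    from math import radians, sin, cos, asin, sqrt

    def is_coincident(t1, lat1, lon1, t2, lat2, lon2,
                      max_time=timedelta(hours=3), max_km=10.0):
        """Illustrative coincidence test: two observations match if they fall
        within a time window and a great-circle distance threshold."""
        if abs(t1 - t2) > max_time:
            return False
        # haversine great-circle distance in km
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 6371.0 * 2 * asin(sqrt(a)) <= max_km

    print(is_coincident(datetime(2009, 6, 15, 12, 0), 55.2, -12.7,
                        datetime(2009, 6, 15, 13, 30), 55.25, -12.65))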

There are still unknowns regarding the physical deployment of the MMS to the ECDF (the Edinburgh computing cluster). In the next few days, BC will compile the external system requirements (e.g. DBMS, web server, ssh, Java JDK) and send them to Mark Filipiak and Owen Embury. A telecon will be held on Dec 13, 10:30 (UK time) dedicated to addressing the MMS/ECDF requirements.

Day 3 was used to review the requirements and develop a detailed action list in order to meet the MMD delivery at KO+6, by end of Jan 2011. The immediate actions for Dec 2010 are:

BC: Send external system requirements to Mark and Owen
BC: Update BEAM to also read NOAA AVHRR GAC
BC: Provide existing (A)ATSR childgen tool to Gary
GC: Describe data items to be used for the first month of data
BC: Develop flexible database schema which lets us add fields at any time
BC: Develop ingestion function to convert MD files and compute coincidences
BC: Develop extraction function to generate an MMD from a given query
GC: Provide BC with one month of data (all different types)
GC: Convert MD files to monthly, reformat old SEVIRI to NetCDF
GC: Update PVP document with MMS requirements
CM: Provide BC with the RTTOV-10 configuration

CM discussed the progress with GC and the BC staff in a telecon during day 3. 

DONM by telecon, Dec 13, 10:30 (UK time), dedicated to addressing the MMS/ECDF requirements.

Wednesday 17 November 2010

Met Office Telecon 5

Me, Nick Rayner, Simon Good by skype
17 Nov 2010

NR has received feedback from ESA (Craig Donlon) on the User Requirements Document. Several helpful changes to the URD have been requested, and will be addressed by the end of November, at which point it will be approved and published on the website.

The next stage is to use the URD to inform the Product Specification Document. We reviewed the WP description and discussed this; there are essentially two classes of approach. 1. Define some objective criteria to apply to the URD data to create specifications. 2. Work from the PSD implicit in ESA's SoW and our proposal, and look for discrepancies with what users have asked for. NR and SG propose to do both approaches and see how the outcomes compare, which I think is an excellent approach. In both cases, the results then have to be appraised for technical feasibility (can the specified products be created in principle?) and resource feasibility (can we do it within the plan for the SST CCI?).

SG aims to draft the PSD for team review by the end of November, for a telecon discussion early in December. Interactive discussions will be solicited from users, planned for early January. Issue A will be submitted to ESA at end of January.

CD has sought advice on climate user requirements prior to attending a GCOS session in January. We agreed the CCI approach to this would be structured as follows: 1. NR and SG will complete (and expand if necessary) the template for requirements circulated by ESA as a basis for summarising our advice, by 3 December, and pass to me for review and agreement. 2. I will add any commentary appropriate from Science Leader's perspective in my submission to CD.

Plans for the January and July project meetings were discussed.

DONM Fri 10 Dec 10.00

Tuesday 16 November 2010

Leicester Telecon 3

Chris Merchant, Mark Filipiak and (by telephone) Gary Corlett

GC reported that the Data Access Requirements Document is essentially assembled, having received comments back, including the addition of data of relevance to high latitudes recommended by Scandinavian colleagues.

The outline of the Product Validation Plan has been sent to parties who are contributing parts of that plan.

Regarding the multi-sensor match-up data (MMD), all the required inputs from Meteo France are collected and complete up to September 2010. The creation of the ATSR matches to different in situ data sources is almost complete, including the additional data sets from Scandinavian colleagues. The tools to associate matches with ECMWF fields will need to work on ERA Interim data at 1.5 deg resolution (currently available) in the first instance, and therefore will need to be coded flexibly to deal with ECMWF data of higher resolution later. GC is going to Germany to work through the MMD coding plan with Norman Fomferra and the Brockmann Consult team next week. CM and GC agreed that, for the passive microwave sensors, we would simply use the L2P version throughout, since we need to know the L2P product confidence flags for comparison with the new product confidence algorithm that will be developed within the project.
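The flexibility asked for here mostly amounts to keeping the grid spacing a parameter of the interpolation, roughly as in the sketch below (the grid origin, orientation and the bilinear scheme are assumptions for illustration; longitude wrap-around and masking are ignored):

    import numpy as np

    def interp_nwp_to_matchup(field, lat0, lon0, dlat, dlon, lat, lon):
        """Bilinearly interpolate a regular lat/lon NWP field (e.g. ERA
        Interim at 1.5 deg) to a match-up location. Because the grid spacing
        is a parameter, the same code works unchanged with higher-resolution
        ECMWF fields later."""
        i = (lat - lat0) / dlat
        j = (lon - lon0) / dlon
        i0, j0 = int(np.floor(i)), int(np.floor(j))
        wi, wj = i - i0, j - j0
        return ((1 - wi) * (1 - wj) * field[i0, j0]
                + (1 - wi) * wj * field[i0, j0 + 1]
                + wi * (1 - wj) * field[i0 + 1, j0]
                + wi * wj * field[i0 + 1, j0 + 1])

    # Example on a 1.5-degree grid starting at (-90, -180)
    demo_field = np.random.rand(121, 240)
    print(interp_nwp_to_matchup(demo_field, -90.0, -180.0, 1.5, 1.5,
                                lat=55.2, lon=-12.7))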

Regarding data set collection, the sourcing of AVHRR GAC is as yet unresolved. Supply from CLASS via Ken Casey would take >10 months. The CLASS archive is unable to ship it directly on tape because of resource limitations at their end. The ECMWF copy requires special access, and GC will arrange a speed test to UoL.

The question of why the SCOPE-CM BTs are significantly different from what is expected is also pending. But it does seem that the initial estimates of the availability of SCOPE-CM data were too optimistic.

Decisions:
MF will establish what GAC CLASS could provide by end Jan 2011, raising possibility of disk instead of tape.
MF will start downloading 2010 GAC data from CLASS web ordering and ftp for NOAA-15,16,17,18,19 and Metop-2, which alone is estimated to require 2 months.
GC will arrange a speed test for downloading GAC from ECMWF to UoL.
MF will document SCOPE-GAC comparisons further.

DONM: Telecon with GC at Brockmann Consult Friday 26th Nov at 10 UTC.

Wednesday 10 November 2010

NASA SST Science Team

Seattle, 8-10 November.
Chris Merchant representing SST CCI.

A real scientific buzz at this NASA SST meeting, with lots of good progress reported, particularly in understanding SST variability across the full range of time scales and space scales (literally from O(1 s, 1 cm) to O(year, global)), and thinking through how that affects our statements about SST and its uncertainty in remotely sensed products.

I (CM) presented the principal concepts underlying the approach in the SST CCI to the long-term climate data record (CDR). Generated quite a lot of questions and discussion. Generally, the idea of tying up the ATSR and AVHRR time series via radiance cross-calibration was welcomed as an innovative approach. It is different to the US route, and this is seen as helpful. AVHRR folks (Ken Casey, Bob Evans, Sasha Ignatov) seem keen to interact with us during the SST CCI, which is excellent.

I tried to communicate some of the subtleties about assessing CDR quality using the range of measures including formal independence of the information content as well as the traditional accuracy, resolution, etc. I presented the opportunities for involvement, including the algorithm selection process via the round robin. Sasha is interested in the Algorithm Selection exercise, and we discussed the contents necessary for his group to participate efficiently using their new "hybrid" algorithm.

Off-line discussions with Ken about transferring the AVHRR archive across the Atlantic have sparked some download experiments using alternative routes, but none so far have given us the data rate we need (by a factor of three or four too slow). Still need to pursue all possibilities.

Carol-Anne Clayson and Alexander Soloviev showed diurnal warming modelling results that explain the 5 pm kink in the average diurnal cycle in SST-skin: as solar insolation reduces, there is a transition in the depth communicating with the atmosphere, accompanied by a period in which turbulent kinetic energy temporarily increases over a greater depth. Thereafter, the thermal inertia being cooled is greater, and so the rate of cooling diminishes to the rate it sustains during the night. The diurnal cycle shape seen in their idealized (constant wind speed) model runs is essentially identical to the global average diurnal cycle shape Mark Filipiak gets from his analysis of Pierre Le Borgne's SEVIRI observations, including the peak time and the time of this TKE-related transition. They also had different explanations for the observation that the >4 K diurnal events all seem to happen in the mid-latitudes, not in the tropics. Carol-Anne finds a role for the extended length of day during mid-latitude summer, while Alexander reckons they happen when the air-sea temperature difference is more positive than typically seen in the tropics. The latter idea certainly fits with the tendency of the >4 K event distribution to be nearer land masses, and could easily be tested by getting the NWP data for the events.

Some discussion from Dudley Chelton and Dick Reynolds about how sampling limits the true information in L4 analyses at higher spatial frequencies, this being relevant to the satellite-only OSTIA analysis we will generate in the SST CCI. Seems to me, the key thing is to construct proper, dynamic uncertainty estimates, including covariance effects, rather than to smooth the data, which is Dudley's recommendation. Although, for those interested in SST gradients rather than absolutes, they did show that under-sampling can introduce big problems (spurious gradients particularly from the patchiness of IR data).

Lots of enthusiasm for coming to next year's GHRSST meeting that we are hosting in Edinburgh -- should be a great event.

Wednesday 3 November 2010

Leicester Telecon 2

Gary Corlett and Chris Merchant by telephone
2 Nov 2010

GC has circulated the draft Data Access Requirements Document round the project team, for comments and completion of some missing/difficult-to-source information about data sets. Replies are solicited for 10 Nov, the Met Office input being key. CM asked GC specifically to request also the DMI and met.no colleagues to review the sea ice / high latitude contents of the data being sought.

Key problem/uncertainty is the best source for AVHRR. First look shows >1 K differences in retrieved SST when using SCOPE-CM BTs compared to NOAA GAC, and also scan-line striping. Use of the cloud information seemed feasible: there are obvious failures to detect cloud in the SCOPE-CM data that are absent from NOAA GAC, but the reverse is also true. It was agreed that a telecon between Leicester (GC), Edinburgh (Mark Filipiak) and the SCOPE CM contact is needed to address the following:

1. Why the BTs are different by >1 K in clear sky areas in the mean.
2. Why there is striping relative to GAC, and if this striping is reasonably known to be zero-mean.
3. Realistic time scale for obtaining the full SCOPE CM data set.
4. Where SCOPE CM source their NOAA GAC input.

CM has commented on GC's Product Validation Plan Table of Contents, and it was agreed to stick to the contractual contents, except that a section introducing the Multi-sensor Match-up Dataset is essential for the document to make sense.

Argo matches with AATSR and further SEVIRI matches from Meteo France have been obtained at Leicester since last discussion. Access to the Edinburgh cluster is problematic so far -- GC and Mark F will work on this.

Regarding aerosol data, agreed solution is as follows. GC has found http://www.temis.nl/airpollution/absaai/, which offers daily GOME-2 Absorbing Aerosol Index in TOMS-like format. We will use TOMS, GOME-1, TOMS, OMI & GOME-2 to cover as much as we can from 01/08/1991 to 31/12/2010 (long-term record) and then GOME-2 (if available) for 01/10/2011 to 31/03/2012 (short-term demonstration). Overlap where possible between sensors (up to one year) to do some basic AAI intercomparisons with TOMS as well. This means there will be roughly 7 months of data missing from 24/11/1994 (the last METEOR 3 file, although the data starts to degrade a few weeks prior to this date) to 01/07/1995 (the start of GOME-1, although there are a few odd days of missing GOME-1 data). For the Pinatubo aerosol, we'll use the SAGE II data monthly means for the period 01/08/1991 to at least the end of 1994 to look for stratospheric aerosol correlations during the Mount Pinatubo eruption.

CM will attend the NASA SST meeting, and will use the opportunity to discuss AVHRR data sourcing. CM will not attend the next AATSR QWG (no funding and it is not in UK), but in any case, the AATSR SAG meeting the following week looks very pertinent and will summarize some of the same ground.

DONM Tue 16 Nov, 10.00

Friday 22 October 2010

Met Office Telecon 4

October 22, 10.00 a.m., Nick Rayner, Simon Good and Chris Merchant

The User Requirements Document is now publicly available in draft for comment until next Monday. Next week it will be finalized by SG and NR and approved by CM for submission to ESA.

Next telecon will be to kick off the Product Specification Document on 12 November, at 10.00 a.m.

Friday 15 October 2010

Leicester Telecon 1

15 October 2010 by phone, Chris Merchant, Gary Corlett

GC has scoped out the data request to ECMWF, with some minor queries. The 3 month lag time for ERA Interim was discussed, and we agreed it was not problematic for the 6 month demonstration, since the data collection for the demo is expected to start October 2011, whereas the system run will not be before Aug 2012.

Aerosol comparison data will comprise TOMS (most of 1991 to 2010, but missing some of 1994 to 1996), MACC (post 2002), and (for stratospheric aerosol) SAGE II (under consideration). 

The issue of rights to redistribute ECMWF data within the RRDP has not yet been resolved.

Collection of match-up data is ongoing, but download speeds from Meteo France are slow. A telecon with Pierre to discuss plans for MSG-1 SEVIRI is needed.

GC will visit Brockmann in November, and telecon is planned during visit.

Before next telecon: Draft DARD will be distributed round project team; PVP Table of Contents will be defined.

DONM Tue 2 Nov 15.30.

Wednesday 13 October 2010

Project Team Telecon 1


13 October 2010 by Skype
Chris Merchant, Hugh Pumphrey, Paul Spinks, Gary Corlett

AGENDA

1. Overview of active WPs and progress towards deliverables

2. Project communications and reporting arrangements

- update on monthly reporting cycle with ESA from PS

- are WP-specific telecons + blog summaries adequate for "science leadership"?

3. Project templates and reference documents

4. Project web site

- PS to summarize recommendations from co-location

- quick survey of opinions about what we want on there

5. Guidelines from co-location

- each colleague involved in other drafting teams to give status in 1 minute


1. CM explained that in WP 101 (Analysis of Requirements) the Met Office have drafted the URD with the survey results and other findings. CM has reviewed this and returned comments, which Nick and Simon are reflecting on before distributing more widely for comment. 

CM stated that at Edinburgh the WP 203 (Radiative transfer basis) work has started with Claire testing RTTOV-10; Meteo France have confirmed they will also standardize on RTTOV-10 once it is formally released.

GC addressed WP 103 (Data access). He has updated ESA on findings to date as requested. The data needs for ECMWF data have now been fully compiled, and he has sought comments from David Tan. It is not clear whether ECMWF will want to respond to our individual request or to a consolidated request from ESA, and this may cause delay. GC pointed out the data gap in TOMS aerosol for 93 - 96, and it is not clear what other data could be sourced to bridge this. Regarding stratospheric aerosol, HP undertook to explore options for a gridded data set around the Pinatubo period from limb sounders. Sample pages for the DARD will be sent to CM for comment next week. The DARD will be fully drafted by 1 Nov by Karen Veal and circulated round the team and ESA for comment. It may be revised later in light of the PSD and URD.

GC described WP 201. Draft ToC will be with CM by end October for comment and implementation of Edinburgh component.

Regarding WP 202 (Multisensor Matchup Dataset), GC said the content had been reviewed with CM (see previous blog entry) and data collection has started with the Meteo France MDs for Metop and SEVIRI. Next downloads will be in situ data sets and passive microwave. CM undertook to follow up again regarding AVHRR GAC. GC needs a sample, so if one is not forthcoming from CM-SAF, CM will ask Mark Filipiak to send a sample.

Question was raised whether data sets should now start to be collected at Eddie (the Edinburgh cluster). Basically, yes, but this is not yet set up. CM will follow this up.

2. PS explained the monthly communications routine agreed with ESA. Towards the end of each calendar month, he will solicit reports from leaders of active work packages, and compile into a draft monthly report. On the first Wednesday of each month, the project team telecon will review that draft and discuss AOB. Once updated, the report will go to ESA, and then PS, CM and Craig Donlon will hold a telecon, noting any actions that emerge.

PS explained that since in this project there is split project management and science leadership, actions need to be treated more flexibly (not solely by the project manager). He proposes to investigate a "living" document (e.g., Google doc) to be available online as the repository for actions arising both from project meetings (PS to manage) and science leader telecons (CM to manage).

3. The project's document template is essentially under test with GC and the Met Office at present, and will be circulated thereafter. The powerpoint template will be distributed soon. PS will give instructions for numbering of project documents. The reference document list will be in common across all documents and will be visible online. Present spreadsheet will be sent to GC to update with new RDs from DARD work; thereafter it is expected to evolve more slowly, and will be maintained by Emma Danby at Edinburgh, along with the acronym and science reference documents.

4. PS explained that Hugh Kelliher is exploring use of the ESA web site template. Although implementation would represent significant extra work beyond our proposal, he is considering whether the benefits are such that it is worth pursuing. In the meantime, this puts a blight on creating web links for project common documents, etc, on the existing web site (in case the effort is wasted). CM said that as an interim measure, documents could easily be deposited on the project ftp site (ftp://esa-sst-cci.org).

5. The status of drafting teams' efforts were briefly reviewed.

Decisions summary:
1. HP will explore simple data sets for stratospheric aerosol pre and post Pinatubo.
2. UoL will send CM DARD sample and he will comment and return.
3. CM will chase CM-SAF AVHRR GAC sample, and/or ask Mark F to provide NOAA GAC sample to GC.
4. CM will ensure system for collecting data on Edinburgh cluster gets up and running.
5. PS will trial an online document to act as Actions Database.
6. GC will extend the RD list previously created by PS.
7. PS and all: Templates and common documents will be placed on the ftp site in the interim.

Wednesday 6 October 2010

Met Office Telecon 3

1 October 2010, by skype/phone, Nick Rayner, Simon Good, Chris 
Merchant
Topics: WP 10130 on User Requirements 

Today is the final day for responses to the User Requirements questionnaire to be included in the analysis exercise that SG will start next week. To date there have been 106 complete responses, and about twice as many partial responses. SG has also held a further discussion session with users interested in seasonal forecasting and SST variability; the main message from this session was the importance of compatibility of satellite-era SST from the CCI project and longer term historical SST records. We discussed approaches to presenting the survey results: SG has lots of good ideas. 

SG expects to have the draft report available by 8 Oct. CM will review it rapidly before it goes out to the project team and to those surveyed who indicated a willingness to review it. Following guidelines that emerged from the recent co-location meeting, comment will also be solicited from other international scientific bodies (although replies may take longer).

Practicalities of document template and referencing need to be followed up.

Actions/decision:
1. SG will send draft Requirements report to CM on 8 Oct
2. CM will review and return ASAP
3. SG will distribute report for comment more widely
4. CM will follow up with project manager status of templates and reference-list document

Thursday 23 September 2010

Conversations in Frascati and Cordoba

I have been at the first "co-location" meeting for the Climate Change Initiative, where ESA gather representatives of all the CCI projects (11, comprising 10 variable teams and the climate modelling users group), and thereafter at the EUMETSAT conference in Cordoba. This post notes a few useful conversations for the SST CCI team.

1. Rainer Hollman (CM SAF and Cloud CCI). We discussed Fundamental Climate Data Records from AVHRR Global Area Coverage. A project the CM SAF is involved in (SCOPE CM) will create AVHRR GAC FCDR with some homogenisation applied, plus the CM SAF cloud products, by the end of this year. Both because of local downloading, and the homogeneous cloud mask, this may be a practical alternative source for AVHRR GAC.

2. Carol-Anne Clayson (Florida State U). Carol-Anne uses the SSMI records to retrieve ocean-atmosphere fluxes, by determining inputs to bulk flux parameterisations, including a surface temperature term. I encouraged Carol-Anne to take part in the Round Robin exercise, by adding SSMI to the multi-sensor matches. This would be a practical way of making comparisons with infra-red based SSTs.

3. Don Grainger (Oxford and Cloud CCI). One of Don's students is extending their aerosol retrieval to include infra-red channels. SST then gets added to the state vector. Again, I encouraged Don to get involved in the Round Robin exercise, where the SST obtained by their scheme can be compared with our SSTs.

4. Steve Ackermann (Wisconsin). In their MODIS cloud work, they also take and apply the official MODIS SST algorithm. I explained that MODIS could be added to the project's MMD to create some useful retrieval and cloud mask inter-comparisons. Steve is keen on this: they have systems for pulling out MODIS data for a given lat and lon and time already, so overhead would be modest.

Thursday 16 September 2010

Met Office Telecon 2


10 September 2010, by skype/phone, Nick Rayner, Simon Good, Chris
Merchant
Topics: WP 10130 on User Requirements / ESA Collocation meeting

SG gave a progress report on the User Requirements questionnaire. 767
people have been emailed so far, with ~100 bounces. Recipients have
been asked to respond by 17 September. A further batch will be added
to include a greater diversity of countries. There were 19 responses
by 9 September, the nature of which SG summarized. Importantly, 5 out
of 19 have agreed to the follow-up discussion session, 12 to review
the URD and 8 to receive test data -- this looks a promising level of
engagement.

NR has piloted interactive discussions. CM commented that the notes
of the discussions so far make very interesting reading, and
suggested prompting users about their requirements for uncertainty
information if it does not arise naturally.

Actions/decisions:
1. NR/SG will supply CM a slide or two schematically describing the
questionnaire process for CM's presentation at the first collocation
meeting next week.

DONM: Fri 1 Oct 10.00

Discussions at University of Leicester

Discussions at UoL on 3 September
Present: Chris Merchant, Gary Corlett and John Remedios

Project month / Action
2 & 3 / Specify match-up data sets and write experimental code for
collating multi-sensor matchups, do ToC for PVP
4 / Circulate specification and visit Brockmann Consult
5 / Write Product Validation Plan (while BC start prototype code for
MMD)
6 / Define MMD tools for extraction etc
7 / verification of MMD (while BC start to code tools)

Aim is for a working MMD by end January 2011.

Gary will be involved in Envisat post-orbit drop validation from 2 Nov
for a few weeks.

We had a preliminary discussion about MMD tools. Elements we foresee
are:

By end January 2011:
- extraction tools, including metadata to set up extraction options
- NWP matcher and extracter, including providing files in correct
formats to run RTTOV-10 and diurnal models
- tools to add fields (e.g. model outputs) to MMD that will then be
available to other users

By RRDP:
- RRDP input tool (to accept new MDs and alternate algorithm results)

For system prototype:
- MD generator

Thinking about the nature of the matches, there may be a sparsity
problem for comparisons across the full range of sensors. So, in
addition to the previously agreed hierarchy (where there is always an
ATSR observation) we agreed that triplets of Metop+SEVIRI+PMW should
also be collected.

The MMD will cover dates as follows:

ATSR/AVHRR: 1991 - 2010
ATSR/AVHRR/Metop/SEVIRI/AMSRE/TMI: 2003 - 2010

As part of the Round Robin exercise, we will invite addition of MDs
from other sensors in a standard format, to be associated into the MMD
using the same tools. Priorities: MODIS, GOES, MTSAT. The mechanism
for submitting results of alternate algorithms to the Algorithm
Selection exercise will be an MD in the same standard format.

The Data Access Requirements Document will be drafted during October,
led by Karen Veal. Final version requires review of URD and PSD. DARD
is likely to comprise a page describing data, giving URL/info for
third parties to obtain the data. For MMD data, which we will
redistribute (in RRDP/CDRP), we need to seek a license to do so, and
the DARD will have notes on status regarding this (and whether the
next party is licensed to redistribute further). It would be helpful
if ESA had a standard letter requesting data for all CCI projects.
ACTION: GC to ensure this is proposed during 1st CCI collocation
meeting.

CIRIMS radiometry data may be useful to include in an MD. ACTION: CM to
discuss CIRIMS data with Andy Jessup at the Seattle SST meeting.

Actions/decisions:
1: GC to ensure ESA letter requesting data is proposed during 1st CCI
collocation meeting.
2: CM to discuss CIRIMS data with Andy Jessup at the Seattle SST meeting.
3. (Not exactly CCI) CM to provide QWG with 3 or 4 pages on how new
coefficients were generated.

Friday 10 September 2010

SST CCI Calendars

The calendar to which SST CCI meetings and telecons can be added by anyone in the project (who has associated their e-mail address to a Google account) is:

http://www.google.com/calendar/ical/v6f4dq21fq5c1dill1eao8u46o%40group.calendar.google.com/public/basic.ics


A calendar with the planned durations of work packages within the project is

http://www.google.com/calendar/ical/j22otpbcof0vl4bngcv8l96jvs%40group.calendar.google.com/public/basic.ics

Met Office Telecon 1

The main purpose of this experimental blog is to allow the SST CCI team to have access to discussions and decisions they were not directly involved in. So ... here are notes from a telecon I had with colleagues in the Hadley Centre.

13 Aug 2010 by skype, Chris Merchant, Nick Rayner, Simon Good
Main focus: WP10130

NR explained that SG has identified ~1000 users on the basis of citations of SST CDR papers over the last 5 to 10 years. >400 are from the US, so these may be subselected to avoid imbalance. A named e-mail will be sent inviting users to take the online survey. Briefing material will give each user a choice between an extremely brief summary and something with some scientific detail. The questionnaire is being developed and will likely be distributed within the project for comment around 20 Aug.

Regarding interactive sessions, SG explained that Web-X appears to be ideal. NR will pilot the project presentation for the interactive sessions at the technical kick off (TKO) at Edinburgh. Sessions will be trialled within Hadley Centre around end of August.

CM confirmed that WP20123 will be initiated by discussions with Gary Corlett at a side meeting in Edinburgh at TKO.

Actions/decisions:
1. NR/SG to send CM User Requirements Document at outline stage for review.
2. CM to distribute links to project Google calendars.
3. CM to try to get his calendar system to send out invitation to next telecon.

DONM: telecon agreed for 10 am 10 Sep by skype.

An experiment in team communication

This blog is an experiment in team communication. I'm leading a European Space Agency project called the Sea Surface Temperature Climate Change Initiative (SST_cci) which involves the collaboration of eight organisations round Europe (see our main web site). The idea is, briefly, to get better information about how the surface temperature of the oceans has evolved over the last few decades, using satellite data.

I'm going to be having lots of meetings and discussions as I run this project, and I am wondering whether blogging would be a good way to keep the team in touch with the ideas and decisions that arise from all those conversations. There is also a principle of transparency to the project, so having these conversations open to all to view also fits in with that -- not that I expect too many people outside of the team will want to follow the blog, since it will all be rather technical.

Anyway, thought I'd try it and see.