Friday 17 December 2010

Leicester Telecon 4

Chris, Mark Filipiak and Jon Mittaz (Maryland / NOAA) in Edinburgh
Gary Corlett in Leicester
10.30 - noon 17 December 2010

Data Access Requirements Document

GC and CM agreed that (i) the priority is the data access requirements relevant to the envisaged system, over the additional data that may be useful within the project as a whole, and (ii) GC should ask the ESA technical officer for comments on the draft in its present form. All the major information is now in the document, but some of the more obscure information is not yet available.

Product Validation Plan

GC had suggested to Nick Rayner and Simon Good that their metrics for assessing the degree to which the envisaged products meet user requirements should be used within the climate assessment plan. CM agreed this is an excellent idea. GC's work on PVP development has been strongly supported by Jacob Hoyer regarding high-latitude aspects. The algorithm selection plan is in final drafting at Edinburgh. GC will draw together the document for presentation and discussion at our upcoming project meeting in early January (PM2).

Multi-sensor match-up dataset

Marco Zuehlke and Martin Boettcher (BC) are writing data readers at present. The timescale for availability of the first outputs within the project is probably mid February. An important task is to identify the list of queryable fields that will be held in a database to link the data and enable querying and extraction. We need to develop this list at PM2.
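The queryable fields themselves remain to be agreed at PM2, so the following is only a hypothetical sketch of how a match-up index could be held in a small relational table to support querying and extraction; the field names (matchup_id, insitu_type, and so on) are illustrative assumptions rather than project decisions.

    # Hypothetical sketch only: holds the match-up index in a small SQLite table;
    # field names and types are assumptions, not project decisions.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE matchup_index (
            matchup_id   INTEGER PRIMARY KEY,  -- links back to the record in the MMD
            insitu_type  TEXT,                 -- e.g. drifting buoy, moored buoy, radiometer
            sensor       TEXT,                 -- e.g. AATSR, AVHRR-N16
            time_utc     TEXT,                 -- match-up time, ISO 8601
            lat          REAL,
            lon          REAL
        )
    """)

    # Example query: identify all AVHRR match-ups at high northern latitudes,
    # so that only those records need be extracted from the MMD.
    rows = conn.execute(
        "SELECT matchup_id FROM matchup_index "
        "WHERE sensor LIKE 'AVHRR%' AND lat BETWEEN 60 AND 90"
    ).fetchall()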

AVHRR GAC

Progress has been made on trying different routes to obtain the complete archive. Andy Harris has been able to stage the 2010 data at NOAA in such a way as to allow a much more efficient ftp transfer, so it should be here by the start of January.

For MMD extraction, the approach at BC is to adapt existing Java HRPT readers. There was some discussion about this and how to accommodate (i) the need for the MMD to be consistent with the future processor (which will be in Fortran, and based on readers adapted by JM), and (ii) the fact that JM's understanding of the calibration of each AVHRR is still under development within his NCDC CDR project, and will continue to be so for the next couple of years.

We agreed that the MMD should hold, over the extracted segment:
1. smoothed infrared calibration information (blackbody PRT x 4, BB counts, space counts)
2. all channel counts
3. onboard time
4. satellite and solar angle information
5. visible calibration information
6. the original and a modified solar contamination flag

More comments on some of these:

1. It is normal to smooth the calibration data, not least to minimise the impact of PRT digitisation errors. Different groups use different intervals. JM uses the current line +/- 27 lines in a boxcar average. Some smoothing is necessary and helpful, and we have no reason to think JM's approach is inappropriate, so we adopt that. To avoid needing to hold calibration data beyond the ends of the extracted image segment, it is proposed to smooth with this boxcar on reading, and to hold the smoothed values in the MMD. End-of-orbit-file effects should be small since there is significant overlap between orbit files (for a match near the start or end of an orbit file, one can more or less always create the smoothed calibration data from one or other of the overlapping orbit files).
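As a concrete illustration, a minimal sketch of the +/- 27-line boxcar applied to a per-scanline calibration quantity (e.g. blackbody counts) is given below; the handling of flagged or missing scan lines in JM's actual code is not shown and would need to follow his implementation.

    # Minimal sketch of a +/- 27-line boxcar average of per-scanline calibration data.
    # Assumes one value per scan line in a 1-D array; near the array ends the window is
    # simply truncated, consistent with the small end-of-orbit-file effect noted above.
    import numpy as np

    def boxcar_smooth(cal, half_width=27):
        n = len(cal)
        smoothed = np.empty(n)
        for i in range(n):
            lo = max(0, i - half_width)
            hi = min(n, i + half_width + 1)
            smoothed[i] = np.mean(cal[lo:hi])
        return smoothed

    # e.g. bb_counts_smoothed = boxcar_smooth(bb_counts)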

3. The onboard time may not be correct in the files for older satellites. This can affect the navigation significantly. Correction files exist, e.g., from Pathfinder. These need to be formally sourced and documented in the DARD.
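Purely to illustrate how such a correction might be applied, here is a hypothetical sketch in which a tabulated per-day clock error is added to the onboard timestamps; the invented table stands in for the real correction files, whose actual format and sign convention must be taken from the sourced files themselves.

    # Hypothetical sketch: apply a tabulated clock-error correction to onboard times.
    # The table values below are invented; the real corrections come from the files
    # (e.g. Pathfinder) to be sourced and documented in the DARD.
    from datetime import date, datetime, timedelta

    clock_error_seconds = {
        date(1988, 1, 1): 0.8,   # placeholder values for one satellite
        date(1988, 1, 2): 0.9,
    }

    def corrected_time(onboard_time):
        """Add the tabulated clock error (seconds) for the relevant day to an onboard timestamp."""
        err = clock_error_seconds.get(onboard_time.date(), 0.0)
        return onboard_time + timedelta(seconds=err)

    # e.g. corrected_time(datetime(1988, 1, 1, 12, 30, 0))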

5. External visible calibration information may be available. We propose to stick with the Andy Heidinger approach implemented within CLAVR-X.

6. JM has found that sometimes the flag for solar contamination of the calibration data misses some bad data at either end of the contamination events. We do not have the time/resource to implement his approach of detecting this and interpolating the calibration across the events, so we need to flag this potential problem simply. The solution is to have a second flag to indicate lines "near" a line where the product flag is set. JM will advise us on the appropriate number of lines to correspond to "near".
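A minimal sketch of such a "near" flag follows, assuming the original flag is available as one boolean per scan line; the width of 10 lines is a placeholder pending JM's advice.

    # Sketch of the proposed second flag: mark lines within +/- n_near lines of any line
    # where the original solar-contamination flag is set. n_near = 10 is a placeholder;
    # JM will advise the appropriate value.
    import numpy as np

    def near_contamination_flag(solar_flag, n_near=10):
        near = np.zeros(len(solar_flag), dtype=bool)
        for i in np.flatnonzero(solar_flag):
            near[max(0, i - n_near):i + n_near + 1] = True
        return near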

Fortran code to generate the BTs, flags and geolocation that will be used as standard for the project algorithm development and the RRDP will be applied to the above extractions. These outputs will be added to the MMD using the facility for doing so. The RRDP will not include the raw data, in order to avoid confusing the algorithm comparisons should any group choose to use an alternative calibration.
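Pulling together the six agreed items and these derived outputs, the following is a purely illustrative sketch of how the AVHRR part of an MMD record might be laid out as netCDF variables; all dimension sizes, variable names and the per-line versus per-pixel choices are assumptions for illustration, not the agreed format.

    # Illustrative netCDF layout only; names, types and dimension sizes are assumptions.
    from netCDF4 import Dataset

    nc = Dataset("mmd_avhrr_sketch.nc", "w")
    nc.createDimension("matchup", None)      # one record per match-up
    nc.createDimension("line", 21)           # extracted segment size (placeholder)
    nc.createDimension("elem", 21)
    nc.createDimension("channel", 5)
    nc.createDimension("ir_channel", 3)
    nc.createDimension("prt", 4)

    # 1. smoothed IR calibration information, per scan line of the segment
    nc.createVariable("prt_temperature_smoothed", "f4", ("matchup", "line", "prt"))
    nc.createVariable("bb_counts_smoothed", "f4", ("matchup", "line", "ir_channel"))
    nc.createVariable("space_counts_smoothed", "f4", ("matchup", "line", "ir_channel"))
    # 2. all channel counts
    nc.createVariable("counts", "i2", ("matchup", "channel", "line", "elem"))
    # 3. onboard time, per scan line
    nc.createVariable("onboard_time", "f8", ("matchup", "line"))
    # 4. satellite and solar angles
    nc.createVariable("satellite_zenith", "f4", ("matchup", "line", "elem"))
    nc.createVariable("solar_zenith", "f4", ("matchup", "line", "elem"))
    # 5. visible calibration information
    nc.createVariable("vis_cal_coefficients", "f4", ("matchup", "channel"))
    # 6. original and "near-event" solar contamination flags, per scan line
    nc.createVariable("solar_contamination_flag", "i1", ("matchup", "line"))
    nc.createVariable("solar_contamination_near_flag", "i1", ("matchup", "line"))
    # outputs of the standard Fortran processing (BTs, flags, geolocation)
    nc.createVariable("brightness_temperature", "f4", ("matchup", "ir_channel", "line", "elem"))
    nc.createVariable("latitude", "f4", ("matchup", "line", "elem"))
    nc.createVariable("longitude", "f4", ("matchup", "line", "elem"))
    nc.close()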

Other

JM explained the NCDC CDR programme (http://www.ncdc.noaa.gov/sds/sds-documents.html). Ultimately, an alternative archive of more uniform GAC data will be created, using JM's IR calibration and the other corrections mentioned above.

Next discussions will be at PM2.

Happy Christmas from Chris to everyone in the team!

POST-MEETING NOTE

Further analysis by JM showed a change in calibration of 0.4 K between February 2010 and May 2010 for the NOAA 16 AVHRR, which is also a period of substantial change in instrument temperature. While this may be an extreme case, it does call into question the 3-month rolling window for bias correction that one might otherwise have expected to be reasonable (based on the Pathfinder approach).

Friday 10 December 2010

Met Office Telecon 6

Me, Nick Rayner, Simon Good by Skype
10 Dec 2010


The purpose was to discuss the Product Specification, covering both the process and the actual document. SG and NR had (yesterday) arranged a net meeting with the wider team, and today we went over the outcomes.


NR and SG have been building the specifications up from the URD they previously developed, which we all agree is a coherent and traceable approach. The only issue is where that "working" should be reported, since it does not usually appear in a Product Specification Document, nor is it requested in the ESA statement of work. However, it is clear that it needs to be recorded and available. The favoured option is, as well as producing the formal deliverable (which will be a fairly conventional PSD), to use SG's text so far to make a technical note that gives the PSD its context.


We went through the actual specifications for the SST in the products. The main discrepancy between the user requirements and our original proposal relates to sub-daily variations in SST. Within the baseline proposal, we can treat diurnal variations by adjusting to a reference time that gives a good estimate of the daily mean SST. Since that reference time is close to the time of observation of our most stable reference source, this is a powerful approach. The requirement users have for SST at sub-daily synoptic times (00, 06, 12, 18 UTC) could be accommodated at L4 (gap-filled "analyses" of SST) if the relevant option proposal is funded, but, for purely resource reasons, not otherwise.
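For clarity, the adjustment idea can be sketched as follows; this is only an illustration of the principle, with a toy sinusoidal diurnal cycle standing in for whatever diurnal model is eventually adopted.

    # Illustration of adjusting an observed SST to a reference local time: remove the
    # modelled diurnal excursion at the observation time and add it back at the reference
    # time. The sinusoidal model below is a toy placeholder, not the project's model.
    import math

    def adjust_to_reference_time(sst_obs, t_obs, t_ref, dsst_model):
        """sst_obs in K; t_obs, t_ref in local solar hours; dsst_model(t) gives the
        modelled diurnal excursion (K) relative to the daily mean."""
        return sst_obs - dsst_model(t_obs) + dsst_model(t_ref)

    toy_model = lambda t: 0.2 * math.cos(2 * math.pi * (t - 14.0) / 24.0)  # peak at 14:00 local
    sst_adjusted = adjust_to_reference_time(300.15, t_obs=13.5, t_ref=10.5, dsst_model=toy_model)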


CM reported clarifications from C Donlon regarding format requirements (compatibility with CMIP5 and GDS2.0). Victoria Bennett will play a co-ordinating role for data at the ESA Harwell office, and will be able to advise on these issues, discovery metadata, harmonisation across the CCI, etc. VB is now alerted to the timescale on which SST CCI is running; from our point of view, the relevant WP can be shifted by a few weeks, but not more. VB thinks the CMIP5/GDS2.0 compatibility issue is only relevant at levels L3 and L4.


No further teleconference before PM2 at Met Office.