Thursday, July 29, 2010

White paper 11 - Spatial and temporal interpolation of environmental data

The spatial and temporal interpolation of environmental data white paper is now available for discussion. When making posts, please remember to follow the house rules. Please also take time to read the full PDF before commenting and, where possible, refer to section titles, page numbers and line numbers to make it easy to cross-reference your comment with the document.

Update 8/2: A supplement has also been published for comment and consideration. Please be sure to indicate clearly in your comments whether you are discussing the main white paper or the supplement.

The recommendations from the main white paper are reproduced below:
• The choice of interpolation technique for a particular application should be guided by a full characterization of the input observations and the field to be analyzed. No single technique can be universally applied. It is likely that different techniques will work best for different variables, and it is likely that these techniques will differ on different time scales.

• Data transformations should be used where appropriate to enhance interpolation skill. In many cases, simply transforming the input data by calculating anomalies from a common base period will produce improved analyses; many climate studies have found that separate interpolations of the anomaly and absolute fields (for both temperature and precipitation) work best. (A minimal sketch of this approach follows the recommendations below.)

• With all interpolation techniques, it is imperative to derive uncertainties in the analyzed gridded fields, and these uncertainties should additionally take into account components from observation errors, homogeneity adjustments, biases, and variations in spatial sampling.

• Where fields on different time scales are required, interpolation techniques should incorporate a hierarchy of analysis fields in which the daily interpolated fields average or sum to the monthly interpolated fields. (A sketch of such a consistency adjustment follows the recommendations below.)

• Research to develop and implement improved interpolation techniques, including full spatio-temporal treatments, is required to improve analyses. Developers of interpolated datasets should collaborate with statisticians to ensure that the best methods are used.

• The methods and data used to produce interpolated fields should be fully documented, and guidance should be provided on the suitability of the dataset for particular applications.

• Interpolated fields and their associated uncertainties should be validated, for example by cross-validation against withheld observations. (A sketch follows the recommendations below.)

• The development, comparison and assessment of multiple estimates of environmental fields, using different input data and construction techniques, are essential to understanding and improving analyses.
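
To make the anomaly recommendation concrete, here is a minimal Python sketch. The station data are entirely hypothetical, and inverse-distance weighting (IDW) is used only as a stand-in interpolator; the white paper deliberately does not prescribe a single technique.

```python
# Sketch of the anomaly transformation: interpolate station anomalies
# (departures from a common base-period climatology) rather than absolute
# values, then recombine with the separately interpolated climatology.
# All data are hypothetical; IDW is a stand-in for whatever technique a
# full characterization of the observations would select.
import numpy as np

def idw(values, station_xy, target_xy, power=2.0):
    """Inverse-distance-weighted estimate at target_xy."""
    d = np.linalg.norm(station_xy - target_xy, axis=1)
    if np.any(d == 0):                       # target coincides with a station
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

rng = np.random.default_rng(0)
n_years, n_stations = 50, 5
station_xy = rng.uniform(0, 100, size=(n_stations, 2))      # km, hypothetical
temps = 10 + rng.normal(0, 1, size=(n_years, n_stations))   # degC, hypothetical

base = slice(0, 30)                      # common base period (first 30 years)
climatology = temps[base].mean(axis=0)   # per-station normals
anomalies = temps - climatology          # departures from the normals

target = np.array([50.0, 50.0])
# Interpolate the anomaly field and the absolute (climatology) field
# separately, then recombine at the target location.
anom_at_target = np.array([idw(a, station_xy, target) for a in anomalies])
clim_at_target = idw(climatology, station_xy, target)
estimate = anom_at_target + clim_at_target   # analyzed series at the target
```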
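
The hierarchy-of-fields recommendation can be illustrated the same way: adjust an interpolated daily field so that its monthly mean reproduces the independently interpolated monthly field. The additive adjustment below is an assumption suited to temperature; for quantities that sum, such as precipitation, a multiplicative rescaling of daily totals would be the natural analogue.

```python
# Sketch of reconciling daily analyses with a monthly analysis so that the
# daily fields average to the monthly field (hypothetical data throughout).
import numpy as np

def reconcile_daily_with_monthly(daily, monthly):
    """Shift each grid cell's daily values so their mean equals `monthly`.

    daily:   array of shape (n_days, ny, nx)
    monthly: array of shape (ny, nx)
    """
    offset = monthly - daily.mean(axis=0)   # per-cell discrepancy
    return daily + offset                   # constant shift within the month

rng = np.random.default_rng(1)
daily = 15 + rng.normal(0, 3, size=(30, 4, 4))   # hypothetical daily analysis
monthly = 15 + rng.normal(0, 1, size=(4, 4))     # hypothetical monthly analysis

adjusted = reconcile_daily_with_monthly(daily, monthly)
assert np.allclose(adjusted.mean(axis=0), monthly)   # hierarchy is consistent
```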
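
Finally, one common way to validate an interpolated field, in the spirit of the validation recommendation, is leave-one-out cross-validation: withhold each station in turn, interpolate from the rest, and compare against the withheld observation. Again the data and the IDW interpolator are hypothetical stand-ins, not the white paper's prescribed method.

```python
# Sketch of leave-one-out cross-validation of an interpolation scheme
# (hypothetical data; the same IDW stand-in as in the first sketch).
import numpy as np

def idw(values, station_xy, target_xy, power=2.0):
    d = np.linalg.norm(station_xy - target_xy, axis=1)
    if np.any(d == 0):
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

rng = np.random.default_rng(2)
n = 20
station_xy = rng.uniform(0, 100, size=(n, 2))
obs = 10 + 0.05 * station_xy[:, 0] + rng.normal(0, 0.5, size=n)

errors = []
for i in range(n):
    keep = np.arange(n) != i                 # withhold station i
    pred = idw(obs[keep], station_xy[keep], station_xy[i])
    errors.append(pred - obs[i])

errors = np.asarray(errors)
print(f"bias = {errors.mean():+.2f}  RMSE = {np.sqrt((errors**2).mean()):.2f}")
```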

5 comments:

  1. There is a Supplement paper associated with White Paper (WP) 11, which Richard Smith and I put together. At one point I thought they would be merged, which has led to some uneven referencing in WP 11. The Supplement paper is much more complete, but in order for the WP to stand alone, I'd like to add some more statistics references that credit early work and make the material more accessible.
    Line 94...add to the Cressie and Wikle reference, the following reference:
    Banerjee, S., Carlin, B.P., and Gelfand, A.E. (2004). Hierarchical Modeling and Analysis for Spatial Data. Chapman and Hall/CRC Boca Raton, FL.
    Line 121...add to the Johannesson et al. reference, the following reference:
    Chou, K.C., Willsky, A.S., and Nikoukhah, R. (1994). Multiscale systems, Kalman filters, and Riccati equations. IEEE Transactions on Automatic Control, 39, 479–492.
    Line 202...add the following sentence at the end of the paragraph:
    See the discussion of Haylock et al. (2008) and Hofstra et al. (2008) in the Supplement to this White Paper (Smith and Cressie, 2010).

  2. I'll suggest that you contact this statistician to explore some of the issues with current approaches

    http://statpad.wordpress.com/2010/03/18/anomaly-regression-%E2%80%93-do-it-right/

  3. A climate impacts scientist who would prefer to remain anonymous sent a long comment as a Word document that won't post as a single comment here. I have converted it to a file and posted it here. Liz, my apologies for mis-attributing the white paper lead author, but I did not want to edit more than was strictly necessary to keep the comment anonymous.

  4. Comment received by email with requested anonymity:

    The IPCC Task Group on Data and Scenario Support for Impact and Climate Analysis (TGICA), which hosts the IPCC Data Distribution Centre (DDC):

    http://www.ipcc-data.org/ddc_about.html

    might be able to play a role in supporting the development of Guidelines on this topic. Guidelines qualify as IPCC Supporting Material (i.e. they are not "approved", "adopted" or "accepted" in a full IPCC assessment process), have to be proposed by TGICA, are drafted by experts from TGICA and from the international research community, and are extensively peer reviewed prior to TGICA approval and posting at the DDC. The topic of baseline interpolated observed climate data has been hovering on the Group's radar for a while, is addressed superficially in some of the existing guidance, and probably needs re-visiting at some point. Indeed, a number of the topics covered in the other White Papers are also potentially relevant to the activities of TGICA and the DDC.

  5. If I correctly understand John Coleman, founder of the Weather Channel, the intention of this part of the project is to project temperatures for all of Northern California based primarily on a single station located in San Francisco. Anybody with experience of those temperatures could tell you that this is not enough information to do the job: no matter how sophisticated the statistics, more stations are required, or the results will be of little use.

    The factors which largely drive the San Francisco temperature are greatly different from those which drive temperatures hundreds of miles away. San Francisco is moderated by sea breezes and weather systems arriving from the mid Pacific. Temperatures near Sacramento in the Central Valley region are moderated by inversion layers, persistent cloud cover, and storms, both warm and cold, which frequently bypass San Francisco. Temperatures in the mountain regions are moderated by daily storm formation and storm activity, often arriving from Canada.

    Before you try projecting temperatures for Northern California from a few stations, please try comparing your projections against the on-the-spot temperature records for a couple of historic and recent years. I submit that the number of 20-degree divergences will be enough to prompt re-consideration.

    Robert Matthews
