The near real-time updates white paper is now available for discussion. When making posts, please remember to follow the house rules. Please also take the time to read the full PDF before commenting and, where possible, refer to section titles, page numbers and line numbers to make it easy to cross-reference your comment with the document.
The recommendations are reproduced below:
Recognizing that (i) there is no analog to CLIMAT bulletins on the daily timescale, (ii) daily data are not routinely shared, (iii) no central repository exists because of the lack of a formal process for sharing data with daily resolution, and (iv) the “climatological data” code group in synoptic bulletins does not support the climate community's need for daily summary observations, establish a formal mechanism for the dissemination of daily climate messages, or a requirement for the transmission of daily climate observations with synoptic reports.
Recognizing more pronounced issues of the same sort at the basic instantaneous data level, consider as a secondary priority the feasibility (including data exchange policy issues) of adoption of international mechanisms to standardize the exchange of the highest resolution data.
Recognizing that a limited number of bilateral arrangements (e.g. US-Australia, US-Canada) have proven effective at improving access and near real-time data sharing of daily and sub-daily data, establish efforts at the WMO regional level to expand bilateral arrangements for sharing of daily and sub-daily data to increase data holdings and foster regular updates of global and regional daily data sets.
Recognizing that training programs such as those of the JMA have proven effective at improving NMHS capabilities to provide CLIMAT data, expand training opportunities at the regional and national level to improve the routine and regular dissemination of CLIMAT bulletins from developing nations.
Recognizing the success that GCOS Monitoring Centres and CBS Lead Centres for GCOS have had in improving the quality and quantity of CLIMAT reports, continue to support monitoring of the quality and completeness of CLIMAT transmissions and feedback to data providers. Work to garner commitments for enhanced monitoring and feedback related to synoptic bulletins and daily climate summaries.
Recognizing the efficiencies and flexibility of Table-Driven Code Forms for transferring large amounts of data, their design for ease and efficiency of processing, as well as their cost-effectiveness, encourage and support NMCs' conversion of data transmissions to TDCFs, while at the same time ensuring adequate attention to issues of long-term data homogeneity in support of climate research.
Recognizing that the GTS is not structured to meet newly evolving requirements for the exchange of data in near real-time, and recognizing the 2003 agreement to move to a WMO Information System (WIS) to meet all of the WMO’s information needs, support adoption of WIS technologies and encourage establishment of GISCs and DCPCs.
Footnote 2 on page 3 should be deleted or reworded, due to the discontinuation of CLIMAT TEMPs, following an assessment by GCOS that they were no longer needed for its purposes, and (I believe) a subsequent CCl decision.
1. The list on line 308 has duplicates.
2. This is a general comment, not just relating to this white paper. Some processing (e.g. correction) is mentioned. The source code for this processing must be published and version-controlled, and so should the datasets; in particular, any change to data should be clearly tagged with the exact version and configuration of the code used to make the change. There's a lot about this in white paper 6, and I will make more comments there.
3. "authorized users"? Authorized how and by whom? Are we to be gatekeepers? Is the intention to prevent (or control) commercial use? Or critical use? I advocate and will argue for a completely open approach.
Posted on behalf of:
Hitomi Saito (Ms.)
Climate Prediction Division
Global Environment and Marine Department
Japan Meteorological Agency
The improvement of the quality and quantity of synoptic reports and daily data is essential for monitoring extreme events, and I basically agree with the recommendation (lines 431-435). However, the reality is that it is not easy to improve the quality and quantity even of CLIMAT reports, and it took about a decade to see gradual improvement through the activities of the GCOS Monitoring Centre and the CBS Lead Centre.
Though it is very important to improve the reception rate of SYNOP reports, the number of SYNOP stations is larger than that of CLIMAT stations, and the time interval of SYNOP reporting is shorter than that of CLIMAT. Moreover, the purpose of reporting differs between SYNOP and CLIMAT: SYNOP is for weather monitoring, while CLIMAT is for climate monitoring. It is therefore necessary to consider the technical requirements and to secure the human resources for monitoring and feedback to SYNOP data providers. In other words, a possible framework for effective monitoring and feedback to data providers needs to be examined prudently.
JMA produces weekly reports on extreme climate events (temperature and precipitation).
(Please select “Report” and “Figure”)
However, we feel that data management for SYNOP is much more difficult than that for CLIMAT, and there are still many areas with no SYNOP reporting.
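The reception-rate monitoring discussed in the comment above could be sketched as follows. The station identifiers, expected report counts, and data layout are assumptions made for illustration only.

```python
from collections import defaultdict

def reception_rate(expected, received):
    """Compute per-station reception rates.

    expected: dict mapping station ID -> number of reports expected
              in the monitoring period (e.g. 8 synoptic hours per day)
    received: iterable of (station ID, timestamp) tuples actually received
    """
    counts = defaultdict(int)
    for station, _ts in received:
        counts[station] += 1
    # Cap at 1.0 in case of duplicate transmissions of the same report.
    return {st: min(counts[st], n) / n for st, n in expected.items()}

# Illustrative example with two hypothetical station IDs: one station
# reports all 8 expected times, the other only 6 of 8.
expected = {"47662": 8, "47412": 8}
received = [("47662", h) for h in range(8)] + [("47412", h) for h in range(6)]
print(reception_rate(expected, received))
```

The hard part in practice is not this arithmetic but maintaining the `expected` side, i.e. an authoritative, up-to-date list of stations and their reporting schedules, which is exactly the metadata and human-resource burden the comment describes.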
That would be a nice addition to metadata. I suspect some corporations might be willing to provide either discounts or free contributions of their data products.
Any high-bandwidth updates to databases that allow past data to be added, changed or deleted raise the issue of irreproducibility of results from earlier database contents. How will this be addressed? Will users be expected to download the entire dataset for their work, or will they be able to retrieve a view of "how the database looked at time t on date d"?
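One standard answer to the "how the database looked at time t" question is an append-only, transaction-time store: updates and deletions are recorded as new log entries rather than overwriting old ones, so any past state can be reconstructed. The sketch below is a minimal illustration under that assumption; the class, keys, and values are invented for the example and do not describe any proposed system.

```python
class TemporalStore:
    """Append-only store: every put/delete is logged with its transaction
    time, so earlier database states remain reconstructible."""

    def __init__(self):
        self._log = []  # list of (txn_time, key, value_or_None)

    def put(self, txn_time, key, value):
        self._log.append((txn_time, key, value))

    def delete(self, txn_time, key):
        self._log.append((txn_time, key, None))  # None marks a deletion

    def as_of(self, t):
        """Return the store as it looked at transaction time t."""
        state = {}
        for txn_time, key, value in self._log:
            if txn_time <= t:
                if value is None:
                    state.pop(key, None)
                else:
                    state[key] = value
        return state

# Example: an observation is recorded, then later corrected; both the
# original and the corrected view remain retrievable.
s = TemporalStore()
key = ("68816", "2010-07-14", "tmax")   # hypothetical station/date/variable
s.put(1, key, 31.2)
s.put(2, key, 33.2)                     # later correction
print(s.as_of(1))                       # state before the correction
print(s.as_of(2))                       # state after the correction
```

A real implementation would use a database with temporal support rather than an in-memory log, but the principle is the same: results published against a given transaction time remain reproducible without users downloading the entire dataset.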