Review guidelines – step by step
For future reuse and reinterpretation, users must be assured of the quality of research data. ESSD aims to provide quality assessment for data sets that are already included in permanent repositories.
Thus, when reviewing a paper in ESSD, we would like you to review not just the manuscript but, more importantly, the data set itself. For your guidance, a step-by-step review approach is suggested:
Read the manuscript: are the data and methods presented new? Is there any potential of the data being useful in the future? Are methods and materials described in sufficient detail? Are any references/citations to other data sets or articles missing or inappropriate? Is the article itself appropriate to support the publication of a data set?
Check the data quality: is the data set accessible via the given identifier? Is the data set complete? Are error estimates and sources of errors given (and discussed in the article)? Are the accuracy, calibration, processing, etc. state of the art? Are common standards used for comparison? Is the data set significant – unique, useful, and complete?
Consider article and data set together: are there any inconsistencies within these, implausible assertions or data, or noticeable problems which would suggest the data are erroneous (or worse)? If possible, apply tests (e.g. statistics). Unusual formats or other circumstances which impede such tests in your discipline may raise suspicion. Is the data set itself of high quality?
Check the presentation quality: is the data set usable in its current format and size? Are the formal metadata appropriate? Check the publication: is the length of the article appropriate? Is the article well structured and clear? Is the language consistent and precise? Are mathematical formulae, symbols, abbreviations, and units correctly defined and used? Are figures and tables correct and of high quality? Is the data set publication, as submitted, of high quality?
Finally: by reading the article and downloading the data set, would you be able to understand and (re-)use the data set in the future?
Reviewers are asked to decide how well the respective data sets presented by an article and the article itself meet the following criteria (rated 1–4, excellent–poor):
Is there any potential of the data being useful? This is clearly the most important decision. There are at least three sub-criteria to evaluate:
- Uniqueness: it should not be possible to replicate the experiment or observation on a routine basis. Thus, any data set on a variable supposed or suspected to reflect changes in the Earth system deserves to be considered unique. The same holds for cost-intensive data sets that will not be replicated for financial reasons. A new or improved method should not be trivial or obvious.
- Usefulness: it should be plausible that the data, alone or in combination with other data sets, can be used in future interpretations, for comparison with model output, or to verify other experiments or observations. Other possible uses mentioned by the authors will be considered.
- Completeness: a data set or collection must not be split intentionally, for example, to increase the possible number of publications. It should contain all data that can be reviewed without an unnecessary increase in workload and that can be reused in another context by a reader.
The data must be readily accessible for inspection and analysis so that the reviewer's task is possible. Even if a data set submitted is the first ever published (on a parameter, in a region, etc.), its claimed accuracy, the instrumentation employed, and the methods of processing should reflect the "state of the art" or "best practices". Considering all conditions and influences presented in the article, these claims and factors must be mutually consistent. The reviewer will then apply his or her expert knowledge and operational experience in the specific field to perform tests (e.g. statistical tests) and judge whether the claimed findings and their underlying factors – individually and as a whole – are plausible and do not contain detectable faults.
Long articles are not expected. Regarding style, the aim is to develop standardized wording so that unambiguous meaning can be expressed and understood without much effort. The article should express clearly what has been found, where, when, and how. The article text and references should contain all information necessary to evaluate all claims about the data set or collection, whether the claims are explicitly written down in the article or implicit, through the data being published or their metadata. The authors should point to suitable software or services for simple visualization and analysis, keeping in mind that neither the reviewer nor the casual "reader" will install or pay for them.
Access review, peer review, and interactive public discussion (ESSDD)
Manuscripts submitted to ESSD first undergo a rapid access review by the topic editor (initial manuscript evaluation). This is not meant to be a full scientific review but serves to identify and sort out manuscripts with obvious major deficiencies in view of the principal evaluation criteria above.
Manuscripts that are not immediately rejected are posted on the Earth System Science Data Discussions (ESSDD) website, the discussion forum of ESSD, where they are subject to full peer review and interactive public discussion.
Peer-review completion (ESSD)
At the end of the interactive public discussion, the authors may make their final response and submit a revised manuscript. Based on the referee comments, other relevant comments, and the authors' response in the public discussion, the revised manuscript is re-evaluated and rated by the topic editor. If rated excellent or good in all of the principal criteria and specific aspects listed above, the revised manuscript will normally be accepted for publication in ESSD. If the public discussion in ESSDD is not sufficiently conclusive, the topic editor will request additional advice from the referees in the evaluation and rating of the revised manuscript.