Thunder Bay National Marine Sanctuary
2013 Condition Report

[Photo: a diver and a shipwreck]

Developing the Report

The process for preparing condition reports combines accepted techniques for collecting and interpreting information from subject matter experts. The approach varies somewhat from sanctuary to sanctuary to accommodate differing styles of working with partners. The Thunder Bay National Marine Sanctuary approach was closely related to the Delphi Method, a technique designed to organize group communication among a panel of geographically dispersed experts through questionnaires, ultimately facilitating the formation of a group judgment. The method is suited to situations in which decision-makers must combine the testimony of a group of experts, whether facts, informed opinion, or both, into a single useful statement.

The Delphi Method relies on repeated interactions with experts, who respond to questions with a limited number of choices in order to arrive at the best-supported answers. Feedback to the experts allows them to refine their views, gradually moving the group toward consensus. For condition reports, the Office of National Marine Sanctuaries uses 17 questions related to the status and trends of sanctuary resources, each with an accompanying description and five possible choices that describe resource conditions (Appendix A).
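
To make the mechanics of that feedback loop concrete, the sketch below (in Python) simulates one question moving through Delphi-style rounds. It is illustrative only: the use of the group median as feedback, the interquartile-range stopping rule, and the sample ratings are assumptions made for this example, not the sanctuary's actual procedure.

    from statistics import median, quantiles

    def summarize_round(ratings):
        """Summarize one round of expert ratings on a five-point scale.

        Returns the group median (the feedback reported back to experts)
        and the interquartile range (a common measure of convergence).
        """
        q1, _, q3 = quantiles(ratings, n=4)
        return median(ratings), q3 - q1

    # Hypothetical ratings for a single question across two rounds.
    round_1 = [2, 3, 3, 4, 5]  # initial, dispersed opinions
    round_2 = [3, 3, 3, 4, 4]  # revised after seeing group feedback

    for label, ratings in (("round 1", round_1), ("round 2", round_2)):
        med, iqr = summarize_round(ratings)
        converged = iqr <= 1   # illustrative stopping rule
        print(f"{label}: median={med}, IQR={iqr}, converged={converged}")

In the sanctuary's workshops, of course, convergence was judged by a human facilitator rather than by a statistic, as described below.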

In order to address the 17 questions, sanctuary staff selected and consulted outside experts familiar with water quality, living resources, habitat, and maritime archaeological resources. A small workshop was convened in March 2010, where experts from the NOAA Great Lakes Environmental Research Laboratory (GLERL) participated in facilitated discussions about each of the 17 questions. At the workshop, each expert was introduced to the questions and asked to provide recommendations and supporting arguments, and the group supplemented this input with further discussion. To ensure consistency with Delphic methods, a critical role of the facilitator was to minimize dominance of the discussion by any single individual or opinion (which often leads to "follow the leader" tendencies in group meetings) and to encourage the expression of honest differences of opinion. As discussions progressed, the group converged on the rating that most accurately described the current resource condition. After an appropriate amount of time, the facilitator asked whether the group could agree on a rating for the question, as defined by the specific language linked to each rating (see Appendix A). If agreement was reached, the result was recorded and the group moved on to consider the trend in the same manner. If agreement was not reached, the facilitator instructed sanctuary staff to consider all input, decide on a rating and trend at a future time, and send their ratings back to workshop participants for individual comment.

Experts at the workshop were also given the opportunity to qualify their level of confidence in the status and trend ratings by characterizing the sources of information used to make their judgments. A ranking of information quality was provided for three categories: data, literature, and personal experience. For each status or trend rating, the experts scored the quality of the information available in each category.

Level of Confidence

1 - High Uncertainty: No data are available, and no substantive personal experience.
2 - Speculative: Few data and little information available, and limited personal experience.
3 - Reasonable Inference: Some data available, unpublished or in non-peer-reviewed sources, or some direct personal experience.
4 - Moderate Certainty: Data available, some peer-reviewed publications exist, or direct personal experience.
5 - High Certainty: Considerable data available, extensive record of publication, or extensive personal experience or expertise.
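
For readers who want to work with the ratings programmatically, the rubric transcribes directly into a lookup table; the short Python snippet below is simply that transcription and is not part of the report's methodology.

    # The five-point confidence rubric above, transcribed verbatim.
    CONFIDENCE_SCALE = {
        1: "High Uncertainty",
        2: "Speculative",
        3: "Reasonable Inference",
        4: "Moderate Certainty",
        5: "High Certainty",
    }

    print(f"A score of 3 indicates: {CONFIDENCE_SCALE[3]}")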


The scores compiled during the workshop were as follows:

Question    Data    Literature    Personal Experience
1           2       2             3
2           3       3             3
3           3       1             1
4           3       1             1
5           N/A     N/A           N/A
6           N/A     N/A           N/A
7           N/A     N/A           N/A
8           N/A     N/A           N/A
9           N/A     N/A           N/A
10          N/A     N/A           N/A
11          4       4             4
12          N/A     N/A           N/A
13          N/A     N/A           N/A
14          3       3             3
15          3       2             3
16          3       2             3
17          2       1             3
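
As a worked illustration only, the table above can be tabulated to show, for each rated question, its strongest supporting source of information. The variable names and the choice of summary in this Python sketch are assumptions, not part of the report:

    # Workshop confidence scores per question, in the order
    # (data, literature, personal experience); questions marked
    # N/A in the table above are omitted.
    SCORES = {
        1: (2, 2, 3),   2: (3, 3, 3),   3: (3, 1, 1),
        4: (3, 1, 1),   11: (4, 4, 4),  14: (3, 3, 3),
        15: (3, 2, 3),  16: (3, 2, 3),  17: (2, 1, 3),
    }
    CATEGORIES = ("data", "literature", "personal experience")

    for question, scores in sorted(SCORES.items()):
        best = max(scores)
        sources = [c for c, s in zip(CATEGORIES, scores) if s == best]
        print(f"Q{question}: strongest support = {best} ({', '.join(sources)})")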


The first draft of the document summarized the opinions and uncertainty expressed by the experts, who based their input on knowledge and perceptions of local conditions. Comments and citations received from the experts were included, as appropriate, in text supporting the ratings.

The first draft of the document was sent to the subject experts from GLERL who attended the workshop for what was called an initial review: a four-week period during which the experts could confirm that the report accurately reflected their input, identify information gaps, provide comments, and suggest revisions to the ratings and text. During this four-week period, the report was also distributed to representatives from the NOAA National Marine Fisheries Service, NOAA Office of National Marine Sanctuaries, NOAA Marine Debris Program, and Michigan Department of Natural Resources. These individuals were asked to review the technical merits of the resource ratings and accompanying text, as well as to point out any omissions or factual errors. Upon receiving reviewer comments, the writing team revised the text and ratings as they deemed appropriate.

A draft final report was then sent for external peer review, a requirement dating to December 2004, when the White House Office of Management and Budget (OMB) issued its Final Information Quality Bulletin for Peer Review (OMB Bulletin), establishing peer review standards intended to enhance the quality and credibility of the federal government's scientific information. Among other categories of information, these standards apply to Influential Scientific Information: information that can reasonably be determined to have a "clear and substantial impact on important public policies or private sector decisions." Condition reports are considered Influential Scientific Information and are therefore subject to the review requirements of both the Information Quality Act and the OMB Bulletin guidelines. Accordingly, every condition report is reviewed by a minimum of three individuals who are considered experts in their field, were not involved in the development of the report, and are not ONMS employees. Comments from these peer reviews were incorporated into the final text of the report. OMB Bulletin guidelines further require that reviewer comments, names, and affiliations be posted on the agency website; reviewer comments, however, are not attributed to specific individuals. Comments by the external peer reviewers are posted at the same time as the formatted final document.

Following the external peer review, the comments and recommendations of the reviewers were considered by sanctuary staff and incorporated, as appropriate, into a final draft document. In some cases, sanctuary staff reevaluated the status and trend ratings and, where appropriate, edited the accompanying text to reflect the new ratings. The final interpretation, ratings, and text in the draft condition report were the responsibility of sanctuary staff, with final approval by the sanctuary manager. To emphasize this important point, authorship of the report is attributed to the sanctuary alone; subject experts are not listed as authors, though their efforts and affiliations are acknowledged in the report.