MLMA Master Plan Appendix Q. Peer Review under the Marine Life Management Act

This appendix draws information and conclusions from a report by OST prepared during the information-gathering phase of the Master Plan amendment process (OST 2017). It provides best practices and resources to help managers plan for and navigate the peer review process, including a peer review checklist, terms of reference (TOR) and a sample report template, and a summary of scientific peer reviews of Department work products from the period 2001 – 2017 (see Table Q1). As with the other appendices, it is anticipated that this overview will continue to be expanded and refined as part of Master Plan implementation so it can serve as an effective resource for managers and stakeholders.

Best practices for common work products

Draft Fisheries Management Plan review

As discussed in Chapter 10, the scientific components of FMPs are subject to external peer review. Scientific analyses, including stock assessments, should be peer reviewed before they are used as a basis for identifying management strategies. Review of methodologies, complex models, or stock assessments supporting an FMP should occur separately from review of a full draft FMP.

Review of a complete draft FMP should occur late in development, once a full, high-quality draft is completed, and preferably before public comment so that the science has been reviewed and any issues addressed. Reviewers should not serve as FMP development teams or advisory committees.

Following the operating procedures of the PFMC, an FMP peer review should evaluate statistical, biological, economic, social, and other scientific information, as well as analyses, analytical methodologies, literature, and research relevant to decision-making. Rather than a line-by-line assessment, an FMP review should consider addressing the following questions:

  • Do the scientific and technical components within and supporting the FMP form a rigorous framework that can support sound fishery management decisions?
  • Are there critical discussions or literature that should be factored into the FMP that would substantially strengthen the document?
  • Are the models’ interpretations technically sound, appropriate, and supported by the best available data?
  • Are the proposed reference points within and supporting the FMP scientifically sound and supported by the best available data? Are the thresholds sufficient and appropriate for identifying important changes/trends in stock status?
  • Are research and monitoring needs comprehensive enough to allow the Department to collect and maintain EFI necessary to achieve management targets for the stock? Are there any priority gaps in research and monitoring that should be addressed or included?

If the FMP is at the draft stage and the supporting scientific analyses, models, and methods have already been reviewed, the draft may not require an intensive technical review, and a written review may be more appropriate. A follow-up webinar and/or workshop review could be conducted if significant concerns emerge during the review process.

Methodology reviews

Methodology reviews are appropriate when a major new data source is introduced, when a new tool is developed for consideration in management, or when a major change is made to a method or model. Ideally, the scientific and technical merits of a new methodology proposed for use should be reviewed before the methodology is applied in an FMP or other management work product to help ensure any issues have been resolved. A reviewed model can then be included in an “accepted” toolbox for use in fishery management, and any future application will not need the same level of review, unless there are exceptional circumstances.

The scope of a methodology review will vary depending on the work product under review, but should consider addressing the following questions:

  • Are the analytical methods used appropriate and technically sound?
  • Are the research, data collection, and analyses comprehensive and representative of the best available science, and do they support the methodology?
  • If it is a new methodology proposed for use, how does it improve upon existing approaches, and how can it be applied in support of management targets for the stock?
  • What research and/or monitoring are needed to improve the methodology in the future?

Remote panel reviews, panel workshops, and/or journal peer reviews are the modes of peer review most appropriate for methodologies, since methodologies tend to be novel, untested, and potentially subject to controversy.

Stock assessment and Management Strategy Evaluation reviews

Stock assessments use fishery-dependent and fishery-independent data to describe the past and current status of a fish population or stock, helping managers predict how a fishery will respond to current and future management measures. Management Strategy Evaluations (MSEs) are simulations that compare different combinations of data collection efforts, methods of analysis, and subsequent management actions in order to identify an appropriate strategy or to understand the effectiveness or associated risk of an existing management strategy. Stock assessments have been completed for only a handful of marine species in California due to the resource-intensive nature of the exercise and the data it requires. However, as more data-poor, rapid stock assessment and MSE methods become available, the Department will likely conduct more frequent assessments and evaluations that require peer review. A stock assessment and/or MSE review may consider posing the following questions to the review team (an illustrative simulation sketch follows the list below):

  • Are the underlying assumptions, data inputs, model parameters, and other pertinent information scientifically sound and appropriate?
  • Are additional sensitivity runs, analyses, or data required to support the peer review process?
  • Does the stock assessment or MSE represent the best available scientific information to inform the development of HCRs? Are there any deficiencies in the input data or analytical methods?
  • What additional research and monitoring are needed to improve the assessment and fishery management in the future?
  • What data sets were considered and rejected for the final model, and why were they rejected?
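To make the closed-loop structure of an MSE concrete, the sketch below simulates a simple stock under two candidate harvest control rules (HCRs) and compares average outcomes across simulations. It is a minimal illustration only: the Schaefer surplus production model, all parameter values, the lognormal observation error, and both example rules are assumptions chosen for demonstration, not methods used by the Department or prescribed by this Master Plan.

```python
# Minimal, illustrative MSE closed loop (hypothetical values throughout).
# Operating model: Schaefer surplus production,
#   B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]
import random
from statistics import mean

R, K, B0, YEARS = 0.3, 1000.0, 500.0, 50  # assumed stock parameters


def constant_f_hcr(b_est):
    """Fixed exploitation rate regardless of estimated biomass."""
    return 0.15 * b_est


def threshold_hcr(b_est):
    """'Hockey-stick' rule: scale harvest down linearly below a threshold."""
    target_rate, threshold = 0.15, 0.4 * K
    return target_rate * min(1.0, b_est / threshold) * b_est


def run_mse(hcr, obs_cv=0.2, n_sims=200, seed=1):
    """Management loop: observe with error -> set catch via HCR -> update stock."""
    random.seed(seed)
    final_biomass, total_catch = [], []
    for _ in range(n_sims):
        b, catch_sum = B0, 0.0
        for _ in range(YEARS):
            b_est = b * random.lognormvariate(0.0, obs_cv)  # "assessment" with error
            c = min(hcr(b_est), b)                          # catch set by the HCR
            b = max(b + R * b * (1 - b / K) - c, 1e-6)      # Schaefer dynamics
            catch_sum += c
        final_biomass.append(b)
        total_catch.append(catch_sum)
    return mean(final_biomass), mean(total_catch)


for name, rule in [("constant-F", constant_f_hcr), ("threshold", threshold_hcr)]:
    b_end, c_tot = run_mse(rule)
    print(f"{name}: mean final biomass = {b_end:.0f}, mean total catch = {c_tot:.0f}")
```

Even a toy loop like this surfaces the trade-off a review panel probes: a threshold rule is designed to give up some catch when the stock appears depleted in order to speed rebuilding. A real MSE would test many more scenarios, uncertainty sources, and performance metrics than this sketch does.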

The mode of peer review most appropriate for a stock assessment or MSE is a panel workshop, given the need for group discussion and additional data analyses. In addition to reviewers, stock assessment and MSE review workshops often include the FMP management team and Department scientists, as well as additional stock assessment and MSE experts. Stock assessment review processes are well established in federal fisheries management; groups like SouthEast Data, Assessment, and Review (SEDAR) and the NOAA/PFMC Stock Assessment Review panels may provide informative examples of successful approaches that vary in detail and in the level of time and analyses required.

Review of science supporting focused rulemaking or routine management measures

Routine management measures are those likely to be adjusted annually or more frequently, and may include changes to conservation area boundaries, trip limits, bag limits, and size limits, among other measures. The science supporting these measures has often been previously reviewed or relies on expert judgment. Given the need for timeliness, the mode of peer review most appropriate for science supporting focused rulemaking or routine management measures may vary; often it will be internal review or external expert written review, depending on the significance and implications of the rulemaking. For controversial issues, the Department should determine whether the benefits of a panel review with public, stakeholder, and agency input justify the costs of the more extensive process.

Additional considerations

Stakeholder buy-in to a review process and its outputs may be particularly important for highly politicized, controversial, or sensitive fisheries. Understanding who the key stakeholders are and how they are likely to react to a review can help identify the best ways to engage them in the process. The Department should consider whether a transparent process should be applied consistently across all reviews, or whether stakeholder involvement should be determined on a case-by-case basis depending on the needs of a review. See Appendix G for strategies regarding stakeholder engagement.

Terms of reference and sample report template

TOR documents outline general procedures and responsibilities that contributors should aim to adhere to when conducting a formal process such as developing and peer-reviewing a work product. A TOR is typically developed for each type of review (e.g., stock assessment review, methodology review) and for each fishery. TOR documents detail the objectives, approaches, reporting requirements, and responsibilities of participants. They are made publicly available to enhance transparency. Each individual review will likely have unique requirements that can be defined in a specific TOR document or scope of work.

Drawing on the experience of the PFMC, the Department should develop TORs that include information on:

  • Review process goals and objectives.
  • Roles and responsibilities of participants.
  • Structure and qualifications of the review panel participants.
  • Structure of meetings and/or workshops.
  • Process for requesting additional data or analyses.
  • Guidelines for dealing with uncertainty and areas of disagreement.
  • Guidance on structure of the review report (see below).


General Fisheries Peer Review Checklist

Below is a checklist that should be used by the Department and review coordinating bodies to plan for a peer review process. Because end products are often time sensitive and timelines often shift, review coordinators should maintain a high level of flexibility.

Peer Review Scoping

4-6 months prior to start of a review

Department

  • ▢ Determine whether the product is subject to or exempt from review
  • ▢ If review is required, determine whether the review is internal or external
  • ▢ If external, contract with an appropriate review coordinating body

1-2 months prior to start of review

Department

  • ▢ Deliver draft report to review coordinating body

Review Coordinating Body

  • ▢ Work with the Department to develop a specific TOR or scope of work indicating:
    • Mode and level of review
    • Roles and responsibilities of all parties involved in the review
    • Process, timeline, and budget
    • Level of stakeholder involvement
    • Required reviewer expertise and appropriate number of reviewers
    • Product(s) from the review
  • ▢ Select and convene reviewers
  • ▢ Have reviewers complete and sign a conflict of interest policy and a non-disclosure agreement (if required)
  • ▢ Develop review instructions based on draft report and specific TOR
  • ▢ Develop collateral (e.g., webpage, communication materials, stakeholder listserv)

Conduct Peer Review

Reviews take from 6 weeks to several months

Review Coordinating Body

  • ▢ Distribute specific TOR, review materials, and review instructions to reviewers
  • ▢ Administer review based on mode selected (e.g., individual written reviews, panel workshop, etc.)
  • ▢ Gather requests for additional data and analyses and submit them to the Department
  • ▢ Develop draft product(s)
  • ▢ Manage reviewers’ approval of/sign-off on the final product
  • ▢ Deliver product to the Department for a management preview prior to public release
  • ▢ When appropriate, conduct a results briefing with the client and/or stakeholders
  • ▢ Post final report online and distribute to interested partners and stakeholders

Peer Review Follow-up

Revisions to the product under review may occur from several weeks to several months after delivery of the review report

Review Coordinating Body

  • ▢ Facilitate discussions between reviewers and the Department as they consider review feedback and revise the work product
  • ▢ Where appropriate, present results of review in a public meeting (e.g., Commission public meeting)
  • ▢ Work with the Department to develop text to include in the final work product that appropriately represents the review process and outcomes
Table Q1. Summary of scientific peer reviews of Department work products from the period 2001 – 2017 (adapted from OST 2017).

| Work product reviewed | Review year | Review type | Coordinating entity | Review format | Public participation | Number of reviewers | Review output |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Draft Nearshore FMP | 2001 | FMP | Sea Grant | 1-day workshop | None | 6 | Individual written reports, consolidated report |
| Draft White Sea Bass FMP | 2001 | FMP | Sea Grant | 1-day workshop | None | 4 | Individual written reports, consolidated report |
| Draft Market Squid FMP | 2002 | FMP | Sea Grant | 2-day workshop | None | 5 | Compiled summary report written by review panel (internal) |
| Draft Abalone Recovery and Management Plan | 2002 | FMP | Sea Grant | 2-day workshop | None | 4 | Compiled summary report from California Sea Grant (internal) |
| Model Supporting the Herring Stock Assessment | 2003 | Methodology | Sea Grant | 2-day workshop | None | 3 | Peer Review Report (PDF) |
| Sheephead Stock Assessment | 2004 | Stock assessment | Department | Meeting | Unknown | 3 | Peer Review Report (PDF) |
| California Halibut Assessment | 2011 | Stock assessment | Department | 3-day workshop | Workshop open to public (with public comment) | 3 | Peer Review Report (PDF) |
| Spiny Lobster Stock Assessment | 2011 | Stock assessment | Department | 2-day workshop | None | 3 | Peer Review Report (PDF) |
| Abalone Density Estimation Method | 2014 | Methodology | OST | Multiple remote meetings and a 1-day workshop | Several remote meetings open to public (with public comment) | 6 | Peer Review Report (PDF) |
| Draft Spiny Lobster FMP | 2015 | FMP | OST | Multiple remote meetings | None | 4 | Peer Review Report (PDF) |
| White Seabass Stock Assessment | 2016 | Stock assessment | Pfleger Institute | 2-day workshop | Workshop open to public (with public comment) and many participants | 2 | Peer Review Report (PDF) |
| Pacific Herring Stock Assessment | 2016/2017 | Stock assessment | Department | 2-day workshop | No public | 3 | In progress |