The Peer Review Coordinator and the Collegiality Index

As we sought to map out the design and functionality of the PPJ with colleagues at Matrix a few weeks ago, we began to suggest how the disciplinary economy of open peer review might be navigated in ways that at once ensure rigor and maximize collegiality.

To do this, it will be important to approach the review process not simply as a means to a final scholarly publication, but as an important scholarly activity in its own right. 1

To facilitate this, we intend to assign each review target a peer review coordinator (PRC) whose responsibilities would include, among other things:

  1. Identifying reviewers with the requisite expertise;
  2. Cultivating a climate of collegiality between the reviewers and the author;
  3. Establishing review criteria, including target-specific review prompts (beyond the standard review prompts adopted by the journal);
  4. Creating the conditions for a just review, including, if necessary, toggling on reviewer anonymity;
  5. Facilitating the discussion forum associated with the review to ensure that the most salient and substantive ideas and suggestions have the most influence.

Obviously, with responsibilities like these, it will be necessary (and difficult) to cultivate the requisite habits of digital scholarly communication among the members of the PPJ community. As a start, we envision developing a community of PRCs first among the Philosophy graduate student research assistants at Penn State and Michigan State. But if the PPJ is to be successful, and if we are going to be able to scale up our capacity for open public peer review, we will need to extend our community of coordinators more broadly.

To do this, we envision creating a sophisticated system of credentialing that will be translated into a PPJ user score for each member of the PPJ community. What, precisely, will constitute the PPJ user score will be developed in the months to come in conversation with an emerging community of interested colleagues inside and outside the academy.

One measure, however, that should be an important determining factor of the PPJ user score is something that we might call one’s “collegiality index.”

Drawing on the work of Hart-Davidson, McLeod, Klerkx, and Wojcik on measuring “helpfulness” in online peer review, we hope to develop such an index. They suggest that a helpful review:

  1. Describes the rhetorical moves a scholar makes to achieve rhetorical aims;
  2. Accurately and fairly evaluates the review target; and
  3. Provides “specific, actionable advice” to improve the target of review. 2

Similarly, we might consider operationalizing the collegiality score according to how well a reviewer is able to:

  1. Accurately describe what animates the scholarship under review, thus demonstrating a capacity for hermeneutical empathy;
  2. Evaluate the review target on its own terms, thus demonstrating a capacity for hermeneutical generosity;
  3. Engage the community in ways that enrich the scholarship under review, thus demonstrating a capacity for hermeneutical transformation.

A community member’s “collegiality index” would be determined over time based on past collegiality scores and would be integrated into the PPJ user profile to become part of the member’s cultivated reputation. The hope is that by integrating a measurable expectation of collegiality into the fabric of the PPJ itself, we will be able to cultivate a Network of Scholarly Practice capable of creating the conditions under which excellent scholarship can be produced and productive scholars can become excellent.
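To make the idea concrete, the three dimensions above and their aggregation over time can be sketched in code. This is a minimal illustration, not the PPJ's implementation: the class names, the 0–5 rating scale, and the equal weighting of the three dimensions are all assumptions introduced here for the sake of the example.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class CollegialityScore:
    """A reviewer's score for one review, on the three proposed
    dimensions (each rated here on a hypothetical 0-5 scale)."""
    empathy: float         # accurately describes what animates the work
    generosity: float      # evaluates the work on its own terms
    transformation: float  # engages the community to enrich the work

    def overall(self) -> float:
        # Equal weighting is an assumption; the PPJ might weight
        # the dimensions differently.
        return mean((self.empathy, self.generosity, self.transformation))

@dataclass
class MemberProfile:
    """A community member's profile, accumulating scores over time."""
    name: str
    scores: list = field(default_factory=list)

    def record(self, score: CollegialityScore) -> None:
        self.scores.append(score)

    def collegiality_index(self) -> float:
        """Running average of past overall scores (0.0 with no history)."""
        if not self.scores:
            return 0.0
        return mean(s.overall() for s in self.scores)

# Usage: two reviews contribute to one member's evolving index.
member = MemberProfile("reviewer-01")
member.record(CollegialityScore(empathy=4.0, generosity=3.5, transformation=4.5))
member.record(CollegialityScore(empathy=5.0, generosity=4.0, transformation=4.0))
print(round(member.collegiality_index(), 2))  # 4.17
```

A running average is only one design choice; a real system might weight recent reviews more heavily so that an index reflects a member's current habits rather than their whole history equally.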

  1. For the idea that review should be valued “itself as a teachable and learnable activity,” see Hart-Davidson, William, Michael McLeod, Christopher Klerkx, and Michael Wojcik. “A Method for Measuring Helpfulness in Online Peer Review.” In Proceedings of the 28th ACM International Conference on Design of Communication, 115–121. SIGDOC ’10. New York, NY, USA: ACM, 2010, §2.1. doi:10.1145/1878450.1878470.
  2. Ibid., §3.