Network Working Group R. Sparks
Internet-Draft Oracle
Intended status: Informational T. Kivinen
Expires: February 8, 2016 INSIDE Secure
August 7, 2015
Tracking Reviews of Documents
draft-sparks-genarea-review-tracker-03
Abstract
Several review teams ensure specific types of review are performed on
Internet-Drafts as they progress towards becoming RFCs. The tools
used by these teams to assign and track reviews would benefit from
tighter integration to the Datatracker. This document discusses
requirements for improving those tools without disrupting current
work flows.
Status of This Memo
This Internet-Draft is submitted in full conformance with the
provisions of BCP 78 and BCP 79.
Internet-Drafts are working documents of the Internet Engineering
Task Force (IETF). Note that other groups may also distribute
working documents as Internet-Drafts. The list of current Internet-
Drafts is at http://datatracker.ietf.org/drafts/current/.
Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time. It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."
This Internet-Draft will expire on February 8, 2016.
Copyright Notice
Copyright (c) 2015 IETF Trust and the persons identified as the
document authors. All rights reserved.
This document is subject to BCP 78 and the IETF Trust's Legal
Provisions Relating to IETF Documents
(http://trustee.ietf.org/license-info) in effect on the date of
publication of this document. Please review these documents
carefully, as they describe your rights and restrictions with respect
to this document. Code Components extracted from this document must
include Simplified BSD License text as described in Section 4.e of
the Trust Legal Provisions and are provided without warranty as
described in the Simplified BSD License.
Table of Contents
1. Introduction
2. Overview of current workflows
3. Requirements
   3.1. Secretariat focused
   3.2. Review-team Secretary focused
   3.3. Reviewer focused
   3.4. Review Requester and Consumer focused
   3.5. Statistics focused
4. Security Considerations
5. IANA Considerations
6. Acknowledgments
7. Changelog
   7.1. 03
   7.2. 02
   7.3. 01
   7.4. 00
8. Informative References
Appendix A. A starting point for Django models supporting the review tool
Appendix B. Suggested features deferred for future work
Authors' Addresses
1. Introduction
As Internet-Drafts are processed, reviews are requested from several
review teams. For example, the General Area Review Team (Gen-ART)
and the Security Directorate (Secdir) perform reviews of documents
that are in IETF Last Call. Gen-ART always performs a follow-up
review when the document is scheduled for an IESG telechat. Secdir
usually performs a follow-up review, but the Secdir secretary may
choose not to request that follow-up if any issues identified at Last
Call are addressed and there are otherwise no major changes to the
document. These teams also perform earlier reviews of documents on
demand. There are several other teams that perform similar services,
often focusing on specific areas of expertise.
The secretaries of these teams manage a pool of volunteer reviewers.
Documents are assigned to reviewers, taking various factors into
account. For instance, a reviewer will not be assigned a document
for which he is an author or shepherd. Reviewers are given a
deadline, usually driven by the end of last call or an IESG telechat
date. The reviewer sends each completed review to the team's mailing
list, and any other lists that are relevant for the document being
reviewed. Often, a thread ensues on one or more of those lists to
resolve any issues found in the review.
The secretaries and reviewers from several teams are using a tool
developed and maintained by Tero Kivinen. Much of its design
predates the modern Datatracker. The application currently keeps its
own data store, and learns about documents needing review by
inspecting Datatracker and tools.ietf.org pages. Most of those pages
are easy to parse, but the last-call pages, in particular, require
some effort. Tighter integration with the Datatracker would simplify
the logic used to identify documents ready for review, make it
simpler for the Datatracker to associate reviews with documents, and
allow users to reuse their Datatracker credentials. It would also
make it easier to detect other potential review-triggering events,
such as a document entering working group last call, or when an
RFC's standards level is being changed without revising the RFC.
Tero
currently believes this integration is best achieved by a new
implementation of the tool. This document captures requirements for
that reimplementation, with a focus on the workflows that the new
implementation must take care not to disrupt. It also discusses new
features, including changes suggested for the existing tool at its
issue tracker [art-trac].
For more information about the various review teams, see the
following references:
+--------------+---------------------+
| Gen-ART | [Gen-ART] [RFC6385] |
| Secdir | [Secdir] |
| AppsDir | [AppsDir] |
| OPS-dir | [OPS-dir] |
| RTG-dir | [RTG-dir] |
| MIB Doctors | [MIBdoctors] |
| YANG Doctors | [YANGdoctors] |
+--------------+---------------------+
2. Overview of current workflows
This section gives a high-level overview of how the review team
secretaries and reviewers use the existing tool. It is not intended
to be comprehensive documentation of how review teams operate.
Please see the references for those details.
For many teams, the team's secretary periodically (typically once a
week) checks the tool for documents it has identified as ready for
review. The tool has compiled this list from Last Call announcements
and IESG telechat agendas. The secretary creates a set of
assignments, matching documents from this list to members of the
reviewer pool and choosing the
reviewers in roughly a round-robin order. That order can be
perturbed by several factors. Reviewers have different levels of
availability. Some are willing to review multiple documents a month.
Others may only be willing to review a document every other month.
The assignment process takes exceptional conditions such as reviewer
vacations into account. Furthermore, secretaries are careful not to
assign a document to a reviewer who is an author, shepherd, or
responsible WG chair, or who has some other existing association
with the document; the preference is to get a reviewer with a fresh
perspective. The secretary may discover reasons to change
assignments while going through the list of documents. To avoid
causing a reviewer to make a false start on a review, the
secretaries complete the full list of assignments before sending
notifications to anyone. This assignment process can take several
minutes, and it is possible for new last calls to be issued while the
secretary is making assignments. The secretary typically checks to
see if new documents are ready for review just before issuing the
assignments, and updates the assignments if necessary.
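
As a non-normative illustration of the rotation walk described
above, an assignment helper might look roughly like the following,
using the Reviewer model proposed in Appendix A. The has_conflict()
and last_assignment() helpers are assumptions standing in for
Datatracker role checks and assignment history, not existing code.

# Non-normative sketch: walk a team's rotation and return the next
# reviewer who can take the document.  has_conflict() and
# last_assignment() are assumed helpers; Reviewer is the model
# proposed in Appendix A.
import re
from datetime import timedelta
from django.utils import timezone

def next_reviewer(doc, rotation):
    """rotation: the team's Reviewer records in round-robin order."""
    now = timezone.now()
    for reviewer in rotation:
        # Skip documents matching the reviewer's filter expression.
        if reviewer.filter_re and re.search(reviewer.filter_re,
                                            doc.name):
            continue
        # Skip reviewers already associated with the document
        # (author, shepherd, responsible WG chair, ...).
        if has_conflict(reviewer, doc):           # assumed helper
            continue
        # Honor "not available until" and "skip next N" settings.
        if reviewer.available and reviewer.available > now:
            continue
        if reviewer.skip_next > 0:
            reviewer.skip_next -= 1
            reviewer.save()
            continue
        # Honor the reviewer's frequency: at most one assignment
        # every reviewer.frequency days.
        last = last_assignment(reviewer)          # assumed helper
        if last and now - last.time < timedelta(
                days=reviewer.frequency):
            continue
        return reviewer
    return None
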
Some teams operate in more of a review-on-demand model. RTG-dir, for
example, primarily initiates reviews at the request of a Routing AD.
They may also start an early review at the request of a working group
chair. In either case, the reviewers are chosen manually from the
pool of available reviewers, driven by context rather than using a
round-robin ordering.
The issued assignments are either sent to the review team's email
list or are emailed directly to the assigned reviewer. The
assignments are reflected in the tool. For those teams handling
different types of reviews (Last Call vs Telechat for example), the
secretary typically processes the documents for each type of review
separately, and potentially with different assignment criteria. In
Gen-ART, for example, the Last Call reviewer for a document will
almost always get the follow-up Telechat review assignment.
Similarly, Secdir assigns any re-reviews of a document to the same
reviewer. Other teams may choose to assign a different reviewer.
Reviewers discover their assignments through email or by looking at
their queue in the tool. The secretaries for some teams (such as
OPS-dir and RTG-dir) insulate their team members from using the tool
directly. These reviewers only work through the review team's email
list or through direct email. On teams that have the reviewers use
the tool directly, most reviewers only check the tool when they see
they have an assignment via the team's email list. A reviewer has
the opportunity to reject the assignment for any reason. While the
tool provides a way to reject assignments, reviewers typically use
email to coordinate rejections with the team secretary. The
secretary will find another volunteer for any rejected assignments.
The reviewer can indicate that the assignment is accepted in the tool
before starting the review, but this feature is rarely used.
The reviewer sends a completed review to the team's email list or
secretary, any other lists relevant to the review, and usually the
draft's primary email alias. For instance, many last call reviews
are also sent to the IETF general list. The teams typically have a
template format for the review. Those templates usually start with a
summary, describing the conclusion of the review. Typical summaries
are "Ready for publication" or "On the right track, but has open
issues". The reviewer (or in the case of teams that insulate their
reviewers, the secretary) uses the tool to indicate that the review
is complete, provides the summary, and has an opportunity to provide
a link to the review in the archives. Note, however, that having to
wait for the review to appear in the archive before the link can be
pasted into the tool is a significant enough impediment that
reviewers often omit this link. The Secdir secretary manually
collects these links from the team's email list and adds them to the
tool.
Occasionally, a document is revised between when a review assignment
is made and when the reviewer starts the review. Different teams can
have different policies about whether the reviewer should review the
assigned version or the current version.
3. Requirements
3.1. Secretariat focused
o The Secretariat must be able to configure secretaries and
reviewers for review teams (by managing Role records).
o The Secretariat must be able to perform any secretary action on
behalf of a review team secretary (and thus, must be able to
perform any reviewer action on behalf of the reviewer).
3.2. Review-team Secretary focused
o A secretary must be able to see what documents are ready for
review of a given type (such as a Last Call review).
o A secretary must be able to assign reviews for documents that may
not have been automatically identified as ready for a review of a
given type. (In addition to being the primary assignment method
for teams that only initiate reviews on demand, this allows the
secretary to work around errors and handle special cases,
including early review requests.)
o A secretary must be able to work on and issue a set of assignments
as an atomic unit. No assignment should be issued until the
secretary declares the set of assignments complete. (See the
sketch after this list.)
o The tool must support teams that have multiple secretaries. The
tool should warn secretaries who are simultaneously working on
assignments, and protect against conflicting assignments being
made.
o It must be easy for the secretary to discover that more documents
have become ready for review while working on an assignment set.
o The tool should make preparing the assignment email to the team's
email list easy. For instance, the tool could prepare the
message, give the secretary an opportunity to edit it, and handle
sending it to the team's email list.
o It must be possible for a secretary to indicate that the review
team will not provide a review for a document (or a given version
of a document). This indication should be taken into account when
presenting the documents that are ready for review of a given
type. This will also make it possible to show on a document's
page that no review is expected from this team.
o A secretary must be able to easily see who the next available
reviewers are, in order.
o A secretary must be able to edit a Reviewer's availability in
terms of frequency, not-available-until-date, and skip-next-
n-assignments. (See the description of these settings in the
Reviewer focused section.)
o The tool should make it easy for the secretary to see any team
members that have requested to review a given document when it
becomes available for review.
o The tool should make it easy for the secretary to identify that a
reviewer is already involved with a document. The current tool
allows the secretary to provide a regular expression to match
against the document name. If the expression matches, the
document is not available for assignment to this reviewer. For
example, Tero will not be assigned documents matching '^draft-
(kivinen|ietf-tcpinc)-.*$'. The tool should also take any roles,
such as document shepherd, that the Datatracker knows about into
consideration.
o The tool should make it easy for the secretary to see key features
of a document ready for assignment, such as its length, its
authors, the group and area it is associated with, its title and
abstract, its states (such as IESG or WG states) and any other
personnel (such as the shepherd and reviewers already assigned
from other teams) involved in the draft.
o The tool must make it easy for the secretary to detect and process
re-review requests on the same version of a document (such as when
a document has an additional last call only to deal with new IPR
information).
o Common operations on groups of documents should be easy for the
secretary to process as a group with a minimum amount of
interaction with the tool. For instance, it should be possible to
process all of the documents described by the immediately
preceding bullet with one action. Similarly, for teams that
assign re-reviews to the same reviewer, issuing all re-review
requests should be a simple action.
o A secretary must be able to see which reviewers have outstanding
assignments.
o The tool must make it easy for the secretary to see the result of
previous reviews from this team for a given document. In Secdir,
for example, if the request is for a revision that has only minor
differences, and the previous review result was "Ready", a new
assignment will not be made. If the given document replaces one
or more other prior documents, the tool must make it easy for the
secretary to see the results of previous reviews of the replaced
documents.
o The tool must make it easy for the secretary to see the result of
previous reviews from this team for all documents across
configurable recent periods of time (such as the last 12 months).
An RTG-dir secretary, for example, would use this result to aid in
the manual selection of the next reviewer.
o The tool must make it easy for the secretary to see the recent
performance of a reviewer while making an assignment (see
Section 3.5). This allows the secretary to detect overburdened or
unresponsive volunteers earlier in the process.
o A secretary must be able to configure the tool to remind them to
follow up when actions are due. (For instance, a secretary could
receive email when a review is about to become overdue).
o A secretary must be able to assign multiple reviewers to a given
draft at any time. In particular, a secretary must be able to
assign an additional reviewer when an original reviewer indicates
their review is likely to be only partially complete.
o A secretary must be able to withdraw a review assignment.
o A secretary must be able to perform any reviewer action on behalf
of the reviewer.
o A secretary must be able to configure the review-team's set of
reviewers (by managing Role records for the team).
o Information about a reviewer must not be lost when a reviewer is
removed from a team. (Frequently, reviewers come back to teams
later.)
o A secretary must be able to delegate secretary capabilities in the
tool (similar to how a working group chair can assign a Delegate).
This allows review teams to self-manage secretary vacations.
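
As a non-normative sketch of the atomic assignment-set requirement
above, the batch could be saved in a single database transaction,
with notifications deferred until the whole set has been stored.
The notify_assignments() helper is an assumption, as is the shape
of the assignments argument.

# Non-normative sketch: issue a batch of assignments atomically and
# notify reviewers only after the whole batch has been saved.
from django.db import transaction

def issue_assignment_set(team, assignments):
    """assignments: a list of (ReviewRequest, Reviewer) pairs."""
    with transaction.atomic():
        for request, reviewer in assignments:
            request.reviewer = reviewer
            request.save()
    # No email goes out until every assignment has been stored.
    notify_assignments(team, assignments)         # assumed helper
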
3.3. Reviewer focused
o A reviewer must be able to indicate availability, both in
frequency of reviews, and as "not available until this date." The
current tool speaks of frequency in these terms:
- Assign at maximum one new review per week
- Assign at maximum one new review per fortnight
- Assign at maximum one new review per month
- Assign at maximum one new review per two months
- Assign at maximum one new review per quarter
o Reviewers must be able to indicate hiatus periods. Each period
may be either "soft" or "hard".
- A hiatus must have a start date. It may have an end date, or
may be indefinite.
- During a hiatus, the reviewer will not be included in the
normal review rotation. When a provided end date is reached,
the reviewer will automatically be included in the rotation in
their usual order.
- During a "soft" hiatus, the reviewer must not be assigned new
reviews, but is expected to complete existing assignments and
do followup reviews.
- During a "hard" hiatus, the reviewer must not be assigned any
new reviews, and the secretary must be prompted to reassign any
outstanding or followup reviews.
o Reviewers must be able to indicate that they should be skipped the
next n times they would normally have received an assignment.
o Reviewers must be able to indicate that they are transitioning to
inactive, providing a date for the end of the transition period.
During this transition time, the reviewer must not be assigned new
reviews, but is expected to complete outstanding assignments and
followup reviews. At the end of the transition period, the
secretary must be prompted to reassign any outstanding or followup
reviews. (This allows review-team members that are taking on,
say, AD responsibility to transition gracefully to an inactive
state for the team).
o Both the reviewer and the secretary will be notified by email of
any modifications to a reviewer's availability.
o A reviewer must be able to easily discover new review assignments.
(The tool might send email directly to an assigned reviewer in
addition to sending the set of assignments to the team's email
list. The tool might also use the Django Message framework to let
a reviewer that's logged into the Datatracker know a new review
assignment has been made.)
o Reviewers must be able to see their current set of outstanding
assignments, completed assignments, and rejected assignments. The
presentation of those sets should either be separate, or, if
combined, the sets should be visually distinct.
o A reviewer should be able to request to review a particular
document. The draft may be in any state - available and
unassigned; already assigned to another reviewer; or not yet
available.
o A reviewer must be able to reject a review assignment, optionally
providing the secretary with an explanation for the rejection.
The tool will notify the secretary of the rejection by email.
o A reviewer must be able to indicate that they have accepted and
are working on an assignment.
o A reviewer must be able to indicate that a review is only
partially completed, asking the secretary to assign an additional
reviewer. The tool will notify the secretary of this condition by
email.
o It should be possible for a reviewer to reject or accept a review
either by using the tool's web interface, or by replying to the
review assignment email.
o It must be easy for a reviewer to see when each assigned review is
due.
o A reviewer must be able to configure the tool to remind them when
actions are due. (For instance, a reviewer could receive email
when a review is about to become overdue).
o A reviewer must be able to indicate that a review is complete,
capturing where the review is in the archives, and the high-level
review-result summary. (See the sketch after this list.)
o It must be possible for a reviewer to clearly indicate which
version of a document was reviewed. Documents are sometimes
revised between when a review was assigned and when it is due.
The tool should note the current version of the document, and
highlight when the review is not for the current version.
o It must be easy for a reviewer to submit a completed review.
- The current workflow, where the reviewer sends email to the
team's email list (possibly copying other lists) and then
indicates where to find that review must continue to be
supported. The tool should make it easier to capture the link
to the review in the team's email list archives (perhaps by
suggesting links based on a search into the archives).
- The tool should allow the reviewer to enter the review into the
tool via a web form (either as directly-provided text, or
through a file-upload mechanism). The tool will ensure the
review is posted to the appropriate lists, and will construct
the links to those posts in the archives.
- The tool could also allow the reviewer to submit the review to
the tool by email (perhaps by replying to the assignment). The
tool would then ensure the review is posted to the appropriate
lists.
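
As a non-normative sketch of the completion and version-tracking
requirements above, a completed review might be recorded against
the Appendix A models roughly as follows. The completed_state()
and warn_secretary() helpers are assumptions.

# Non-normative sketch: record a completed review and flag reviews
# made against a revision that is no longer current.
def complete_review(request, review_doc, reviewed_rev, result):
    """review_doc: the Document holding the posted review text."""
    request.review = review_doc
    request.reviewed_rev = reviewed_rev
    request.result = result
    request.state = completed_state()             # assumed helper
    request.save()
    # Highlight reviews that are not for the current revision.
    if reviewed_rev != request.doc.rev:
        warn_secretary(request)                   # assumed helper
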
3.4. Review Requester and Consumer focused
o It should be easy for an AD or group chair to request any type of
review, but particularly an early review, from a review team.
o It should be possible for that person to withdraw a review
request.
o It must be easy to find all reviews of a document when looking at
the document's main page in the Datatracker. The reference to the
review must make it easy to see any responses to the review on the
email lists it was sent to. If a document "replaces" one or more
other documents, reviews of the replaced documents should be
included in the results.
o It must be easy to find all reviews of a document when looking at
search result pages, and other lists of documents such as the
documents on an IESG telechat agenda.
3.5. Statistics focused
o It must be easy to see the following, across all teams, a given
team, or a given reviewer, and independently across all time, or
across configurable recent periods of time:
- How many reviews have been completed
- How many reviews are in progress
- How many in-progress reviews are late
- How many completed reviews were late
- How many reviews were not completed at all
- Average time to complete reviews (from assignment to
completion)
o It must be easy to see, for all teams, for a given team, or for a
given reviewer, across all time, or across configurable recent
periods:
- Total counts of reviews in each review state (done, rejected,
etc.)
- Total counts of completed reviews by result (ready, ready with
nits, etc.)
o The above statistics should also be calculated reflecting the size
of the documents being reviewed (such as using the number of pages
or words in the documents).
o Where applicable, statistics should take reviewer hiatus periods
into account.
o Access to the above statistics must be easy to configure. Access
will be initially limited as follows:
- The secretariat and ADs can see any statistic.
- A team secretary can see any statistics for that team.
- A reviewer can see any team aggregate statistics, or their own
reviewer-specific statistics.
o Where possible, the above statistics should be visible as a time-
series graph.
o The implementation should anticipate future enhancements that
would allow ADs to indicate their position was informed by a given
review. Such enhancements would allow reporting correlations
between reviews and documents that receive one or more discusses.
However, implementing these enhancements is not part of the
current project.
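
As a non-normative sketch, the per-state and per-result counts
above map onto Django aggregation over the ReviewRequest model from
Appendix A. Note that computing lateness of completed reviews would
additionally require a completion timestamp, which the starting-
point models do not yet record.

# Non-normative sketch: per-state and per-result counts for one team
# over a recent period, using the Appendix A ReviewRequest model.
from django.db.models import Count

def team_counts(team, since):
    requests = ReviewRequest.objects.filter(team=team,
                                            time__gte=since)
    by_state = requests.values('state__name').annotate(n=Count('pk'))
    by_result = (requests.values('result__name')
                 .annotate(n=Count('pk')))
    return by_state, by_result
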
4. Security Considerations
This document discusses requirements for tools that assist review
teams. These requirements do not affect the security of the Internet
in any significant fashion. The tools themselves have authentication
and authorization considerations (team secretaries will be able to do
different things than reviewers).
5. IANA Considerations
This document has no actions for IANA.
6. Acknowledgments
Tero Kivinen and Henrik Levkowetz were instrumental in forming this
set of requirements and in developing the initial Django models in
the appendix.
The following people provided reviews of this draft: David Black,
Deborah Brungard, Brian Carpenter, Elwyn Davies, Stephen Farrell,
Joel Halpern, Jonathan Hardwick, Russ Housley, Barry Leiba, Jean
Mahoney, Randy Presuhn, Gunter Van De Velde, and Martin Vigoureux.
(If we have missed a reviewer here, or failed to capture or respond
to a review comment, please retransmit and accept our apologies.)
7. Changelog
7.1. 03
o Fixed frequent typo haitus -> hiatus
7.2. 02
o Clarified typical telechat review workflow for secdir in the
introduction.
o Added MIB and YANG doctor teams.
o Captured the requirement to be able to request a review assignment
for a given document.
o Captured a requirement to gracefully handle teams that have
multiple secretaries.
o Captured that statistics should reflect document size in addition
to document count.
o Noted that assignment rejection is currently coordinated by
email.
o Noted that teams often copy the draft's primary email alias on
reviews.
o Noted that hiatus periods should be carefully considered when
building statistics.
o Highlighted a couple of scenarios where the tool needs to send
email to the secretary.
7.3. 01
o Captured that OPS-dir reviewers do not use the tool directly -
only the secretaries do.
o Secretaries must be able to see who has outstanding reviews.
o Reviewers must be able to see when assignments are due.
o Captured that RTG-dir reviews documents primarily only at the
specific request of Routing ADs.
o Captured that RTG-dir QA-reviews can be requested by chairs.
o Captured that RTG-dir assignments are made by unicast, rather than
through a directorate list.
o Added a requirement to be able to say "this team isn't going to
review this (version of this) document".
o Captured that the secretariat should be able to act on behalf of
secretaries.
o Noted that reviewer information must not be lost as reviewers
leave teams.
o Captured the notion of a hiatus.
o Captured the notion of transition to inactive.
o Secretaries that want reminders should be able to configure the
tool to prompt them as due dates arrive.
o Reviewers that want reminders should be able to configure the tool
to prompt them as due dates arrive.
o Access to statistics for individuals should initially be limited
until we have some experience with them.
o Clarified the scope of trying to correlate reviews to discusses,
recognizing this is mostly future work.
o Captured that secretaries and reviewers need to be able to handle
the edge cases when a review needs to be reassigned or an
additional reviewer needs to be assigned after the initial
assignee has started.
7.4. 00
o Initial Version
8. Informative References
[AppsDir] "Applications Directorate", Work in Progress, April 2015,
<http://trac.tools.ietf.org/area/app/trac/wiki/
AppsDirReview>.
[art-trac]
"Area Review Team Tool - Active Tickets", Work in Progress
, April 2015, <https://wiki.tools.ietf.org/tools/art/trac/
report/1>.
[Gen-ART] "General Area Review Team Guidelines", Work in Progress,
April 2015,
<http://trac.tools.ietf.org/area/gen/trac/wiki>.
[MIBdoctors]
"MIB Doctors", Work in Progress , April 2015,
<http://www.ietf.org/iesg/directorate/mib-doctors.html>.
[OPS-dir] "OPS Directorate", Work in Progress, April 2015,
<https://svn.tools.ietf.org/area/ops/trac/wiki/
Directorates>.
[RFC6385] Barnes, M., Doria, A., Alvestrand, H., and B. Carpenter,
"General Area Review Team (Gen-ART) Experiences", RFC
6385, DOI 10.17487/RFC6385, October 2011,
<http://www.rfc-editor.org/info/rfc6385>.
[RTG-dir] "Routing Directorate", Work in Progress, April 2015,
<http://trac.tools.ietf.org/area/rtg/trac/wiki/RtgDir>.
[Secdir] "Security Directorate", Work in Progress, April 2015,
<http://trac.tools.ietf.org/area/sec/trac/wiki/
SecurityDirectorate>.
[YANGdoctors]
"YANG Doctors", Work in Progress , April 2015,
<http://www.ietf.org/iesg/directorate/yang-doctors.html>.
Appendix A. A starting point for Django models supporting the review
tool
from django.db import models
from ietf.doc.models import Document
from ietf.person.models import Email
from ietf.group.models import Group, Role
from ietf.name.models import NameModel
class ReviewRequestStateName(NameModel):
    """ Requested, Accepted, Rejected, Withdrawn, Overtaken By Events,
        No Response, Completed """

class ReviewTypeName(NameModel):
    """ Early Review, Last Call, Telechat """

class ReviewResultName(NameModel):
    """Almost ready, Has issues, Has nits, Not Ready,
       On the right track, Ready, Ready with issues,
       Ready with nits, Serious Issues"""

class Reviewer(models.Model):
    """
    These records associate a reviewer with a review team and keep
    track of admin data associated with the reviewer in that team.
    There will be one record for each combination of reviewer and
    team.
    """
    role = models.ForeignKey(Role)
    frequency = models.IntegerField(help_text=
        "Can review every N days")
    available = models.DateTimeField(blank=True, null=True, help_text=
        "When will this reviewer be available again")
    # CharField requires a max_length; 255 is an arbitrary choice
    filter_re = models.CharField(max_length=255, blank=True)
    skip_next = models.IntegerField(help_text=
        "Skip the next N review assignments")

class ReviewResultSet(models.Model):
    """
    This table provides a way to point out a set of ReviewResultName
    entries which are valid for a given team, in order to be able to
    limit the result choices that can be set for a given review, as a
    function of which team it is related to.
    """
    team = models.ForeignKey(Group)
    valid = models.ManyToManyField(ReviewResultName)

class ReviewRequest(models.Model):
    """
    There should be one ReviewRequest entered for each combination of
    document, rev, and reviewer.
    """
    # Fields filled in on the initial record creation:
    time = models.DateTimeField(auto_now_add=True)
    type = models.ForeignKey(ReviewTypeName)
    doc = models.ForeignKey(Document,
                            related_name='review_request_set')
    team = models.ForeignKey(Group)
    deadline = models.DateTimeField()
    requested_rev = models.CharField(verbose_name="requested_revision",
                                     max_length=16, blank=True)
    state = models.ForeignKey(ReviewRequestStateName)
    # Fields filled in as reviewer is assigned, and as the review
    # is uploaded
    reviewer = models.ForeignKey(Reviewer, null=True, blank=True)
    review = models.OneToOneField(Document, null=True,
                                  blank=True)
    reviewed_rev = models.CharField(verbose_name="reviewed_revision",
                                    max_length=16, blank=True)
    result = models.ForeignKey(ReviewResultName,
                               null=True, blank=True)
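
For illustration, a secretary's views of unassigned requests and
outstanding assignments might query these models along the
following lines. The state slugs used ("requested" and "accepted")
are assumptions about how the name table would be populated.

# Non-normative usage sketch for the models above; the state slugs
# are assumptions about how ReviewRequestStateName is populated.
def unassigned_requests(team):
    """Review requests that do not yet have a reviewer."""
    return ReviewRequest.objects.filter(
        team=team, reviewer=None,
        state__slug="requested").order_by("deadline")

def outstanding_assignments(team):
    """Assignments made but not yet completed, rejected, etc."""
    return ReviewRequest.objects.filter(
        team=team, reviewer__isnull=False,
        state__slug__in=["requested", "accepted"]).order_by("deadline")
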
Appendix B. Suggested features deferred for future work
Brian Carpenter suggested a set of author/editor-focused requirements
that were deferred for another iteration of improvement. These
include providing a way for the editors to acknowledge receipt of the
review, potentially tracking the email conversation between the
reviewer and document editor, and indicating which review topics the
editor believes a new revision addresses.
Authors' Addresses
Robert Sparks
Oracle
7460 Warren Parkway
Suite 300
Frisco, Texas 75034
USA
Email: rjsparks@nostrum.com
Tero Kivinen
INSIDE Secure
Eerikinkatu 28
HELSINKI FI-00180
FI
Email: kivinen@iki.fi