Preliminary Staff Report on Evaluation of the Proposals for Reassignment of the .Org Registry

Posted: 19 August 2002

Summary

This Preliminary ICANN Staff Report documents the results of the solicitation and evaluation process used to assist the ICANN Board in selecting an entity to assume responsibility for operating the .org registry, starting 1 January 2003, from the current operator of that registry, VeriSign, Inc. It is now posted for comment by the bidders and the community prior to finalization and submission to the ICANN Board for its decision. Public comments on this draft report should be submitted by e-mail to org-eval@icann.org on or before 29 August 2002. A final version of this Staff Report, taking into account comments received, will be posted on 5 September 2002, and comments will also be invited on that final version.

Eleven strong proposals were received in response to the Request for Proposals issued by ICANN on 20 May 2002. Each bidder clearly took the task very seriously and invested considerable effort in proposal preparation and submission. ICANN is very grateful for both the interest demonstrated and the effort and resources expended.

Unfortunately, only one bidder can be successful. ICANN owes it to those who submitted bids, therefore, to conduct as fair, thorough, impartial, open and transparent a process as is reasonably possible. Every effort has been made in the solicitation and evaluation processes to ensure that is the case. No process can be perfect – there is always room for improvement and, indeed, a longer timescale might have produced more bids or more complete bids, and a more detailed evaluation. However, we have great confidence in both the bid process and in the evaluation methodology and results. We believe the evaluation process has been fair and impartial, and that more time for evaluation or a different approach would not have led to a different conclusion.

Because the selection criteria span a variety of subjects, the evaluation was conducted using a multi-team approach. The teams were:

  • Gartner, Inc. performed an evaluation of technical aspects of the bids, which were identified in the published criteria (based on Board comments at its Accra meeting) as of primary importance. Gartner is an internationally recognized leader in the field of information technology consulting.
  • An international team of Chief Information Officers from major academic institutions performed an independent evaluation of those technical aspects using a different methodology.
  • The Non-Commercial Domain Name Holders Constituency (NCDNHC) of ICANN's DNSO evaluated "usage" aspects (as detailed below) of the bids.
  • ICANN's General Counsel evaluated certain procedural aspects.

These evaluators – many of whom graciously volunteered their time – worked very hard and under very tight deadlines to produce their evaluations. ICANN appreciates their efforts.

This report summarizes the results of the evaluation. Each of the three teams evaluating the technical and usage aspects produced three-tier rankings of the bids. Only one of the proposals – that submitted by the Internet Society (ISOC) – was accorded top-tier ranking by all three teams. Based on that fact, and consideration of procedural aspects, ICANN staff's preliminary (subject to public comment) recommendation is that the proposal submitted by the Internet Society (ISOC) be selected.

Background

The bid solicitation arose from the revisions to the agreements among ICANN, VeriSign, and the U.S. Department of Commerce that were approved by the ICANN Board at its meeting on 2 April 2001 and signed in May 2001. One provision of those agreements was that VeriSign would relinquish responsibility for operating the .org registry to an entity selected by ICANN at the end of calendar year 2002. As part of that provision, VeriSign also agreed to provide a US$5M endowment to be used to fund future operating costs of the successor registry operator, provided it is a non-profit entity.

In response to this provision, ICANN launched an open and transparent bid solicitation and evaluation process that was announced on 22 April 2002. Full details of this announcement, of the subsequent steps followed, and of the bids received can be found at the ICANN website in the Materials on .Org Reassignment. The bid solicitation was authorized by the ICANN Board at its meeting in Accra, Ghana in March 2002, following a report and recommendations the Board had received from the Domain Name Supporting Organization (DNSO). At that meeting, the ICANN Board stated that primacy of consideration should be given to stability of transition and operation of the .org registry, so that there would be no service interruptions in the .org registry.

As a result of the final Request for Proposals issued by ICANN on 20 May 2002, eleven proposals were received on or before the 18 June 2002 deadline. These and all subsequent materials received from the bidders are posted on the ICANN website at http://www.icann.org/tlds/org/. Open and transparent procedures were maintained throughout the evaluation process. Bidders were requested to communicate with ICANN staff or Board members only in writing, and any materials received from any of the bidders were posted on the website. The process allowed prospective bidders to submit written questions prior to the final submission of bids; all questions and answers were posted on the website. All bidders were also invited to make a brief presentation on their bids at the special ICANN Public Forum held for the purpose on 26 June 2002 in Bucharest, Romania.

This report summarizes the process used to evaluate the bids received and presents the resulting staff recommendation to the Board.

The Bidders

Eleven bids were received. Six were from not-for-profit organizations, most of which had obtained the commitment of other operating registries ("back-end operators") to operate the .org registry on the bidder's behalf should the bidder be successful; in such arrangements the bidder would retain overall policy direction and community interface, so that overall responsibility and accountability would remain with the bidder. All six not-for-profit bidders and one for-profit bidder plan to seek the US$5M endowment to be provided by VeriSign to assist with operating costs. One of the six (UIA) proposed to employ the services of the VeriSign registry should UIA be awarded the reassignment.

The eleven proposals, in alphabetical order, and, where applicable, their associated primary partners, including "back-end" registry operators, are as follows:

Proposal                   Primary Partner(s)
-----------------------    ----------------------------------------------------------------
.Org Foundation            eNom Inc.
Dot Org Foundation         Registry Advantage; Kintera Inc.
GNR                        International Federation of Red Cross and Red Crescent Societies
IMS/ISC                    N/A
Internet Society (ISOC)    Afilias and Auxiliary Service Providers
NeuStar                    N/A
Organic Names              CentralNic Limited
RegisterOrg                Register.com
Switch                     Auxiliary Service Providers
UIA                        VeriSign
Unity Registry             Poptel Limited; AusRegistry Pty Ltd.
Table 1: Proposal Submissions and Associated Primary Partners

Evaluation Process

The RFP stated eleven criteria that would be used in assessing the proposals. These are listed here for ease of reference:

    1. Need to preserve a stable, well-functioning .org registry.
    2. Ability to comply with ICANN-developed policies.
    3. Enhancement of competition for registration services.
    4. Differentiation of the .org TLD from TLDs intended for commercial purposes.
    5. Inclusion of mechanisms for promoting the registry's operation in a manner that is responsive to the needs, concerns, and views of the noncommercial Internet user community.
    6. Level of support for the proposal from .org registrants.
    7. The type, quality, and cost of the registry services proposed.
    8. Ability and commitment to support, function in, and adapt protocol changes in the shared registry system.
    9. Transition considerations.
    10. Ability to meet and commitment to comply with the qualification and use requirements of the VeriSign endowment and proposed use of the endowment.
    11. The completeness of the proposals submitted and the extent to which they demonstrate realistic plans and sound analysis.

Criteria 1, 7, 8, and 9 are primarily technical in nature and are referenced here as Technical Criteria. Criteria 4, 5 and 6 are focused on the extent to which the bidders address the needs of non-commercial registrants consistent with the primary purposes of the .org registry; these are referenced here as Usage Criteria. Criteria 2, 3, and 10 are primarily procedural in nature and are designated as Procedural Criteria. Criterion 11 is in a category of its own and is addressed separately.

The decision was made to select different teams to evaluate each set of criteria, since different expertise was required in each case. In fact, as described below, two Technical Evaluation Teams were selected that operated independently and without knowledge of each other in order to lend confidence to the final results. For reasons described below, ICANN's General Counsel evaluated the Procedural Criteria. Criterion 11 was also assigned to one of the Technical Evaluation teams for reasons described below.

The reports of the evaluation teams can be viewed at the links below:

Gartner, Inc. Evaluation Report (technical aspects)
Academic CIO Evaluation Report (technical aspects)
NCDNHC Evaluation Report (usage aspects)
ICANN General Counsel Evaluation Report (procedural aspects)

ICANN staff analyzed these reports and synthesized the results into the final staff recommendation to the ICANN Board as presented in this document. During the evaluation process, staff were available to answer the evaluation teams' questions and to clarify terms, definitions, and any issues associated with the RFP. Staff also addressed issues regarding potential conflicts of interest that arose in one or two cases (in fact, there were no actual conflicts of interest).

An attempt was also made to create an international panel of experts who could provide e-mail answers to technical questions regarding registry operations. This proved impractical in the short timeframe available, particularly given the small pool of large registry operators who were not themselves part of one or more of the proposals (and also given the influence of summer vacations!).

The proposals were all required to follow a particular format to ease comparison and evaluation. Section C of the responses consisted of answers to 36 questions (several with subparts) intended to elicit information regarding how well the bidder met one or more of the criteria. Staff also provided to the evaluation teams a mapping of these so-called "C-questions" onto the criteria; that is, for each criterion the mapping indicated the list of C-questions that should primarily be considered in the evaluation team's analysis. This mapping was intended to be helpful but by no means binding: an evaluation team could consider any part of each response, and related posted material, in reaching its conclusions.
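For illustration only, such a mapping can be thought of as a simple lookup table from criterion number to question numbers. The question numbers and helper below are hypothetical placeholders, not the actual mapping provided to the teams:

    # Hypothetical criterion-to-C-question mapping; the actual mapping
    # supplied to the evaluation teams used different question numbers.
    criterion_questions = {
        1: [17, 18, 22],   # stability of the .org registry
        7: [25, 26],       # type, quality, and cost of registry services
        9: [30, 31],       # transition considerations
    }

    def questions_for(criteria):
        """Collect the C-questions primarily relevant to a set of criteria."""
        return sorted({q for c in criteria for q in criterion_questions.get(c, [])})

    print(questions_for([1, 9]))  # -> [17, 18, 22, 30, 31]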

The Evaluation Teams

The processes used by each evaluation team are summarized below. Prior to the posting of this report, the identities of the teams and of their members were not made publicly available. This was to ensure, as far as possible, that each team could work quietly through its evaluation based on the posted materials, without any risk of being importuned by anyone with an interest in the outcome of the process.

Technical Evaluation Teams:

Two very different evaluation teams were selected to evaluate the proposals with respect to the Technical Criteria. Each was requested to follow its own approach to evaluation to ensure diverse evaluations, since there is no absolutely deterministic and failsafe way to conduct an evaluation. Each team operated independently and without knowledge of the other. Until the posting of this report, they have not seen each other's evaluations.

Two teams were chosen following different approaches essentially to provide a check and balance on the evaluation process. Consistency of responses from each team would lend confidence to the validity of the evaluations. Serious inconsistencies would raise questions about methodology that might require reexamination of the process and the results. This approach was felt to be especially important in the case of the technical evaluation, because of the weight the Board had clearly placed on the primacy of operational stability in the transfer of the .org registry, as was reflected in the published criteria for evaluation.

Both teams were instructed to rely only on posted materials, namely the RFP and associated materials, the proposals and associated materials, and (to the extent germane) posted community comments. That is, their work needed to be documentable with reference to the written word.

The teams and processes used were as follows:

Gartner, Inc.: Gartner, Inc. ("Gartner") is an internationally recognized consulting corporation that specializes in information and communication technologies. It also analyzes industry and technology trends and provides highly regarded reports to its customers. One particular area of Gartner specialization is procurement, where Gartner provides full services to its clients for all phases of the procurement process, from RFP development to final bidder selection and negotiations. For the .org reassignment, however, ICANN secured Gartner's services solely to assist in the evaluation phase.

In the work leading to its final report, Gartner used a traditional "weights and scores" methodology to analyze the proposals, supplemented by its own approach to assigning weights and by advisories to ICANN about particular items that, in Gartner's view, could be obscured by the weights-and-scores process. (This can occur, for example, if a particular bid – in spite of having a good overall final score in a particular category – provided an unacceptable response in an area that Gartner interpreted as critical according to the words of the RFP.) In the Gartner approach, the RFP technical criteria were analyzed and broken down into specific subcomponents.

In this process, weights were assigned to each subcomponent according to its importance, as documented in the RFP or other posted materials, to the overall goal of the particular criterion. Each subcomponent was separately analyzed and scored on a defined scale, leading to an overall numerical score for each proposal against each of the technical criteria. Gartner also combined the criteria into a single overall assessment, using its own judgment, based on its reading of the RFP, as to what weight to assign the result of each individual criterion. Gartner also summarized the strengths and weaknesses of each proposal.
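As a rough illustration of how such a weights-and-scores computation works, consider the minimal sketch below. The subcomponents, weights, and scores shown are hypothetical, not Gartner's actual figures:

    # Hypothetical weights-and-scores computation; all subcomponents,
    # weights, and scores below are illustrative, not Gartner's figures.
    # Within each criterion the subcomponent weights sum to 1.0, and
    # scores are on a 0-5 scale.
    criteria = {
        "stability (Criterion 1)": {
            "registry experience": (0.5, 4),   # (weight, score)
            "infrastructure":      (0.3, 5),
            "staffing":            (0.2, 3),
        },
        "transition (Criterion 9)": {
            "migration plan":      (0.6, 4),
            "rollback provisions": (0.4, 4),
        },
    }
    # Relative weight of each criterion in the single overall assessment.
    criterion_weights = {"stability (Criterion 1)": 0.7,
                         "transition (Criterion 9)": 0.3}

    def criterion_score(subs):
        """Weighted sum of subcomponent scores for one criterion."""
        return sum(w * s for (w, s) in subs.values())

    overall = sum(criterion_weights[c] * criterion_score(subs)
                  for c, subs in criteria.items())
    print(round(overall, 2))  # -> 4.07 for these illustrative numbers

The advisories mentioned above would then be reviewed alongside such numeric totals, since a strong overall score can mask an unacceptable response on a single critical item.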

Although not strictly a technical criterion, Gartner was also asked to assess Criterion 11, regarding the completeness of the proposals and the extent to which they demonstrate realistic plans and sound analysis. This was because a very high proportion of the C-questions were directed towards the technical criteria, and it made sense to obtain Gartner's views in this area.

Academic CIO Team: To provide a completely different perspective, a team was assembled composed primarily of CIOs (or individuals with relevant ICT administration experience) from major academic institutions in the U.S., Mexico, and Australia, with considerable experience in procurement and in information and communication technologies. The members of the team and their individual qualifications are listed in Appendix 1 and in the evaluation team's report. The team was chaired by Jim Dolgonas, formerly Assistant Vice President for Information Resources and Communications at the University of California Office of the President, and now Chief Operating Officer of CENIC, the Corporation for Education Network Initiatives in California.

This team followed a more condensed approach than Gartner. The team was asked to classify the proposals into three tiers: high, acceptable, and marginal. It adopted a modified "Delphi" approach, whereby each team member carefully read each proposal and reached a tentative individual conclusion regarding its classification according to the technical criteria. The team then met and worked through the individual evaluations until it reached a consensus on the classification of each proposal.

The team was not asked to provide a detailed analysis of the strengths and weaknesses of each proposal, but rather a brief report on its overall findings, together with a short definition of each classification category demonstrating the ground rules used.

From this perspective, the Academic CIO Team evaluation should be considered as a reasonableness check on the Gartner evaluation, not as a detailed evaluation in its own right.

Usage Evaluation Team:

The Usage Evaluation Team (an appellation bestowed in this report; the team was not given this name at the start of its work) was composed of individuals active in ICANN's Non-Commercial Domain Name Holders Constituency (NCDNHC). This all-volunteer team was assembled and co-chaired by Harold Feld and Milton Mueller. Members of the team are listed in Appendix 1 and in the team's report.

Many members of this team had considerable familiarity with the task at hand, having participated directly or indirectly in the DNSO task force on .org (or at least in the dialog surrounding its work) that led to the DNSO recommendation to the ICANN Board. In particular, many individuals on this team had previously and carefully considered many of the issues and concerns surrounding the relationship of the .org registry to the domain name aspirations of the non-commercial community (or communities).

Given this familiarity, it was not felt necessary to establish two evaluation teams in this area. The team, however, was asked that the evaluation of each proposal on each of the three criteria be based on documented, reasoned analysis. The report of the team speaks for itself very well in this regard.

The team used a combination of a weights-and-scores methodology and a ranking methodology to reach its conclusions, synthesizing its findings through two different approaches to arrive at overall scores for each proposal across all three criteria (the "average ranking" and "normalized ranking" columns of Table 2 below). The team acknowledges that any such quantitative approach is subject to imperfections, and prefers in its final recommendations to classify the proposals into three tiers, suggesting that ICANN not attempt to differentiate among proposals within each tier.
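One plausible reading of these two syntheses is sketched below. The proposals and scores are hypothetical, and the exact procedures are those defined in the team's own report:

    # Hypothetical synthesis of per-criterion results in two ways;
    # the NCDNHC report defines the actual procedures and data.
    # Raw scores (higher is better) on Criteria 4, 5, and 6.
    scores = {
        "Proposal X": [8.0, 9.0, 7.5],
        "Proposal Y": [9.0, 8.5, 7.0],
        "Proposal Z": [6.0, 7.0, 8.0],
    }
    N_CRITERIA = 3

    # Approach 1 ("average ranking"): rank the proposals on each
    # criterion (1 = best), then average each proposal's ranks.
    avg_rank = {p: 0.0 for p in scores}
    for i in range(N_CRITERIA):
        ordered = sorted(scores, key=lambda p: -scores[p][i])
        for rank, p in enumerate(ordered, start=1):
            avg_rank[p] += rank / N_CRITERIA

    # Approach 2 ("normalized ranking"): rescale each criterion's raw
    # scores to [0, 1] before averaging, so every criterion contributes
    # equally regardless of its native scale.
    norm = {p: 0.0 for p in scores}
    for i in range(N_CRITERIA):
        col = [scores[p][i] for p in scores]
        lo, hi = min(col), max(col)
        for p in scores:
            norm[p] += (scores[p][i] - lo) / (hi - lo) / N_CRITERIA

    print(avg_rank)  # lower is better
    print(norm)      # higher is better

When two such syntheses agree, as they largely do in Table 2, the tier assignments are less likely to be artifacts of any one aggregation method.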

Procedural Evaluation Team:

It is something of a misnomer to call this a "team", since it is simply an evaluation by ICANN's General Counsel of three of the criteria concerning matters that do not fit within the expertise of the other teams. Criterion 2 is largely a verification step that all bidders are expected to meet, so that it only needs to be applied to the proposal(s) likely to be selected based on the overall evaluation. Criterion 3 relates to enhancement of competition and is relevant mainly to bids involving relationships to the incumbent registry operator, and analysis of that criterion has been focused accordingly. Because criterion 10 is similarly relevant only to particular proposals (i.e. only those seeking to qualify for the VeriSign endowment), evaluation under that criterion has also been targeted.

The remaining criterion (#11) does require evaluation as part of the overall process. For the reasons indicated above, Gartner was asked to provide an evaluation of the proposals with respect to this criterion and has done so.

ICANN is also grateful to all the members of the evaluation teams listed in this report, who worked very hard under a very tight deadline to produce their recommendations. Many of these individuals are volunteers who graciously made their time available for this activity.

Evaluation Summary

The evaluation reports can be directly accessed and speak for themselves. Although each evaluation took a different approach, both the technical and the usage evaluations resulted in classifying each proposal into one of three overall tiers. The exact definitions of these tiers varied among the evaluations, as did the methodologies for integrating the applicable criteria. In Table 2, however, we summarize the proposals according to these tiers, using A, B, C as indicators for the categorization, while again emphasizing that these indicators represent different metrics for each evaluation and can only be interpreted in detail by reference to the reports themselves. In fact, two separate columns are presented for the Usage Evaluation representing the results of two different approaches used in that evaluation.

                           Technical Evaluation¹        Usage Evaluation²
Proposal                   Gartner Inc.  Academic CIO   Average Ranking  Normalized Ranking
.Org Foundation            C             C              C                C
Dot Org Foundation         A             C              B                C
GNR                        A             B              B                B
IMS/ISC                    C             C              A                B
Internet Society (ISOC)    A             A              A                A
NeuStar                    A             A              B                B
Organic Names              C             B              C                C
RegisterOrg                A             B              B                C
Switch                     C             C              C                C
UIA                        B             B              B                B
Unity Registry             B             C              A                A

Table 2: Summary of Rankings³ of the Evaluation Teams

Key to Table 2:

A: Ranked as top tier by evaluation team
B: Ranked as middle tier by evaluation team
C: Ranked as bottom tier by evaluation team

Notes to Table 2:

1. With respect to Criteria 1, 7, 8, 9
2. With respect to Criteria 4, 5, 6
3. See evaluation reports for separate definitions of A, B, C

Again, it must be emphasized that these designations cannot be readily combined into a single result for each proposal.

Conclusions and Recommendation

Notwithstanding the caution just expressed, it is apparent from Table 2 that only one proposal, namely the ISOC proposal, was ranked top-tier in every evaluation, according to each evaluation team's definition of that designation. That is the fundamental basis for the recommendation to the Board that the ISOC proposal be selected and that the Board authorize the President to proceed to negotiate an agreement based on that proposal.

However, further comments are in order. The ISOC proposal is also in the "A" category as evaluated by Gartner with respect to Criterion 11 addressing completeness of proposal and soundness of plans. ICANN Counsel has also validated that the ISOC proposal meets the requirements of Criterion 2 insofar as compliance with ICANN-developed policies is concerned: indeed, the ISOC "back-end" operator, Afilias, has already demonstrated this in its operation of the recently launched .info registry. Counsel has also opined that this proposal should qualify for the VeriSign endowment (Criterion 10).

The "back-end" operator contracted by ISOC, Afilias, is a consortium of eighteen gTLD registrars. VeriSign is a minority (5.6%) shareholder of Afilias as one of these registrars. Because the other Afilias shareholders are VeriSign's competitors, however, VeriSign's ability to exercise control over Afilias is effectively minimized and, indeed, no VeriSign employee has been elected to Afilias' Board of Trustees/Directors. In these circumstances, VeriSign's minority stake in Afilias does not materially implicate Criterion 3 (Enhancing Competition for Registry Services), particularly in view of the fact that the .org registry would be assigned to ISOC, not Afilias.

From Table 2, it can be seen that there are other strong proposals. The Board has stated that it gives primacy to continuing stability of operation of the .org registry, as indicated through demonstrated experience. This gives special weight to the technical evaluation of Criterion 1 (need to preserve a stable, well-functioning .org registry) and, to an extent, Criterion 9 (transition considerations); proposals that rank very high with respect to these criteria deserve correspondingly greater consideration. The Gartner evaluation ranks the NeuStar proposal highest in these categories, followed by ISOC and then UIA (Criterion 1) or GNR (Criterion 9). However, NeuStar, UIA, and GNR all rank in the "B" category of the Usage Evaluation, and the last two also fall in the "B" category of the Academic CIO Technical Evaluation. The UIA proposal also falls short of the other proposals with respect to Criterion 3, enhancement of competition. In considering the GNR proposal, it is also appropriate to recognize that its experience in operating a shared registry is demonstrably less than that of Afilias (ISOC), NeuStar, or VeriSign (UIA): the .name registry is an order of magnitude smaller than the .info (Afilias), .biz/.us (NeuStar), or certainly any of the VeriSign-operated gTLD registries. Furthermore, as described in the General Counsel's evaluation of Criterion 3, GNR's shared (real-time) mode of operating the .name registry has used a VeriSign "back-end", that is, not the technology that GNR proposes to use for .org.

Nevertheless, should the Board approve this recommendation and it then proves impossible to negotiate an agreement with ISOC, further consideration can be given to the NeuStar and GNR proposals in particular (the UIA proposal being disfavored because of the Criterion 3 competition issue noted above).

On the other hand, the Unity proposal ranks at the top of the Usage Evaluation, and the IMS/ISC proposal ranks second under one of that evaluation's two approaches. Both of these proposals, however, fall short in the Technical Evaluations, although the Unity proposal does fall in the "B" category of the Gartner Technical Evaluation.

In summary, although there are several strong proposals among the eleven submitted, the staff's view is that the ISOC proposal is the strongest and most balanced proposal overall. We recommend that the Board authorize the President to enter into negotiations immediately on a schedule that would allow Board approval of the negotiated agreement and ultimate transition of the registry so that ISOC could commence operations on 1 January 2003.

Should negotiations with ISOC fail in the allowed timeframe, staff recommends that the President be authorized to enter into negotiations with NeuStar and then GNR, in that order. NeuStar's and GNR's rankings could be perceived as about equal overall, but weight should be given to NeuStar's greater experience in actually and successfully operating a large registry, and to its having completed a transition of an actual registry (.us) from VeriSign.

We conclude this report by thanking the many individuals and institutions that worked so hard, and expended significant resources, in formulating and submitting proposals, and in evaluating them. A competitive process such as this can have only one successful award, but that result should not detract from the overall effort and thoroughness of all the submissions.


Appendix 1: Evaluation Teams

Technical Evaluation Teams

Gartner, Inc.

  • Mark Gilbert - Director - Engagement Manager
  • Jamshid Lal - Senior Consultant - Project Manager

Academic CIO Evaluation Team

  • Geoffrey Dengate, Griffith University, Australia
  • James Dolgonas, Corporation for Education Network Initiatives in California, USA
  • Lev Gonick, Case Western Reserve University, USA
  • Anne Stunden, University of Wisconsin, USA
  • Juan Voutssas, National Autonomous University of Mexico

Usage Evaluation Team (from NCDNHC)

  • Mr. Thierry Amoussougbo, Benin
  • Mr. Harold Feld, USA
  • Mr. Eric Iriarte, Peru
  • Mr. Milton Mueller, USA
  • Ms. Youn Jun Park, Republic of Korea
  • Mr. Ermanno Pietrosemoli, Venezuela
  • Mr. Marc Schneider, Germany
  • Mr. Dany Vandromme, France
