
New gTLD Brand Congress Emphasizes Consumers and Innovation

CircleID posts - Wed, 2013-03-13 20:58

The New gTLD Brand Congress held earlier this week in New York provided terrific insight into how brands and New gTLD businesses are approaching the space. We saw evidence of forward movement and decision making. Overall, the main takeaways were:

• Established brands need to focus on enhancing customer experience in a new .brand world; and

• New gTLD businesses or registries will find security/stability to be critical for gaining public trust and connecting with representative communities.

Big Brand Decision Making Case Study – CITIBANK (.citi)

Mr. Louis Cohen, Citibank SVP of Internet and Digital Marketing, identified the critical questions that informed Citibank's decision making. The first question was "what extensions might be relevant?" Because Citibank has a wide scope of financial products/services, the Citibank team came up with hundreds of potential strings to consider, including .fin, .bank, .mortgage, .citibank, .creditcard etc. Next, Citibank considered how it might use a new string, and whether that use would enhance Citibank's customers' experience. Lastly, Citibank looked at whether its competitors would gain an advantage by pursuing certain strings and whether there were any "risks" to the Citibank brands. According to Mr. Cohen, Citibank "spent months" thinking through these questions.

Ultimately, the value proposition for Citibank, according to Mr. Cohen, was to work toward creating a better consumer experience. The result will be taking Citibank's very large and diverse digital footprint and consolidating it into one .citi footprint that will be simpler for consumers and easier to protect from a security standpoint. "Citibank sees opportunity in consolidating its digital footprint into something only we can control," which will allow it to unite its global internet presence, "making it easier across the board," stated Mr. Cohen.

Mr. Cohen noted that Citibank would still be active in the .com/.net world and that it would have considerable work to do on the SEO front in order to make sure it adapts its content to be searchable in a New gTLD world and an increasingly "mobile first environment." Citibank is also considering the possibility of giving its customers their own .citi email address and contemplating how to treat 2nd level domains in a .citi TLD.

Donuts – A Data Driven Process

Mr. Richard Tindal, COO of Donuts, also spoke at the Congress and discussed the evolution of Donuts and its business plan. Donuts has garnered a lot of press because it applied for 307 New gTLD strings.

Mr. Tindal explained that the Donuts process of picking strings was "intensely data driven." They considered terms which scored high in tests gauging factors such as longevity, usage, rate of entry, potential for conflict with existing brands etc. Donuts has created a proprietary algorithm to consider all these factors.

Mr. Tindal stated that Donuts views its business as being similar to a "content agnostic" Internet Service Provider that would offer consumers "great domains" in interesting spaces and that Donuts planned to implement trademark/brand protections beyond those required by ICANN.

Overall, Mr. Tindal estimated that Donuts would likely end up owning over 200 of the 307 gTLD strings it applied for.

Innovation in the New gTLDs

The final panel of the Congress focused on innovation and the future of the New gTLD space. Mr. Roland LaPlante, SVP and CMO at Afilias, Mr. Hal Bailey, Strategic Partner Manager at Google and Mr. Tim Switzer, CFO and COO of DotGreen, all spoke on the issue.

The panel first discussed the existing challenges to the New gTLD system. Mr. LaPlante offered that the biggest challenges would be government interference and attempts "to control the [Internet's] naming system" as well as how ICANN and consumers respond when TLDs fail or are decommissioned. He wondered how this type of "shock" to the DNS system might affect Internet stability.

From Google's perspective, Mr. Bailey explained that "consumer confusion" was a huge issue because the general public "doesn't know and doesn't care" about New gTLDs. Users will need to be moved over a "consumer confidence" barrier, a transition he believes can happen rapidly.

On the ICANN side, Mr. Switzer of DotGreen explained that many of the details regarding New gTLDs are "still in play," including important issues like the registry agreements. These would need to be resolved quickly in order for ICANN to meet its deadlines and will be important topics at the upcoming ICANN meeting in Beijing. He also noted that DotGreen has taken a hands-on approach to its involvement with the relevant "green" community by working with green organizations in local communities and creating "green" partnerships. This type of interaction will be an important part of making sure that DotGreen (and other gTLDs) connect with consumers.

The panel also discussed the future of the DNS system in a New gTLD world. Mr. LaPlante cited unpublished research commissioned by Afilias (to gauge potential consumer attitudes and behaviors) which seems to indicate a positive reception for New TLDs:

  • 78% of consumers are unaware of New TLDs
  • 39% of consumers would be likely to trust a .brand TLD
  • 28% of consumers would be less likely to trust a .brand TLD
  • 48% of consumers believe .brand security would be better than that of a generic TLD
  • 37% of consumers believe that not having a .brand suggests a company is lagging behind its competitors

Lastly, the panel agreed with Mr Bailey's assessment that there would be "a very large upward swing in technology over the next few years" and that the New gTLD system could bring innovation and fresh ideas to businesses.

Written by David Mitnick, President DomainSkate LLC

More under: Domain Names, Registry Services, ICANN, Top-Level Domains

A Look at Why Businesses Buy Cloud Services

CircleID posts - Wed, 2013-03-13 20:39

If you are a cloud provider, whether you are pure play or an internal IT department, it is very interesting to know who is buying cloud services, and why.

In a recent survey by PB7 sponsored by EuroCloud Netherlands and others, a group of Dutch companies was interviewed about their motivations and hesitations around cloud computing. The survey's results were quite a bit more interesting than the usual lot. In this article I have cherry picked a few observations from the larger survey. The full survey is reported on in http://www.slideshare.net/peterthuis/ecm12-ghgg (in Dutch).

The majority of companies, including government organizations, now use cloud computing. That adoption rate is no longer growing quickly; the growth is in the number of cloud applications being deployed (and presumably also in the number of users of those applications).

How does cloud computing fit business strategy? Companies change for a number of reasons and objectives, and cloud computing as a driver is no different. Some organizations innovate using cloud computing, but from the survey it appears most are just optimizing business process, or even just substituting current solutions.

Substitution happens when an existing solution is replaced by a cheaper one. Examples of these can be seen across the board. As you can expect from a wide survey, the most common applications are mail, messaging, document processing, sales, marketing, distribution and HR. One striking category, though, is field service, where a lot of adoption is going on. Inhibitors for these types of applications include the value of current investments ("the server in the closet has not been fully written off").

Optimization involves process change: doing things differently. This could involve people inside the organization as well as outside the organization. From anecdotal evidence, we know that collaboration tools are on the rise, in particular when they serve to communicate over organizational boundaries. Think procurement, project collaboration and marketplaces. These are the 'cloud native' apps, so to speak. The other category involves empowering the current workforce, especially if it is already mobile, a trend we see happening in airlines and retail. Cloud productivity solutions allow the inclusion of staff that was not equipped with computers before. This is clearly a big market for horizontal application suites such as Google Docs and Office 365. Vertical application areas include HR and e-learning.

The less predictable the workload, the bigger the advantage becomes that cloud applications have over non-cloud applications. About a tenth of the researched applications have a 'rapid growth' workload pattern, i.e. new applications, new business. For these categories cloud is by far the preferred solution.

These trends align very well with two important cloud characteristics: elastic scalability (especially from a financial perspective), and broad network access (anytime/anywhere/anydevice). Broad network access allows the inclusion of users that are not within the corporate firewall.

Infrastructure as a service (IaaS) is definitely on the rise across the board: small and large enterprises as well as governments. Penetration is expected to increase to 30 percent in 2014, a twofold increase in two years. Still, this is a lot less than the penetration of SaaS.

As the number of cloud applications per organization rises, integration concerns increase. From the survey, it appears cloud consumers are seeing three different avenues to address these concerns. They call for open standards, they turn to cloud brokers to do the integration for them, and they hope to see ecosystems such as app stores providing this integration for them.

Other concerns are security and privacy in general, though it is unclear to what extent these fears are actually translated into action. It is peculiar in this respect that only 40% of cloud users have a clear exit plan.

There are quite a few implications for service providers in these findings. The biggest demand for cloud services is for rationalizing existing IT systems, and if they are internal, expanding their use cases to include mobile employees and business partners. As an extension of these, inclusion of more people and partners can allow business processes to be reengineered. Partnering with consultants to help effect these changes might make sense.

Potential clients are concerned about integration and security risks. Conceivably, adequately addressing these concerns can be a selling proposition. For the mechanics of that, have a look at another article I wrote (see Can we simplify cloud security?). A lot of these concerns (including integration) are expressible in terms of the CSA Cloud Control Matrix (Disclosure: I updated some of these controls recently as a CSA volunteer).

If you are a cloud provider and wonder how to improve your offering, you may be interested in having a look at www.cloudcomputingundercontrol.com where I have outlined a Governance, Risk Management and Compliance roadmap.

Written by Peter HJ van Eijk, Cloud Computing Coach, Author and Speaker

More under: Cloud Computing

ITU Staff Gone Wild

CircleID posts - Wed, 2013-03-13 18:56

In virtually all governmental legislative bodies, the staff is there to provide secretariat services for the government representatives. The staff role does not include telling the representatives what decisions they should be making. The stricture is supposed to be the same at the International Telecommunication Union (ITU) for its treaty making activities.

It was with some amazement that, last week, the ITU secretariat staff showed up at a seminar in Bangkok they helped schedule — with a purported "ITU Presentation on WCIT-12 Outcome" to eighteen ITU Member States, attempting to sell them on accession to the International Telecommunication Regulations (ITRs) flatly rejected by 55 nations (G55), with 49 additional ones remaining undecided. In other words, fewer than half the ITU Member States have signed — a stunning adverse result unprecedented in the history of the organization.

The slick 36-slide presentation begins by misrepresenting ITU history, provides a completely one-sided view of the ITRs, and, at the end, includes an accession form as one of the slides — suggesting that Member States bind themselves to the ITR provisions! The last slide makes the incredible assertion that "the treaty provides a framework for the accelerated growth of ICTs at the national and international level, in particular to bring Internet access to the two-thirds of the world's population which is still offline, and to drive investment in broadband."

The reality here is quite at odds with the SnakeOil sales pitch. The ITRs — a treaty instrument constructed for the government-run electrical telegraph world of 1850 — are not exactly the right match for the hyper-dynamic, technology- and investment-driven world of 2013 shaped by the global marketplace. There was a reason why 55 ITU Member Nations flatly rejected the ITRs and walked out of the meeting. The entire instrument plainly does not comport with today's world of telecommunications and information systems, and accepting the associated ITR-12 obligations is tantamount to consigning a nation to Internet impoverishment. Reiterating the socio-political acronym "ICT" like an incantation, when ironically the term remains undefined, achieves nothing.

The ITU staff attempts to sweep the rejection of the WCIT-12 Final Acts under the carpet with the absurd assertion that it "compares with 1988 when 112 countries signed ITRs on the last day of the Melbourne conference." It does not. I helped run the secretariat for the 1988 conference for Secretary-General Butler. All the Member States present in Melbourne signed the Final Acts. None of them walked out of the conference. At that time, the ITRs arguably had some marginal justification. Today they are an anachronism and negative value proposition.

In addition to being a treaty that patently was not needed, the ITR provisions that emerged at WCIT-12 crossed two "red lines" for the G55. They vastly expanded the relatively narrowly compartmentalized scope and effect of the existing ITR provisions to include all electronic communication services and apply to essentially anyone connected to a network. Furthermore, most of the ITR provisions are essentially operational, contractual, and regulatory options that vary greatly among different countries, and purport to rely on activities in an ITU-T that today is essentially non-functional because the work is almost completely accomplished in other global industry venues and arrangements. Twenty-nine nations have reduced their ITU contributions, and few countries today participate in ITU-T activity. It is ludicrous to think that all these fundamental infirmities of the ITRs will somehow become unimportant for the non-signatory nations.

The ITU staff has scheduled another of the WCIT-12 sales shows in South Africa in July 2013. The action seems demeaning to the African nations as the ITU staff is not pursuing these events in Europe or North America where they would receive substantial opposition to their presentation assertions.

In large measure, the entire WCIT-12 debacle was induced and facilitated by ITU elected officials and staff that encouraged and manipulated almost everything surrounding the conference. They obviously have not stopped. Engaging in this behavior is inappropriate. When the first permanent secretariat for an ITU precursor organization was created — the Berne Bureau that serviced the needs of the signatories to the Convention télégraphique internationale de Paris (1865) et Règlement de service international (Paris, 1865) — it was made clear that the staff were not to become involved in substantive legislative work done by the Nation States among themselves. That important stricture was obviously lost over the past decade.

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC

More under: Internet Governance

ICANN Releases Guideline for Coordinated Vulnerability Disclosure Reporting

CircleID posts - Tue, 2013-03-12 19:31

ICANN has released a set of guidelines to explain its Coordinated Vulnerability Disclosure Reporting. The guidelines serve two purposes, says ICANN: "They define the role ICANN will perform in circumstances where vulnerabilities are reported and ICANN determines that the security, stability or resiliency of the DNS is exploited or threatened. The guidelines also explain how a party, described as a reporter, should disclose information on a vulnerability discovered in a system or network operated by ICANN."

Coordinated Vulnerability Disclosure refers to “a reporting methodology where a party (‘reporter’) privately discloses information relating to a discovered vulnerability to a product vendor or service provider (‘affected party’) and allows the affected party time to investigate the claim, and identify and test a remedy or recourse before coordinating the release of a public disclosure of the vulnerability with the reporter.”

Illustration of a Coordinated Disclosure Process – The roles and relationships of parties typically involved in a coordinated disclosure. Source: ICANN

More under: Cyberattack, Cybercrime, DNS, ICANN, Malware, Security

Security and Reliability: A Closer Look at Penetration Testing

CircleID posts - Tue, 2013-03-12 18:46

As noted in the earlier articles in this series (see part one, two and three), security and reliability encompass holistic network assessments, vulnerability assessments and penetration testing. In this post I'd like to go deeper into penetration testing; first, though, let's go back for a quick refresh before getting started.

There are three broad steps any organization can take with respect to security and reliability to get a handle on their current security posture, whether internal (corporate or "inside the firewall") or external (Internet or "outside the firewall"). These include a series of in-depth assessments that include network, vulnerability and penetration testing.

• Network Assessment – Network assessment is a broad term that might encompass a holistic view of an organization's Internet security posture both internally and externally. A network assessment can be tailored to specific security requirements for any organization, but ultimately the assessment will provide a baseline gap analysis and remediation steps to fill those gaps.

• Vulnerability Assessment – Once your baseline network assessment is completed, an organization may wish to perform periodic vulnerability assessments. Whether internal or external, vulnerability assessments can uncover critical gaps in security that may lead to credential leaks, intellectual property theft, or denial of service to employees or customers. A well-planned and well-executed vulnerability assessment should eliminate false positives, but it can never give an organization 100 percent confidence that a specific vulnerability cannot be exploited. Vulnerability assessments should be executed on at least a quarterly basis, but it's not uncommon for larger organizations to execute them monthly.

• Penetration Testing – The next and final step in assessing your organization's security and reliability is penetration testing. While I typically say that vulnerability assessments give you a "95 percent confidence level" that a vulnerability exists, penetration testing can give you 100 percent confidence that a specific vulnerability exists as well as show you how it can be exploited by attackers.

Now that we are all caught up, let's dive in to penetration testing.

What is a penetration test?

A penetration test typically follows a full vulnerability assessment, after you have identified systems with known or suspected vulnerabilities. The existence of vulnerabilities may be obvious, or may require exploitation to validate. By definition, penetration testing involves exploiting a vulnerability to prove its existence, or to expose other vulnerabilities, or even additional systems, not previously known or tested.

Once you've completed a vulnerability assessment, you must build an attack profile for penetration testing and then execute your attacks.

Step One: Attack Profile

In the attack profiling phase, you must conduct research on your vulnerabilities to determine the best tools to use to attempt exploitation. There are a plethora of commercial, free and open source penetration testing toolkits, including:

There are many more scripts and toolkits you might use for both vulnerability assessments and penetration testing, such as wireless discovery applications, packet capture applications, port scanners, etc. We'll cover some of the more common tools in future articles.

There are too many details to cover in this overview, but suffice it to say a penetration test engineer must understand the underlying operating systems, applications and protocols for the vulnerabilities they are trying to exploit.

Exploits may be common to a given application regardless of the platform (operating system and protocols), but they may also be a very specific combination of hardware platform, operating system, application, protocols, and even network elements to include routers, switches and firewalls.

The commercial toolkits listed above provide a good framework and automation for running exploits, but they all have many configuration parameters, variables and scripts related to very specific vulnerabilities that one must understand in order to execute an effective penetration test. To paraphrase a famous line from the movie Caddyshack, "be the exploit!"
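
As a minimal illustration of this profiling step, the sketch below (Python, standard library only) probes a short list of ports on a target and records whatever banner each service volunteers, which is the kind of raw data an attack profile is built from. The host, port list and timeout are placeholder assumptions rather than recommendations, it is nowhere near a substitute for the toolkits discussed above, and it should only ever be pointed at systems you are explicitly authorized to test.

```python
# Minimal port/banner probe for building an attack profile.
# TARGET, PORTS and TIMEOUT are illustrative placeholders; probe only hosts
# you are explicitly authorized to test.
import socket

TARGET = "198.51.100.10"          # hypothetical in-scope host
PORTS = [21, 22, 25, 80, 443]     # ports flagged during the vulnerability assessment
TIMEOUT = 3.0                     # seconds per connection attempt


def probe(host, port):
    """TCP-connect to host:port and return any banner the service volunteers,
    an empty string if the port is open but silent, or None if unreachable."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT) as sock:
            sock.settimeout(TIMEOUT)
            try:
                banner = sock.recv(1024)   # FTP, SSH and SMTP usually speak first
            except socket.timeout:
                banner = b""               # open but waiting for us (e.g., HTTP)
            return banner.decode("ascii", errors="replace").strip()
    except OSError:
        return None                        # closed, filtered or unreachable


if __name__ == "__main__":
    for port in PORTS:
        result = probe(TARGET, port)
        if result is None:
            print(f"{TARGET}:{port}  closed or filtered")
        else:
            print(f"{TARGET}:{port}  open  banner: {result or '(none)'}")
```

A banner that reveals a product and version only suggests candidate exploits; as noted above, the engineer still has to understand the underlying operating system, application and protocol before attempting them.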

Step Two: Attack Execution

Now, the real work begins. You may understand the vulnerability, and you may have your tools and scripts ready to execute and exploit it, but inevitably things won't go as planned. As with vulnerability assessments, you may have to adapt your profile because you find that a firewall or network ACL (access control list) is blocking communication in one direction, a given vulnerability cannot be exploited for unknown reasons, or operating system/application fingerprinting was inaccurate. There are many scenarios that may cause you to alter course and change tools or methods to attempt exploitation.
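
One practical way the "alter course" decision presents itself: a connection attempt that silently times out usually means a firewall or ACL is dropping your packets (filtered), while an immediate refusal means the host answered but nothing is listening on that port (closed), and the two call for different adjustments. The hedged sketch below, again with a hypothetical target and ports, simply separates those cases.

```python
# Sketch: separate "filtered" (silent drop, likely firewall/ACL) from "closed"
# (host answered with a reset). Host and ports are hypothetical; authorized use only.
import socket

TARGET = "198.51.100.10"
PORTS = [25, 445, 3389]
TIMEOUT = 3.0

for port in PORTS:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(TIMEOUT)
    try:
        sock.connect((TARGET, port))
        print(f"{port}: open -> proceed with the planned exploit attempt")
    except socket.timeout:
        # No SYN/ACK and no RST: packets are probably being dropped in transit.
        print(f"{port}: filtered -> adapt the profile (other port, pivot host, or protocol)")
    except ConnectionRefusedError:
        # The host answered with RST: reachable, but nothing is listening here.
        print(f"{port}: closed -> re-check fingerprinting; the service may live elsewhere")
    except OSError as exc:
        print(f"{port}: error ({exc}) -> investigate routing or name resolution")
    finally:
        sock.close()
```

In a real engagement, the next step for a filtered port might be trying the same service over a different protocol or from a different network position; for a closed port, re-checking the fingerprinting data first.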

In Summary

Penetration testing (and security on the whole) can be as much art as science, but hopefully this article rounds out our series on security and reliability and gives you some insight on the importance of including this as part of your organization's processes. Ultimately, you will gain confidence in assessing risks and determining which vulnerabilities should be considered real, requiring mitigation. This is the very best way to be prepared for real-time risks and attacks.

Written by Brett Watson, Senior Manager, Professional Services at Neustar

More under: Cyberattack, Malware, Security

EFOW Wants Total Protection for Geographical Indications Domains in .VIN, .WINE and All Other TLDs

CircleID posts - Tue, 2013-03-12 16:42

This is a letter sent from the European Federation of Origin Wines (EFOW) to the courteous attention of Dr Steve Crocker, Chair of the ICANN Board, Mr Cherine Chalaby, Chair of the new gTLD Program Committee Board, Mr Fadi Chehadé, CEO of ICANN and Mr Akram Atallah, COO of ICANN.

The letter, from EFOW President Riccardo Ricci Curbastro, was sent today to ICANN and is entitled "ICANN initiatives for the attribution of new generic top-level Internet domains — PDO and PGI wines' concerns".

The letter:

"Dear Madam, dear Sirs,

EFOW, the European Federation of Origin Wines, a Brussels based-organisation representing PDO (Protected Designation of Origin) and PGI (Protected Geographical Indication) wines towards European and international institutions, would like to bring to your attention a crucial issue for the safeguard of our sector concerning the attribution of new generic top-level domains (gTLD) by your organisation. We are concerned that this new procedure could lead to the abuses of our members' Intellectual Property Rights (IPRs).

As far as we are informed, at the current stage of the ICANN procedure, three private firms have applied to manage a new Internet domain ".wine" and another candidate has applied to manage the domain ".vin". Should registrars obtain these new gTLDs, they will be able to commercialise them and allow individuals and/or organisations to combine these gTLDs with a second-level domain name to create a personalised web address, as for instance "chianti.wine", "champagne.vin", "rioja.wine", "port.vin".

As you may know, Geographical Indications (GIs) are, according to the WTO TRIPs agreement, indications which identify a good as originating in the territory of a Member, or a region or locality in that territory, where a given quality, reputation or other characteristic of the good is essentially attributable to its geographical origin, for example, "Champagne", "Tequila", "Parma Ham" or "Roquefort". As such, these GI names, like trade marks, enjoy protection as IPRs at the international level and in all WTO Member States.

Considering the above, EFOW believes, as they stand, your organisation's rules on the new gTLD do not allow for the protection of GIs which are recognised IPRs. In fact, applicants will only have to abide by "specification 5" according to which operators shall prohibit the registration of country and territory names recognised by the United Nations or of their ISO codes in front of the extensions ".wine" and ".vin". Moreover, we are concerned that none of the four projects mentioned above, commit to the protection of GI wine names. Finally, we are also preoccupied by the fact that these projects envisage the possibility of registering "premium" domain names attributed by public auction to the highest bidders without any further specifications. ICANN's rules and these applications in their actual form thus raise serious concerns for our sector given that they could lead to abuses of GI names on domain names.

We would like to underline that EFOW is not opposed to the attribution of new gTLDs provided that ICANN and registrars provide for the protection of GIs. The current Trade mark Clearinghouse scheme is, however, not sufficient and does not respond to the needs of the GI wine sector.
Moreover, EFOW believes that the concessions referred to the second-level domains should be subjected to detailed rules to guarantee an efficient protection to European PDO and PGI wines and more generally to all GIs. More specifically, ICANN should develop a procedure that ensures that GI names cannot be reserved by third parties and enables organisations responsible for the protection of GIs to oppose the reservation of a domain name that consists of or contains the name of a GI through a procedure, e.g. an alternative dispute resolution (ADR). Furthermore, it also considers that authorisations to use the generic top-level domains ".wine" and ".vin" should be guided by the respect of European and International legislation on GI wines, which provide them with a strong protection, as clearly stated by article 23 of the WTO TRIPs Agreement on trade-related aspects of Intellectual Property Rights.

EFOW has already raised its concerns with relevant EU countries and would like to know whether ICANN intends to modify its procedures to allow GI right holders to have the same rights and guarantees as the ones given to trade mark owners.

We thank you in advance for taking into consideration our observations and would welcome an open discussion on this specific issue."

Will ICANN want to open the discussion and offer a better protection to PDO and PGI wines?

Written by Jean Guillon, New generic Top-Level Domain specialist

Follow CircleID on Twitter

More under: Domain Names, ICANN, Internet Governance, Policy & Regulation, Top-Level Domains

Categories: Net coverage

ICANN New gTLD Program SWOT Analysis: OPPORTUNITIES (Part 3)

CircleID posts - Mon, 2013-03-11 23:49

The SWOT analysis is a structured planning method used to evaluate the Strengths, Weaknesses, Opportunities, and Threats involved in a project or in a business venture (source Wikipedia).

OPPORTUNITIES

1. For Registries (new gTLD applicants: brands and non brands):

a) New gTLD applicants to sell domain names = earn cash. When Registrars of the Registry's accredited network sell 1,000 domain names at $5 each, the Registry earns $5,000;
b) Owning a Top-Level Domain is a monopoly situation. The applicant "governs" an entire market (worldwide) = earn cash;
c) Brand TLDs have until round 2 to take advantage of their string and expand their presence over their competitor(s) who did not apply. Their competitor(s) can just sit and wait for Round 2;
d) Brand TLDs may want to change their application in the future to sell domain names and earn cash. Most of the time, a contract can be changed at ICANN after it's gone through the "public comments" phase…
e) Possibility of intra group cash flow transfer (cf. Google or Facebook situation in France.) (Jean-François Vanden Eynde);
f) Possibility to benefit from innovations and to apply them (Jean-François Vanden Eynde);
g) Possibility to attract customers and to maintain them within your brand reach (if I find everything within .apple, why should I leave it if it could be my favorite page?) (Jean-François Vanden Eynde);
h) Possibility to perform some joint venture and associate my brand to a specific event or sport (Jean-François Vanden Eynde);
i) Possibility to hide my online strategy as I will be able to activate and deactivate domain names at any time (Jean-François Vanden Eynde);
j) Build customer trust and avoid cyber squatting (Jean-François Vanden Eynde);
k) Benefit from being a pioneer in my field (Jean-François Vanden Eynde);
l) Applicant will not be subject to cyber squatting under their own TLD (Jean-François Vanden Eynde).

2. For Service providers (Back-end registries, law firms...):

a) Back-end registry providers to earn much more cash, as most of their business model is based on the number of domain names sold by new gTLD applicants: the more domain names are sold, the more cash comes in. They will probably be the ones to earn most of the money from this first round.
b) Specialized law firms. The New gTLD Program will provide opportunities for a variety of different professions, most notably the legal profession. Trade mark attorneys have a unique opportunity to harness their expertise and expand their scope of services by providing enforcement solutions to trademark owners who seek to resolve domain name disputes that will arise with each new gTLD launch (Daniel Greenberg).
A huge market is opening to Law firms:
- The Trademark Clearinghouse: their clients will need to participate in Sunrise Periods and "protect" themselves. For this, they will need to understand how to do this and register in the TMCH.
- URS (Uniform Rapid Suspension Procedure): with so many new domain names, expect many infringements too.
- The same applies to other very specific ICANN procedures: complaints to ICANN, Objections, PICS procedure, RRDRP procedure, PDDRP procedure…
- Webinars.
c) Same goes for digital marketing. It will help companies to offer online services in general not only law firms (Jean-François Vanden Eynde);
d) New gTLD consultants: the ICANN new gTLD program's organized mess is an opportunity for consultants to hunt clients for round 2 with updates on procedures, prices, service providers, etc…
e) Entrepreneurs: many applicants from round 1 strongly believe they will sell millions of domain names... but most probably won't go above 50,000. The business model based on 2 to 3% of a population may not find the success expected, since most of the Registries think their Registrars are going to do the sales job. Many strings with potential have not been requested in Round 1; niche markets on small TLDs may be a better solution in Round 2 if applicants do the necessary field work upfront. Could a 50,000-domain-name Registry become a success in the future?
f) Possibility to extend services to a one-stop shop for round 2 (Jean-François Vanden Eynde);
g) Possibility to see newcomers building their own services. This will increase competition and help drop pricing in round 2 (Jean-François Vanden Eynde).

3. For ICANN:

a) ICANN has the opportunity to work with global regulatory bodies, including the United States Patent and Trademark Office, to define how ALL public roots will be regulated, not just ICANN. ICANN should work with global governments to establish top level domains as a legal business class so that trademark law can be applied to Top Level Domains. This would get ICANN out of the business of defining trademark protections for the Top Level Domain industry and back INTO the business of approving new registry operators. ICANN should not do both (Mary Iqbal).
b) Earn A LOT of cash;
c) ICANN will hopefully learn from its mistakes (Jean-François Vanden Eynde);
d) ICANN's way of working might change with the intervention of brand owners and IP lawyers who will defend their interests (Jean-François Vanden Eynde).

4. For Registrars:

a) More domain names to offer to their clients: every new gTLD launching is an opportunity to contact an existing client and sell him something;
b) Niche Registrars may develop: sports and wine Registries for example;
c) A distinction will also be made between b2b registrars and b2c registrars (Jean-François Vanden Eynde);
d) The TMCH could also become a golden egg for registrars (Jean-François Vanden Eynde);
e) Possibility for some of them to go more into consultancy and strategy (Jean-François Vanden Eynde);
f) Registrars might specialize because they might not be able to go for prepayment for all TLDs. We might see some niche registrars (Jean-François Vanden Eynde).

5. For Registrants:

a) An opportunity to develop online identities with precision using more descriptive domain names;
b) By introducing new TLD strings, a wide range of audiences can be reached using the DNS (Domain Name System), including communities with languages using non-Latin/Roman alphabets. IDNs (Internationalized Domain Names) are one example of being able to provide this expanded reach to the DNS (RJ Glass - AmericaAtLarge.ORG);
c) For Brands and other organizations, the possibility is offered to acquire a better (and nicer) domain name and then redirect the old .COM to it (e.g. château-latour.vin);
d) Brands are given the opportunity to secure their strings in the Trademark Clearinghouse for future launches so they can participate in all Sunrise phases and be alerted if someone intends to register their string as a domain name;
e) Registrants might get some free domains associated with a brand (e.g. jef@bmw if I buy a new car), with all associated services (Jean-François Vanden Eynde);
f) Benefit from innovation of registries (Jean-François Vanden Eynde).

6. For Domainers (they buy domain names to re-sell them):

a) Domainers are always the first ones informed about new launches and the best, most efficient way to acquire a domain name. With so many new domain names to acquire and so much money to spend, there will be opportunities to acquire generic domain names. Sports new gTLDs should interest domainers: city names alone could offer very good opportunities.
b) Domaining offers a serious return on investment for new gTLD applicants who need to earn cash fast. Domainers buy for speculative reasons: the intention is to buy at the cheapest price and sell at the highest.

7. For CyberSquatters (they buy domain names to try to re-sell them to owners with a prior right...or not):

a) Same as usual: they will continue to exist and proliferate as there is no better mechanism in place to block them;
b) A choice of domain names enlarged by the number of new gTLD applications.

To come: THREATS (Part 4)

Written by Jean Guillon, New generic Top-Level Domain specialist

More under: Top-Level Domains

Thinking Carefully About New gTLD Objections: Legal Rights (4 of 4)

CircleID posts - Sat, 2013-03-09 04:48

This last article on the four new gTLD objections will look at the Legal Rights Objection ("LRO"). While other articles in this series have touched on trademark concepts at certain points (see part one, two and three), issues from that area of the law predominate in LRO. Here we review the pertinent LRO-related trademark concepts, with which many readers likely will have some familiarity from working with domains and the UDRP. Still, the theme of the first three articles applies here: Potential objections are more involved and complicated than they may seem, and require careful thought if they are to be made.

Standing:

Analyzing any new gTLD objection always begins with standing — namely, whether the objector has the right to raise a claim in the first place. AGB 3.2.2. The LRO does not have standing requirements nearly as strict as those for community objections or as wide-open as those for limited public interest objections, but it does require an objector to establish its status as a "rightsholder." While there is no explicit designation of the exact "rights" needed for standing, the Guidebook's later section on the objection standards clearly refers to trademark — i.e., a name, brand, term or characteristic that recognizably identifies its individual or institutional owner, including celebrities who have rights of "publicity." See, e.g., Frampton vs. Frampton Enters, Inc., Case No. D2002-0141 (WIPO, Apr. 17, 2002) (peterframpton.com). [Note: The Guidebook also addresses the rights of an inter-governmental organization ("IGO"), but this involves a relatively narrow subset of potential objectors, so it will not be directly addressed here.]

Logistics and Cost-Planning:

Trademark owners and those with prior experience in UDRP procedures will see a familiar face with respect to the LRO: the World Intellectual Property Organization ("WIPO"). WIPO has a very informative FAQ page covering what would-be objectors need to know (along with model objection and response forms), and the ICANN Webinar of Mar. 6, 2013 contains additional information. Suffice it to say here that with the LRO both the filing and expert costs are levied on a flat-fee basis (recall that ICC Expert fees are billed hourly) and parties have the option to choose the size of the panel (either one or three experts). Myself, I always try to garner a three-member panel whenever that option is available. All documents are (as with the other three objections) submitted in English and with a limitation of 5,000 words (or 20 pages, whichever is less, excluding attachments).

Objection Standards:

Once an objector establishes standing under the relatively straightforward standard above, it must then tackle the merits of the objection. What must the objector show to prevail? Practitioners who are familiar with trademark and domain name disputes will recognize that the LRO standard incorporates elements employed in these areas. A LRO panel must determine whether an applied-for string:

  • Takes "unfair" advantage of the distinctive character or reputation of the objector's trademark;
  • "Unjustifiably" impairs the distinctive character or reputation of that mark; or
  • "Otherwise" creates an "impermissible" likelihood of confusion with the mark.

AGB § 3.5.2. Some of these buzzwords suggest both trademark dilution ("distinctive character" and "reputation") as well as infringement ("likelihood of confusion"). However, rather than delve into all the nuances of the multipart tests used by courts when analyzing these concepts — see, e.g., AMF Inc. v. Sleekcraft Boats, 599 F.2d 341, 348 (9th Cir. 1979) — simply consider this: is it sufficiently likely that a reasonable person will become confused into thinking that a term associates a particular string with one company when it really belongs to another? If not, then look at whether the mark at issue is really well-known as a brand and very unique. If the answer is not an (extremely) emphatic "yes" to either one of these questions, then pursuing an LRO is probably just wasted effort and expense.

So where does one look for help on these issues? Since ICANN has provided a number of helpful guidelines for LRO, we of course will go through those first.

Trademark Factors:

Section 3.5.2 of the Guidebook lays out a number of expressly "non-exclusive" criteria for determining whether an applied-for string creates a likelihood of confusion or will injure the distinctive nature or reputation of a mark. None will strike trademark or UDRP professionals as particularly unusual, although simplicity in concept can belie greater complexity in practice.

1. Similarity in Sight, Sound and Meaning

A LRO panel will examine whether the applied-for gTLD is identical or confusingly similar in appearance, sound or meaning to the objector's mark. For this test, consider the obvious:

Regarding sight, do the mark and the string look visually similar? Since TLDs do not involve graphic representations, colors, fonts and the like, the inquiry essentially comes down to spelling.

As to sound, say the mark and string out loud to yourself. Do they sound essentially the same? See Deutsche Telekom AG v. foxQ, Case No. D2004-0102 (WIPO, Mar. 7 2004) (UDRP complaint denied despite some sound similarity between domains t-online.com and d-online.com).

On meaning: Take into account what term(s) make up the mark and the string. Do they typically have only one connotation or several? This can matter a great deal when examining terms that have generic, descriptive or common dictionary meaning. See, e.g., Advertise.com, Inc. v. AOL Advertising, Inc., 616 F.3d 974, 978-979 (9th Cir. 2010)(advertising.com)(copy of court decision available at: http://1.usa.gov/ZyaxQI); see also Hasbro, Inc. v. Clue Computing, Inc. 66 F. Supp. 2d 117, 133 (D. Mass. 1999) (clue.com).

Simple, right? Well, lawyers have come up over the years with complications even to these apparently straightforward inquiries. Among other things:

  • Don't look at two marks together; consumers typically view them separately in the marketplace. Union-Carbide v. Ever-Ready, 531 F.2d 366, 382 (7th Cir. 1975). They also do not recall marks as well as triers of fact who see them continually over days of legal proceedings.
  • Consider the trademark and string each as a whole; do not dissect either into components. While courts occasionally give "dominant" components slightly more weight, they do not do so with more generic or descriptive portions of claimed marks — see, e.g., Gateway 2000, Inc. v. Gateway.Com, Inc. 1997 U.S. Dist. LEXIS 2144 (E.D.N.C. 1997) (no likelihood of confusion between "GATEWAY2000" and gateway.com).
  • For dilution, a claim may cut a wider swath because the goods or services need not directly compete, but this typically requires greater proof of similarity. See Ringling Bros.-Barnum & Bailey Combined Shows Incorporated v. Utah Div. of Travel Dev., 170 F.3d 449 (4th Cir. 1999) (no dilution between non-competitors' "greatest SHOW on earth" and "greatest SNOW on earth" phrases).

2. Objector's Bona Fide Acquisition and Use of Trademark Rights

A LRO panel also will consider whether the objector legitimately acquired or has used the mark at issue. UDRP colleagues often examine the efficacy of the complainant's asserted rights, including for lack of protectability, lack of authorization or fraudulent procurement.

3. Public Sector Recognition

To what extent does the relevant sector of the public recognize the rightsholder's mark? As in trademark litigation, this may require survey evidence.

4. Applicant's Intent in Applying for the gTLD

This fourth element calls to mind the "bad faith" factor in UDRP and cybersquatting cases. It asks whether the applicant knew of the objector's mark at the time of applying for the gTLD, or whether the applicant has a "pattern" of acquiring or operating domains confusingly similar to the marks of others. Of course, this applies only to protectable marks, and not to dictionary or generic terms, where intent to confuse either cannot be inferred or is legally meaningless. See, e.g., Hero, Inc. v. The Heroic Sandwich, Case No. D2008-0779 (WIPO, Aug. 13, 2008) (hero.com).

5. Applicant's Bona Fide Use

This factor allows a panel to evaluate the extent to which the applicant has used (or demonstrably prepared to use) the gTLD corresponding to the objector's mark. While UDRP cases often feature the registration (and little else) of a second-level domain, an applicant for a new gTLD can cite substantial investments in time, resources and money preparing the application and putting backend technology into place. Moreover, using a common word as a domain for its inherent value in attracting internet traffic, even if that term happens to correspond to another's trademark, does not violate the other's rights. Mobile Communication Service Inc. v. WebReg, RN, Case No. D2005-1304 (WIPO, Feb. 24, 2006) (mobilcom.com transfer denied).

6. Applicant's IP Rights

Next, a panel looks at "whether the applicant has marks or other intellectual property rights in the sign corresponding to the gTLD, and, if so, whether such acquisition and use of the sign has been bona fide, and whether the likely use of the gTLD by the applicant is consistent with such acquisition or use." This would pertain to "Dot Brand" TLDs, but not to the many new gTLD applications for generic terms, as these serve no source-identifying function. See Image Online Design v. ICANN, 2013 U.S. Dist. LEXIS 16896, 22-24 (C.D. Cal. 2013) (quoting Advertise.com and finding no protectable interest in plaintiff's alleged .WEB "trademark" — which was also the subject of a prior TLD application) (copy of court decision available at: http://bit.ly/15CWCwV).

7. Applicant Commonly Known by Applied-For TLD

Taking another page from the UDRP playbook, this factor examines whether and to what extent a TLD applicant has been "commonly known by sign corresponding to the TLD and if so, whether any purported or likely use of the gTLD by the applicant is consistent therewith." This will entail case-by-case analysis since, for example, under the UDRP respondents have become "commonly" known by their domain names simply by having formed a company and done business under that name.

8. Likelihood of Confusion

Likelihood of confusion appears both in traditional trademark and domain name disputes, although not in as much detail in the latter setting. Many of the factors above go into a typical judicial likelihood of confusion analysis. Without repeating myself, I simply suggest keeping practical considerations in mind. Among the most important: evidence of actual confusion serves as one of the strongest indicators of a likelihood of confusion, whether from customer service inquiries or other "real-world" data points, or from consumer surveys conducted specifically for the dispute.

9. Other Considerations

I have noted a number of proverbial "bumps" in the objector's "road" above when discussing the Guidebook's specific LRO criteria. However, other overarching considerations may also be in the back of the mind of a panelist seasoned in trademark and domain name-related matters, and since the LRO factors are explicitly referred to as being "non-exclusive" there would seemingly be no reason not to take them into account.

Fair Use and Free Speech: First, how do free speech-related defenses such as "fair use" or "nominative use" fit into the LRO context? For readers who are not familiar with the concepts, trademark "fair use" typically involves a mark that is capable of describing the goods or services offered under that mark. Although one may own trademark rights in a descriptive term in certain contexts, "such rights will not prevent others from using the word ... in good faith in its descriptive sense, and not as a trademark." Car-Freshner Corp. v. S.C. Johnson & Son, 70 F.3d 267, 269 (2d Cir. 1995). "If any confusion results to the detriment of the markholder, that was a risk entailed in the selection of a mark with descriptive attributes." Id. at 270. Holders of trademark rights in these types of terms would likely face an uphill battle challenging their use in a New gTLD in a generic or descriptive sense.

On the other hand, "nominative" use comes up in situations involving "fair comment" about or criticism of something else. See New Kids on the Block v. New Amer. Pubs., 971 F.2d 302, 308 (9th Cir. 1992). It seems doubtful that an applicant would shell out $185,000 for a new gTLD simply for the narrow purpose of commenting on or referring to something else, making a nominative use defense unlikely to arise.

The Generic Nature of Many TLDs: Similar to the rationale described above involving "fair use" and freedom of speech, would-be objectors should know that, prior to the new gTLD program, courts have held top-level domains as being too generic to even be capable of serving as a trademark in the first place. Cases (in the U.S., at least) uniformly hold that adding a ".com" TLD to an otherwise common word will not confer trademark rights in the combined term. "Because TLDs generally serve no source-indicating function, their addition to an otherwise unregistrable mark typically cannot render it registrable." In Re: Oppendahl & Larson, 373 F.3d 1171, 1174 (Fed. Cir. 2004) (patents.com). See also Image Online, supra, 2013 U.S. Dist. LEXIS 16896 at 22-24 ("the mark ".WEB" used in relation to Internet registry services is generic and cannot enjoy trademark protection"). Accordingly, expect to see LRO filings being mostly limited to just very unique, highly distinctive "dot-brand" gTLDs and not for generic "dictionary" words.

No Per Se Dilution For Domain Names: Even holders of marks that are considered "famous," "well-known" or having a "reputation" may encounter difficulty relying solely upon a dilution claim rather than likelihood of confusion in mounting an LRO challenge. By way of example, courts in the U.S. have steadfastly refused to impose a "per se" (i.e. "blanket") rule for dilution in domain names. "Ownership of a famous mark does not result in automatic entitlement to ownership of the mark as a domain name." Nissan Motor v. Nissan Computer, 2007 U.S. Dist. LEXIS 90487, 45, citing Hasbro, supra, 66 F. Supp. 2d at 133. Trademark-savvy LRO panelists will no doubt take heed of the potential for abuse in dilution claims and instead require the higher likelihood of confusion threshold to be met. See, e.g., Clue.com, supra, at 135, quoting 3 J. Thomas McCarthy, McCarthy on Trademarks and Unfair Competition § 24:114 (4th ed. 1996) ("The dilution doctrine in its 'blurring' mode cannot and should not be carried to the extreme of forbidding the use of every trademark on any and all products and services, however remote from the owner's usage").

Conclusion

While the LRO would appear to have a somewhat greater likelihood of success than the other three objection types in situations involving particularly distinctive and very well-known marks, the same does not hold true for the entire new gTLD landscape, which is populated by a number of applications for generic and descriptive strings. In the latter scenario, the hill is every bit as tough to climb. Rightsholders would be wise to look carefully for real trademark protection rather than just descriptive uses of their brands in New gTLDs before embarking on the LRO path.

This wraps up my series on new gTLD objections, and I hope that everyone has found them helpful and informative. As always, please feel free to reach out to me at any time with any questions.

I'll see you all in Beijing!

Written by Don Moody, Domain Name & IP attorney in Los Angeles, co-founder of New gTLD Disputes

More under: Domain Names, ICANN, Internet Governance, Law, Policy & Regulation, Top-Level Domains

Mishandling the Registrar Contract Negotiations

CircleID posts - Fri, 2013-03-08 21:28

By publishing a draft Registrar Accreditation Agreement (RAA) for public comment before it has been agreed on by both parties, has ICANN dealt the bottom-up multi-stakeholder model a blow?

ICANN Staff and the registrars have been negotiating a new version of the RAA for the past 18 months following requests by Law Enforcement Agencies (LEA) such as Interpol for greater consumer protection.

With both ICANN and registrars working hard, by early this year agreement had been reached on 11.5 of the 12 LEA "asks".

A deal looked close.

Then at the last minute, ICANN threw extras at the proposed RAA, including an extraordinary provision for the ICANN Board to be able to force unilateral changes into the RAA at any time.

Imagine signing a contract with someone where that person can change the contract at any time, without your input, and you are bound by those changes. Crazy, right? Even crazier in the ICANN world, built as it is on the premise of bottom-up consensus, not top-down "we'll change your contract when we damn well feel like it" tactics.

Here's one we prepared earlier

Still, disagreement in negotiations is no big deal. Surely both sides just have to keep talking and iron out the differences, right?

Not when one side tries to push its way forward by publishing a draft agreement and portraying it as the result of these negotiations, even though that's clearly not the case and the other side has asked that this not be done.

This is what's happened today, with ICANN putting the current draft RAA out for public comment. "Given the agreement in principle over so many areas, there were two paths forward: continue negotiations to address points that have been raised multiple times by each side, or put the agreement out to the community now for public input on the finalization of the agreement," says ICANN in a statement issued with the draft RAA. "After the long period of negotiations, as well as the import of the 2013 RAA to the New gTLD Program, ICANN feels that it is very important to take the RAA proposal to the community."

At least no-one can accuse ICANN of not saying it like it is!

The registrars have put out their own statement decrying the way ICANN has handled this. "All of the items that have been agreed to over the past 18 months would, by themselves, produce an RAA that is vastly improved over the current 2009 version. Nearly all of the Law Enforcement requests that were endorsed by the GAC have been included, as well as the major items that were requested by the GNSO. That RAA would bring registrant verification. That RAA would bring enhanced compliance tools. Registrars must emphasize that the key differences between that RAA and the one currently proposed by ICANN are not issues raised by Law Enforcement, GAC or the GNSO but by ICANN staff (underlined in the original statement)."

Cart, horse, in that order

It appears staff have been driven to put the cart before the horse by Fadi Chehadé's desire to wrap the RAA issue up.

Chehadé named the RAA as one of his key deliverables when he formally took office late last year. Since then, he has surprised the community by introducing new requirements in the contract new gTLD registries will have to sign. Among them, the obligation to only use registrars that have signed the 2013 RAA. In other words, under Chehadé's instructions, ICANN is attempting to tie down new gTLD operators to a registrar contract that is still being negotiated.

No wonder Chehadé wants these negotiations done sooner rather than later. Registrars feel this "surprise announcement that all new gTLD registries must only use registrars that have signed the 2013 RAA" is nothing more than "a transparent effort by ICANN to arbitrarily link the new gTLD program to the outcome of RAA negotiations." If enacted, they fear the requirement would create separate classes of registrars. "This is unprecedented in the DNS industry," they say. "There can and must be only one meaning of 'ICANN-Accredited'".

Worse than this, registrars feel the attempt by ICANN to give its Board power to unilaterally amend the RAA could affect the multi-stakeholder model as a whole.

"ICANN insisted on including a proposed Revocation (or "blow up") Clause that would have given them the ability to unilaterally terminate all registrar accreditations," registrars explain in their statement. "After major pushback, ICANN staff relented and in its place proposed giving the ICANN Board the ability to unilaterally amend the RAA. This is identical to what ICANN inserted into the proposed new gTLD registry agreement — a clause met with strong opposition not only from the Registry Stakeholder Group but from the broader ICANN community."

So this is the real blow-up clause. "The effect of such a clause in the primary agreements between ICANN and its commercial stakeholders would be devastating to the bottom-up, multi-stakeholder model," the registrars argue. "First, it will effectively mean the end of the GNSO's PDP, as the Board will become the central arena for all controversial issues, not the community. Second, it creates an imbalance of authority in the ICANN model, with no limits on the scope or frequency of unilateral amendments, and no protections for registrars and more important registrants."

Red alert

I founded and ran a registrar for more than a decade. Today, as a consultant to the domain industry, I represent a registrar (NetNames) in the Registrar Stakeholder Group. Clearly, I am biased towards the registrar point of view in this debate, and this is probably the way some will read this article.

But others know I am first and foremost a passionate defender of the multi-stakeholder model.

Through two years of chairing the GNSO (up until last October), I always sought to defend that ideal when it was put under pressure.

And since stepping down as Chair and getting my voice back, when the model is attacked I have spoken out to defend it. When Chehadé launched headfirst into the Trademark Clearinghouse discussions, I warned of the dangers of this approach and was impressed when he later recognised he may have been a little too hasty.

With what is happening now on the RAA, isn't it time to sound the alarm bells once again? Chehadé seems to be adopting a "Janus approach" to solving ICANN issues. Publicly, he is engaging and energetic, and I have gone on the record saying how much good I think the new CEO is doing for ICANN's image worldwide.

But in his more direct dealings with ICANN's constituencies, Chehadé seems to think that the end justifies trampling the model.

Sure, ICANN has its problems, and sure, everyone can only welcome a determined leadership approach to solving them. But for the unique governance experiment that is ICANN, this bidirectional approach risks making the cure look worse than the disease.

Written by Stéphane Van Gelder, Chairman, STEPHANE VAN GELDER CONSULTING

Follow CircleID on Twitter

More under: Domain Names, ICANN, Internet Governance

Categories: Net coverage

Google Bows to Pressure on Closed Generics

CircleID posts - Fri, 2013-03-08 20:15

The debate surrounding "closed generics", which has been covered several times in the past, has attracted a lot of attention in recent weeks.

At the centre of the debate were a number of new TLD applications from large companies including Google, Amazon and others.

Google had stated that they planned to establish a number of domain extensions and operate them as "walled gardens". At the ICANN public meeting in Toronto, Google attempted to defend their plans, and until today their position was unchanged.

However, in their submission to the comment period on "closed generics" this evening, it is obvious that they have been forced to reconsider that position in relation to some of their applications, though their overall view remains unchanged:

After careful analysis, Google has identified four of our current single registrant applications that we will revise: .app, .blog, .cloud and .search. These terms have been identified by governments (via Early Warning) and others within the community as being potentially valuable and useful to industry as a whole. We also believe that for each of these terms we can create a strong set of user experiences and expectations without restricting the string to use with Google products.

With this in mind, we intend to work with ICANN, the Government Advisory Committee (GAC), and other members of the relevant communities to amend our applications with new registration policies (and, in some cases, new registry services) to achieve these aims. Details of these plans will be forthcoming in the near future.

How that will translate into a policy and whether or not they will actually be granted the ability to run the domain name registries for these domain extensions remains to be seen, but the quite dramatic change in their position is welcome.

You can read their full submission here.

Written by Michele Neylon, MD of Blacknight Solutions

Follow CircleID on Twitter

More under: Top-Level Domains

Categories: Net coverage

Time to Take Stock: Twelve Internet and Jurisdiction Trends in Retrospect

CircleID posts - Thu, 2013-03-07 21:13

With the growing tension between the cross-border Internet and the patchwork of national jurisdictions, it becomes crucial to keep track of key global trends that drive the debate on appropriate frameworks.

One year ago, the Internet & Jurisdiction Project initiated a global multi-stakeholder dialogue process on these issues. To provide a factual basis for such discussions, it established an Observatory, supported by a network of selected international experts, to detect and categorize relevant cases via an innovative crowd-based filtering process in order to identify high-level patterns.

The following twelve trends are based on an evaluation of the first edition of the Internet & Jurisdiction Observatory case collection that features the 220 most important cases of 2012.

* * *

I. THEMATIC TRENDS

Pacesetter: National copyright enforcement

The cross-border Internet naturally challenges the geographic nature of Intellectual Property Rights. As illustrated by national ISP blocking of torrent libraries, graduated response schemes and proposals for multilateral cooperation treaties, copyright has become a major pacesetter for the enforcement of national jurisdiction over the Internet. Several proposed measures raised significant human rights and privacy concerns, as exemplified by the SOPA/PIPA bills and the Anti-Counterfeiting Trade Agreement (ACTA), which was rejected in the EU's jurisdiction in July 2012.

Cloud-based services: Global platforms versus local privacy laws

Different conceptions of online privacy clash as states and sub-national authorities increasingly try to enforce their laws on cross-border platforms. Local standards can extend globally if the operator of a platform is established within the territory of a given jurisdiction. Thus, the US Federal Trade Commission and the "Sponsored Stories" Facebook settlement in California de facto determine opt-out and consent privacy rules for all international users. At the same time, a growing number of states demands local compliance: In the EU, privacy commissioners examine Google's 2012 Terms of Service changes and Facebook deleted all facial recognition data of EU users in reaction to an audit by the Irish privacy watchdog and investigations by a regional Data Protection Authority in Germany.

Hate Speech: Viral outbursts and digital wildfires

In the absence of appropriate cross-border standards for takedown norms and procedures, viral online outbursts and "digital wildfires" of hate speech across multiple jurisdictions have become a major concern. The Innocence of Muslims video on YouTube, and "doctored images" that caused unrest in Indian regions, showed that solutions like entire platform blocks via national ISPs can be disproportionate and do not take the granularity of online content into account.

In Search of Standards: Defamation and libel tourism

Prominent online defamation cases are on the rise, while criteria for liability, publishing locations and adjudicatory jurisdiction remain vague. In Australia, Twitter was directly sued as the publisher of a defamatory tweet that was sent by one of its users. In the UK, a citizen of New Zealand won a Twitter defamation case against an Indian citizen residing in England, and a former British politician took action against 10,000 Twitter users who tweeted or retweeted a false rumor. Moreover, a bill in the Philippines and demands by Swedish authorities indicated the growing trend of criminalizing online defamation.

II. TRANSBOUNDARY IMPACTS OF SOVEREIGNTY

Still Neutral? The DNS as Content Control Panel

There are attempts to leverage the Domain Name System (DNS) layer to enforce national jurisdiction over foreign online content when the DNS operator is located within a state's territory. A US court ordered VeriSign, the manager of .com, to take down the Canadian bodog.com site. The US Immigration and Customs Enforcement (ICE) seized without a court order the .com and .org domains of the Spanish link library Rojadirecta because the domains had been bought through a US registrar, although the site had been declared to operate legally by courts in the Spanish jurisdiction. ICE subsequently released these domains without explanation. This potentially causes transboundary impacts of national sovereign decisions.

Limitless Sovereignty? Jurisdiction over foreign citizens

Extraterritorial extensions of jurisdiction over foreign citizens are rising in the absence of clear competence criteria. In California, a series of similar copyright cases was divided between two judges. They disagreed on having personal jurisdiction over an Australian resident. The actions were filed by a Korean rights holder, which argued that the defendant's use of US-based social media platforms constituted a sufficient connection to the American jurisdiction. Are there limits to the exercise of sovereignty over a shared common infrastructure?

III. FRAMEWORKS AND PROCEDURAL INTERFACES

National Laws vs. Platform Rules: The role of Terms of Service

Terms of Service provisions regarding freedom of expression, defamation or privacy increasingly morph into the law of "digital territories". Tensions arise, as Internet users are both subject to the laws of their jurisdiction and to the rules of the platforms they use. In Brazil, Facebook deleted the account of a topless female rights protestor for infringements of its Terms of Service. Meanwhile in the US, Twitter refused to disclose the identity of Occupy tweeters to authorities since its Terms of Service specify that the company does not own tweets.

Lack of Interoperability: Procedural interfaces and MLATs

Enforcing territorial sovereignty can carve up the Internet. Due process for takedowns, seizures or LEA access to private data emerges as a major concern for all stakeholders, but viable interoperability frameworks to manage the Internet commons do not yet exist. In search of solutions to handle state-platform interactions, India called for a dispute resolution forum attached to the UN after local riots were triggered by online content. Pakistan claims to be obliged to continue the DNS block of the entire YouTube site for one objectionable video, due to the lack of appropriate procedures in the absence of an MLAT regime with the US.

IV. TECHNOLOGIES AND TOOLS

Data Territoriality: The Location of servers matters

Despite the global availability of most cloud-based platforms, the location of their data centers matters. Thus, US authorities seized the file locker Megaupload via its US-based servers, although the Hong Kong-based platform was operated by a German citizen residing in New Zealand. Equally enforcing national jurisdiction over servers, Ukrainian authorities shut down a platform operated from Mexico. Wikipedia explained that it does not operate servers in the UK because of certain jurisdictional risks due to strict local defamation laws.

Localizing the Internet: Geo-IP filtering and cc-TLD migration

Facing difficulties in simultaneously respecting 192+ national laws, cross-border platforms create "localized experiences" to comply with territorial laws. Twitter developed a tool to block unlawful content in certain jurisdictions through geo-IP filtering and used it for the first time to block a Nazi account in Germany. Google's Blogspot uses the DNS and launched an automatic cc-TLD redirection scheme to prevent cross-border impacts of local compliance on platform users from other jurisdictions.

Cybertravel: The legality of proxies and VPNs

The ability to freely cross jurisdictional borders on the Internet becomes challenged, as states strive to enforce local laws online. The spread of ISP domain blocks and geo-IP filtering increases the use of VPNs and proxies to circumvent national access limitations. Whereas a New Zealand ISP offers "US Internet" by default, cybertravel technologies become increasingly contested or criminalized as China, Russia and Iran target VPNs. Likewise, The Pirate Bay proxies in the UK and the Netherlands are being shut down.

Notice and Staydown: The rise of automated filters

Courts increasingly demand the use of automated filters on cross-border platforms to ensure that content complies with local jurisdictions, especially in cases where the same or similar infringing content is uploaded again. An Argentinean judge ordered Google to "permanently" remove defamatory pictures of a model. Concerning copyright, views diverge as a German court ordered YouTube to develop a notice-and-staydown mechanism for protected songs, while a French court ruled that upload-filters are unnecessary.

Download the "2012 in Retrospect" Case Collection

Written by Paul Fehlinger, Manager of the Internet & Jurisdiction Project

Follow CircleID on Twitter

More under: Internet Governance, Law

Categories: Net coverage

Russia Restricts U.S. Fiscal Sovereignty Using an ITU Treaty?

CircleID posts - Thu, 2013-03-07 19:57

It seems outlandish. However, as incredible as it may seem — especially in these times of sequestration and dire Federal budget cuts — the U.S. has potentially fallen prey to a ploy, hatched by Russia and its allies and artfully carried out at a 2010 ITU treaty conference, that would see the nation relinquish its sovereign right to choose its own ITU membership contributions. Here is how it happened and what can be done about it.

In December, as just about everyone knows, Russia and friends used the WCIT, a treaty conference of the U.N. specialized telecommunications agency known as the ITU, to try to impose far-reaching new regulatory controls on the Internet and just about everything else relating to new media and content. The gambit largely fell apart as the U.S. and 54 other nations walked out of the meeting.

Over the ensuing months since the WCIT, the mischief has continued, if not become worse. Almost everyone except a handful of countries has shunned the ITU's continuing telecommunications standards meetings. No one needs an intergovernmental body playing U.N. politics and cluelessly meddling in the development of technical standards that industry handles elsewhere through numerous private-sector organizations and arrangements in a highly competitive global marketplace. The reality, however, is that these problems with the ITU-T had been getting increasingly worse over the past decade. They have simply become much greater now after the WCIT.

As a result of these continuing machinations, not surprisingly, some people in Washington are contemplating joining many of the other nations that walked out of the December treaty conference in scaling back the money they voluntarily elect to give to the ITU as membership fees. Even though each of the 193 nations in the ITU gets an equal vote on everything, their financial contributions range from a high of 30 Contributory Units to a low of 1/16. Indeed, 61% of ITU member nations pay a ¼ Unit or less, but their vote equals that of the U.S. in the ITU. Nearly 25% of ITU Member States are only at the 1/16 level, and even then many are in arrears. The U.S. and Japan are the only two countries that, many years ago when the ITU was worth something, elected to give the greatest amount of money — 30 Contributory Units, currently equating to about ten million dollars a year.

Over the past six years, an increasing number of nations became fed up with the negative value proposition of the ITU and its leadership, and began significantly cutting back their funding of a broken organization. Twenty-nine countries have reduced their contributions. Some of the reductions were especially embarrassing. France — which gave birth to the organization and was its strongest supporter for 145 years — dropped 17%. Germany — which was a major contributor to its technical work for decades — also dropped 17%. Switzerland — which has been the host country for the organization since its inception — dropped back 33%! The UK — which like France and Germany played leadership roles in the organization for decades — also dropped 33%. The list goes on and on — Australia (-13%), Belgium (-20%), Italy (-25%), the Netherlands (-38%), Sweden (-38%), Finland (-40%), Hungary (-50%), Denmark (-60%), Lithuania (-75%), … Essentially all of these nations also save considerable money by not sending government delegations to the meetings — something the U.S. should consider emulating.

So about two years ago, in late 2010, at the ITU's quadrennial treaty conference to review its Constitution and elect its leadership, Russia and a number of other countries that contribute minor amounts of the ITU's budget but engage in most of its mischief quietly pulled a "fast one." Using the U.N. one-country, one-vote process, they managed to amend the ITU's Constitution so that any nation paying the larger contributions could not reduce its funding "by more than 15 per cent of the number of units chosen by the Member State for the period preceding the reduction, rounding down to the nearest lower number of units." It is known as the ITU Constitution Paragraph 165 Amendment, and apparently most people at the conference didn't even know it was adopted.

As a result of the Amendment, if the U.S., which currently pays the maximum of 30 Units, wants to cut that amount in half, it would have to do so over the next five ITU Plenipotentiary Conferences — a minimum period of 20 years! Thus, in these times of fiscal crisis in the United States, if the nation wants to significantly cut back funding of a broken U.N. body as most allies have already done, it is constrained by the ITU Paragraph 165 Amendment. If a nation fails to take a reservation to the Amendment, the nation ends up signing away its fiscal sovereignty. Rather embarrassingly, this sovereignty straitjacket on the U.S. was put in place by the very nations working against U.S. interests in the organization and paying far less.
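
To make that arithmetic concrete, here is a rough back-of-the-envelope sketch. It assumes one reading of the clause quoted above, namely that each Plenipotentiary Conference (held every four years) permits a cut of at most 15% of the current number of units, rounded down to a whole unit:

```python
# Rough sketch of the Paragraph 165 constraint, assuming each Plenipotentiary
# (held every four years) permits a cut of at most 15% of the current units,
# rounded down to a whole number of units. This interpretation is an assumption.
import math

units = 30          # current U.S. contribution
target = 15         # half of today's level
conferences = 0

while units > target:
    max_cut = math.floor(units * 0.15)   # 15% of current units, rounded down
    units -= max(max_cut, 1)             # assume at least one unit can be dropped
    conferences += 1
    print(f"After Plenipotentiary {conferences}: {units} units")

print(f"Roughly {conferences * 4} years to halve the contribution")
```

Under that reading, the sequence runs 30, 26, 23, 20, 17, 15: five conferences, or about 20 years.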

Fortunately, when the U.S. signed the treaty instrument that adopted this misbegotten amendment in November 2010, although it failed to take a reservation to the Paragraph 165 Amendment, it did state a right to make further declarations to limit its obligations. In addition, it also appears as if the ratification process has not yet gotten underway, and the matter still resides in the Legal Advisor's office of the State Department. So what needs to occur is for the U.S. to institute an additional declaration rejecting the Paragraph 165 Amendment. One would think the White House if not Congress might want to add this matter to the U.S. budget agenda and act quickly.

An additional desirable step will be for the U.S. to then join the long list of other nations in significantly cutting back their contributions — say joining the UK and Switzerland in dropping back a third, from 30 to 20 Units. The ITU's small group of lobbyists and apologists in Washington — who love travel to Ville-Genève for the mind-numbing ITU meetings — will suffer angst over the reduction. However, this action is the right measured response that many U.S. allies have already taken. The 1/3 reduction continues to recognize the residual value of the ITU's Radio Sector. The reduction is also the right response for a nation seriously attempting to deal with its mounting debt, and with the many billions already cut from U.S. domestic needs that are far more pressing than nonsensical U.N. activity that just about everyone agrees is not needed and is adverse to almost everyone's interests.

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC

Follow CircleID on Twitter

More under: Internet Governance, Policy & Regulation

Categories: Net coverage

"Open" or "Closed" Generic TLDs: Let the Operators Decide

CircleID posts - Thu, 2013-03-07 19:00

(The following is an edited version of comments I submitted to ICANN regarding "closed" generic TLDs.)

On February 5th, ICANN solicited comments on whether ICANN should determine the circumstances under which a generic TLD operator is permitted to adopt "open" or "closed" registration policies. Allowing gTLD operators to make these determinations, as opposed to ICANN, will promote innovation on the Internet to the benefit of consumers.

In order to bring the benefits of a competitive TLD market to consumers, ICANN should generally take as light-handed a regulatory stance as possible, as long as it meets its technical responsibilities. A light-handed regulatory approach is consistent with the policy of relatively open entry into the TLD space that ICANN has adopted.

A benefit of the new gTLD program, in addition to providing competition to incumbents, is the ability of the entrants to develop new business models, products and services. Historically, gTLDs have been open, and arguably that openness benefited the growth of the Internet. But at this stage of the Internet's development, adding new options to the status quo is more likely to unleash new forms of innovation. Closed gTLDs may be a promising source of innovations that have not thus far been possible to implement or even envision. Closed gTLDs may, for example, be a way to provide services with enhanced security. No one can know what innovations might be blocked if ICANN generally required gTLDs to be open. In short, adding new open gTLDs is likely to create benefits, but the returns to adding completely new types of gTLDs are potentially much larger.

New gTLDs are valuable economic assets. ICANN should adopt policies that assure that these assets are allocated to their most highly valued uses. ICANN's decision to use an auction when there are multiple applicants for the same gTLD will further that goal. The bidder who believes its business model will be the most profitable will win the auction and the right to operate the gTLD. When there is only a single applicant, that applicant presumably represents the highest-valued use of the gTLD.

The best use of a gTLD can change (e.g., from closed to open) if the initial business model isn't successful or if economic conditions change. This change can be effected either by the current operator or by a transfer of the gTLD to a new operator, subject to ICANN's review. In this way, gTLDs can continue to move to their highest-valued uses over time.

The dangers of ICANN dictating how gTLDs should be used are illustrated by the experience with radio spectrum. Historically, the U.S. Federal Communications Commission allocated blocks of spectrum to specific uses — e.g., broadcast radio and television. Over time, the costs associated with misallocation of spectrum under this "command-and-control" regime became very large. The process of reallocating spectrum to higher-valued uses has proven lengthy and difficult. Although the U.S. and other countries have moved toward a more market-based system, the costs of the legacy system are still reflected in the scarcity of spectrum for wireless broadband uses.

Several commentators have expressed concern that closed gTLDs are anticompetitive. No evidence supports this claim. First, we already have experience with generic second-level domain names — e.g., cars.com — which have provided useful services with no apparent anticompetitive effect. There is no reason to expect anything different from a .cars gTLD. If, for example, General Motors (or any other automobile company) were to operate .cars, it is not plausible to suggest it could thereby gain market power in the market for cars. Note also that both operators and ICANN are subject to the U.S. antitrust laws if they use the TLD system in an anticompetitive way. To the extent that ICANN allows synonyms as gTLDs — e.g., "autos", "automobiles", "motorvehicles", perhaps even "goodcars", etc. — the potential competitive problems become even more remote.

In sum, ICANN should provide maximum flexibility for operators to experiment with new business models. This is the best way to promote innovation on the Internet.

Written by Tom Lenard, President, Technology Policy Institute

Follow CircleID on Twitter

More under: ICANN, Top-Level Domains

Categories: Net coverage

Moving the Multistakeholder Model Forward: Thoughts from The WSIS+10 Review Event

CircleID posts - Thu, 2013-03-07 18:27

Ten years ago, global representatives assembled in Geneva in 2003, and again in Tunis, 2005, as part of the two founding phases of the World Summit on the Information Society (WSIS). At the heart of proceedings, attended by representatives from across the spectrum of business, government, civil society, and the Internet technical community, was an acknowledgement that an inclusive approach was needed in all discussions pertaining to Internet governance and policy-making, to overcome the primary challenges in building this 'Information Society.'

In the decade that's followed, we've witnessed marked progress in moving towards this vision of a people-centred and inclusive Internet, and the multistakeholder approach has been the backbone in strengthening WSIS principles and prompting action. But challenges still remain.

Last week, representatives from all the stakeholder groups from around the world converged in Paris for the first WSIS+10 Review event to evaluate progress made since those initial meetings, and to determine fresh approaches to better inform the full WSIS review in 2015. In my mind, the meeting is proof positive that the multistakeholder model for collaborative dialogue and consensus-building is working, and must continue to be adopted going forward.

Inspired by the open debate forum and strong support from diverse participants, the review event culminated in a statement of collaborative consensus on how to further progress and enhance the WSIS process. Indeed, the first item reinforces that multistakeholder processes have become an essential approach in addressing issues affecting the knowledge and information societies, and the statement itself was a testament to the inclusive approach. With anyone from all stakeholder groups able to contribute to its drafting, this process was further evidence of the value that the model can deliver.

As with any form of governance, the multistakeholder approach must address principles such as representation, transparency, and efficiency. Indeed, multistakeholder organizations such as ICANN and multistakeholder platforms such as the Internet Governance Forum (IGF) are constantly seeking to improve their outreach to ensure active and meaningful participation by representatives from all groups. Such representation is essential for this model to be successful, and several of our ICC BASIS business representatives took part in sessions tackling core issues such as multistakeholder principles and enhanced cooperation.

These standing-room-only sessions made clear the interest in advancing the multistakeholder approach, and the review event in its entirety was an excellent example of the multistakeholder approach, and enhanced cooperation among stakeholders, in action. Key topics affecting the future of the information society — from freedom of expression, to the free-flow of information, multilingualism and gender inclusion — were addressed in conversations which will be on-going as we approach this year's Internet Governance Forum. Hosted in Indonesia, this IGF will address issues that have emerged in international discussions, and will seek to create outputs that participants can use at the national and regional levels.

The role and relevance of the business community in this ongoing debate is one which Jean-Guy Carrier, the International Chamber of Commerce's Secretary-General, was keen to underscore in his opening address. He called upon all stakeholders to do more to protect the transparency and openness of the Internet, and highlighted that governments must engage fully with the stakeholder process to develop policies and practices that enable trade and promote economic and social growth.

On this note, and to a round of applause, a representative from Google pointed out that this collaborative model must also extend beyond dialogue, to the advancement of resources and funding for the IGF by all stakeholders. Administered by the United Nations, the IGF is funded through donations from various stakeholders. But just as the discussion of Internet governance issues must be characterised by the multistakeholder approach and equal input, so it is in all stakeholders' interests to provide a bigger commitment to funding the IGF and ensure that Internet governance continues to be debated in a fair, open and inclusive forum in the years to come.

Written by Ayesha Hassan, Senior Policy Manager, Digital Economy, International Chamber of Commerce

Follow CircleID on Twitter

More under: Internet Governance

Categories: Net coverage

Who Runs the Internet? ICANN Attempts to Clarify the Answer With This Map

CircleID news briefs - Wed, 2013-03-06 19:46

ICANN has released a "living" graphic aimed at providing a high-level view of how the internet is run, attuned to those less familiar with the inner workings of the internet infrastructure ecosystem. Quoting from the document:

No One Person, Company, Organization or Government Runs the Internet
The Internet itself is a globally distributed computer network comprised of many voluntarily interconnected autonomous networks. Similarly, its governance is conducted by a decentralized and international multi-stakeholder network of interconnected autonomous groups drawing from civil society, the private sector, governments, the academic and research communities, and national and international organizations. They work cooperatively from their respective roles to create shared policies and standards that maintain the Internet's global interoperability for the public good.

Who Runs the Internet? Graphic designed by ICANN to provide a high-level view.

Follow CircleID on Twitter

More under: ICANN, Internet Governance

Categories: Net coverage

Security and Reliability: A Closer Look at Vulnerability Assessments

CircleID posts - Wed, 2013-03-06 18:01

Building on my last article about Network Assessments, let's take a closer look at vulnerability assessments. (Because entire books have been written on conducting vulnerability assessments, this article is only a high level overview.)

What is a vulnerability assessment?

A vulnerability assessment can be viewed as a methodology for identifying, verifying and ranking vulnerabilities (a "fault" that may be exploited by attackers) in a given system. That system could be a single application or an entire infrastructure, including routers, switches, firewalls, servers/applications, wireless, VoIP (voice over Internet protocol), DNS (domain name system), electronic mail systems, physical security systems, etc. The list of possible elements assessed could be much longer, but you get the idea.

Step One: Reconnaissance

Vulnerability assessments can be conducted with little to no information about the target system (black box) or with full information, including IP addresses, domain names, locations and more (white box). Of course, the less information you have about the system, the more reconnaissance you must do to conduct the assessment. Some of your reconnaissance might need to be done during the assessment itself, which could alter your attack profile.
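
As a simple illustration of the passive side of reconnaissance, the sketch below enumerates the IP addresses and reverse DNS records behind a hostname using only the Python standard library. The target name is a placeholder, and, needless to say, any real assessment should only be run with the system owner's written authorization:

```python
# Minimal passive-reconnaissance sketch: enumerate the IP addresses and
# aliases behind a hostname. "example.com" is a placeholder target.
import socket

target = "example.com"

canonical, aliases, addresses = socket.gethostbyname_ex(target)
print(f"Canonical name: {canonical}")
print(f"Aliases:        {aliases}")
print(f"IPv4 addresses: {addresses}")

# Reverse lookups sometimes reveal hosting providers or related infrastructure.
for ip in addresses:
    try:
        rdns, _, _ = socket.gethostbyaddr(ip)
        print(f"{ip} -> {rdns}")
    except socket.herror:
        print(f"{ip} -> no reverse DNS record")
```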

Step Two: Attack Profile

Once an initial reconnaissance is complete, the next step involves developing an attack profile, which can be most easily compared to a military term: a "firing solution." Essentially, when a target has been identified, it is the adversary's responsibility to consider all the factors and options involved in attacking a target, including stealth, tools and evasion.

An attack profile should at least include the following elements:

  • Determine IP addresses to scan
  • Determine automated tools/scripts/modules to use for discovering vulnerabilities
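
One convenient way to capture the elements above is as a structured plan rather than prose. The sketch below (Python 3.9+) is purely illustrative; the field names, tool choices and target ranges are placeholder assumptions, not a standard template:

```python
# Illustrative attack-profile structure; fields and tool names are examples only.
from dataclasses import dataclass, field

@dataclass
class AttackProfile:
    targets: list[str]                      # IP addresses or CIDR ranges to scan
    tools: dict[str, list[str]] = field(default_factory=dict)  # phase -> tools
    stealth: bool = False                   # throttle scans to evade detection?
    notes: str = ""

profile = AttackProfile(
    targets=["192.0.2.0/24", "198.51.100.10"],   # RFC 5737 documentation ranges
    tools={
        "discovery": ["port scanner", "dns enumeration script"],
        "web": ["vulnerability scanner", "manual parameter tampering"],
    },
    stealth=True,
    notes="White-box engagement; credentials supplied for internal web app.",
)
print(profile)
```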

Step Three: Scans

After developing an attack profile, you must execute your scans using automated tools and manual processes to collect information, enumerate systems/services and identify potential vulnerabilities. As I mentioned, you might need to perform further reconnaissance during the attack, which may alter your profile. Being prepared to — and open to — adapting your profile as you gain additional information is vital during a vulnerability assessment.
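
As a bare-bones illustration of the automated side of this step, the sketch below performs a simple TCP connect scan against a handful of common ports using only the standard library. Real engagements rely on purpose-built scanners, and the host and port list here are placeholders; as always, scan only systems you are authorized to test:

```python
# Minimal TCP connect scan: reports which of a handful of common ports accept
# connections. Host and port list are placeholders for illustration only.
import socket

host = "example.com"
ports = [22, 25, 80, 443, 3306, 8080]

for port in ports:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)                      # keep the scan quick
        result = s.connect_ex((host, port))    # 0 means the connection succeeded
        state = "open" if result == 0 else "closed/filtered"
        print(f"{host}:{port} {state}")
```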

In general, your attack profile should assess the following elements of security (including but not limited to):

  • Authentication/authorization and session management
  • Transport-layer security (SSL, TLS, etc)
  • Susceptibility to Denial of Service (DoS)
  • Web-based Cross-site Scripting/Cross-site Forgery
  • Security misconfiguration (inadequate access controls, firewall rules, etc)
  • Inadequate controls for SQL injection, web-based cookie injection, etc
  • Inadequate input validation for web, database or other applications
  • Remote code execution

In addition to traditional system scanning through automated or manual processes, many assessments also include social engineering scans, such as:

  • Posing via telephone as an employee of the organization to obtain password access to email, VPN (virtual private network), or web-based applications, etc
  • Phishing/spear-phishing attacks to validate corporate security policies and/or malware and anti-virus countermeasures, etc
  • Searching for leaks of credentials or intellectual property through publicly available information such as search engines, social networking sites, etc
  • On-site visits to pose as an employee and gain physical access to facilities, potentially dropping USB-based reconnaissance tools, etc

As you can see, vulnerability assessments can be very narrowly focused on a single system/application — or they can span an entire global infrastructure, including an organization's external and internal systems.

Step Four: Eliminating False Positives

Finally, I'd like to touch upon one of the most important aspects in performing vulnerability assessments: eliminating false-positives and documenting remediation steps for your customers. Automated tools are only as good as the developers that create them. Security engineers must understand the applications, protocols, standards and best practices in addition to understanding when an automated tool is flagging a vulnerability that doesn't actually exist (false-positive). Customers need to be confident that you are reporting real vulnerabilities, and they then need actionable steps for mitigation.
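
Manual verification of a scanner finding can be as simple as reproducing it by hand. The sketch below grabs a service banner so an engineer can confirm, or rule out, a version-based finding before it goes into the report; the host, port and flagged version string are hypothetical:

```python
# Sketch of manually verifying a version-based scanner finding via banner grab.
# Host, port and the "vulnerable" version string are hypothetical examples.
import socket

host, port = "example.com", 22
reported_vulnerable_version = "OpenSSH_5.3"   # what the automated tool flagged

with socket.create_connection((host, port), timeout=3) as s:
    banner = s.recv(256).decode(errors="replace").strip()

print(f"Banner received: {banner}")
if reported_vulnerable_version in banner:
    print("Finding confirmed: vulnerable version string present.")
else:
    print("Possible false positive: banner does not match the reported version.")
```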

Of course, this isn't everything there is to know about vulnerability assessments, but hopefully this article offers up a good snapshot of what's important. Stay tuned for the next article in this series, where we will take a closer look at penetration testing.

See Neustar's Professional Services for additional helpful information and services on vulnerability assessments.

Written by Brett Watson, Senior Manager, Professional Services at Neustar

Follow CircleID on Twitter

More under: Security

Categories: Net coverage

Interview with Avri Doria on the History of Community gTLDs

CircleID posts - Wed, 2013-03-06 02:52

This article is published on CircleID by Jacob Malthouse on behalf of the Community gTLD Applicant Group (CTAG).

Community gTLDs play an interesting and even unique role in the ICANN new gTLD process. They reflect the community-driven nature of the Internet. Indeed, the story of how Community gTLDs came about is a fascinating example of how the bottom-up process can give rise to innovative policy outcomes.

It has been over six years since the community gTLD concept was first discussed. In the mists of time, it's easy to forget the deep foundations upon which this concept is based.

This February, Avri Doria joined The CTAG for a discussion and reflection on the role, history and background of community-based gTLDs. A summary of that discussion follows.

* * *

Q. What is your background?

A. I have spent most of my career working on Internet issues as a technologist. I was attracted to ICANN after participating in the Working Group on Internet Governance. One of the structures we reviewed as part of this work on multi-stakeholder governance was ICANN. The Nominating Committee brought me into the Generic Names Supporting Organisation (GNSO) council. I was on the GNSO council for five years and was chair for two and a half years. This period, coincidentally, covered the time when the new gTLD recommendations were made to ICANN from the GNSO.

Q. How long have you been involved with the Internet Community?

A. I have been involved with the Internet since before there was an Internet community. I worked on protocols starting in the eighties and attended Internet Engineering Task Force (IETF) meetings until recently, when ICANN began to take up my time. I still participate in the IETF on mailing lists and hope to get back there someday. I've been directly involved in various parts of Internet policy for a very long time. My last real job, until the bubble burst, was as Director of Advanced IP Routing at Nortel. Since then I've been doing technical research part-time and research and consulting in Internet governance part-time.

Q. What have your key roles been within the Internet Community?

A. Protocol developer and policy analyst; I work as a systems architect, whether it's protocols or governance structures. These are the things I focus on and that interest me the most; the technical and governance systems of the Internet and how they fit together.

Q. What are you up to now?

A. At the moment I am working part-time for dotGay LLC, on their policies and community information. I also teach Internet governance in a few places, and am doing some research on Delay and Disruption Tolerant Networking (DTN) when I can find the funding; something I have been researching for ten years now.

Q. How did the 'community-based' TLD concept arise?

A. I came to ICANN in 2005, and the 'sponsored' gTLD policy making was over by then. All that was left was a few of the finalizations and the .xxx saga, which of course continued until last year.

In looking at it now, the sponsored gTLD concept was certainly part of the history that we looked at in terms of designing community gTLDs for this program and was part of the whole notion of how the community gTLD concept evolved. Those who had worked on sponsored gTLDs were part of the discussion in developing the current new gTLD recommendations.

The concept was part of the overall discussions that the GNSO was having. We were doing it as a committee of the whole — that is, all of the GNSO council members were involved in the process. There was the notion that we have to defend communities that may want a gTLD. This encompassed preventing gTLDs from being 'grabbed', but it also involved engaging more communities — a broader notion of support — that would help spread awareness further about the possibilities of a community using its name for a gTLD. We almost always put something up against the 'community' test when we were discussing policy (e.g., how would this work in the case of .bank?).

It's very important for the Internet community to go back to the policy recommendations that formed this program*. It's what we are rooted in.

* The ICANN Generic Names Supporting Organisation Final Report on the Introduction of New Generic Top-Level Domains was released on 8 August 2007. It is available for download in two parts here as: Part A and Part B (PDF).

One of the recommendations was about substantial opposition from a significant portion of the community. Implementation guidelines F, H & P explain how one follows and understands the support of the community. What is defined there is a very broad notion of community. It was the recommendation of the GNSO that 'community' should be interpreted broadly and include cultural, economic and other communities. The recommendations are quite specific about what community meant in the ICANN sense. For example, recommendation H — community contention for a name — calls out the guidelines and definitions and P explains how the objection and application and evaluation all use the same notion of community that is explicit in the recommendations.

Indeed, one of the things we learned from the sponsored gTLD round is that we needed to be a little broader in our definition of community. That is reflected in the GNSO's report.

Q. Who was involved in that process?

A. I think everyone — there were people who brought it up in terms of the .bank notion, it was one of the favourite examples. The GNSO looked at banking and the need to protect it from being used in an abusive manner.

Another example that was often raised was .Cherokee (e.g., a minority that is also a brand). I used to have some involvement in theatre, so I looked at the cost of a .Cherokee application for Ford Motor Company: less than the catering lunch on a commercial shoot. We knew that brands were going to come in and we talked about Ford grabbing Cherokee to put on the sides of buses, so we wanted to protect the Cherokee nation. We also looked at the example of Lakota as a community that isn't associated with a particular brand.

We engaged a range of voices from people who thought community gTLDs were good, to people who thought community gTLDs were bad, to people who thought that free speech would be a victim of community objections. Everyone engaged in the discussion and many stakeholders had different views. Eventually we came to ICANN consensus on encouraging and protecting communities.

Q. Were there any challenges in developing the concept?

A. Yes, there was a whole range of issues. We came up with questions such as, "If it's a community but I have a trademark on it, then who has rights?" Potentially both of them could preclude the other through an objection process. If you have a community and trademark you can try to stop a non-community bid through the objection process. There was general acceptance — rough consensus — that we could never create explicit lists of things. Any kind of controls had to be objection based. The world is too big and broad for policy to say "this is the list" that's why objections figure in everywhere as an alternative to lists. Everyone should be able to see the names that have been applied for and objected to. Our view was: no explicit lists and no expanding the reserved names list. That was explicit — if anything, people wanted to remove names from the reserved list.

Another challenge we faced was — what's the final step? Auction or lottery or ICANN Board evaluation? We were basically split on this issue and we all sent our opinions to the Board. Auction as a contention resolution solution was stronger with those who have deep pockets or who believe the market can solve all problems. In the end we left it to the Board and ICANN on how to resolve contention after community priority and such. We did spend a few years talking about it and getting feedback, so it was a thoughtful process, but we could not reach consensus.

We also went through a very strict process with ICANN staff, going through several readings and going through exercises with them. We would say "we think we would do it this way" and staff would respond with comments. It was a really interactive process in terms of coming up with our recommendations, and it took a year or two longer than many would have liked.

When I got to the GNSO, we started working on this. I was chair when we approved the recommendations and I'm now chairing the At-large group that is reviewing the process, so it's been quite an undertaking.

Q. What is your impression of the community applications?

A. I was disappointed in the number of community applications overall. I was hoping for more from around the world, especially from cultural and linguistic communities. I see this as part of the whole issue with the expense and style of outreach of the application round to other parts of the world. For example, we got assistance for applicants so late in the process that there was no time for outreach. By the time we told them about the money they would have been working on it with the assumption that there is no money. Or rather, they would have long since given up the idea because there was no money.

The At-large working group is looking at the failure of that pre-application round outreach and also at the failure on the community applicant front.

ICANN didn't make it easy. There was no specific outreach to communities. Many of us in the GNSO thought it would be a good thing but some in ICANN think communities are a bad thing — that people are cheating when they claim to be a community — but if you read the guidelines they were meant to be broad.

You are community because you say you are — and you only need to prove it if challenged by a standard application and/or an objection. If you apply for a standard string, you are implicitly saying there is no community that needs to be served. A standard application for a community's name is the same as telling a community that they do not exist, or at least that they do not matter. The other way to attack communities is the direct objection.

Q. Would you make any changes for the next round?

A. Review is one of the requirements of the program. We knew we could not figure everything out a priori, that the process was going to teach us. I certainly believe that we have to have a remedial round to pick up communities from other parts of the world.

We failed on community, on diversity, on international scope. Most of the IDN applications are the same old incumbents just getting an IDN. That failed and needs to be fixed.

Regarding the community test, I am of two minds. I think the testing idea is good, but I think the Community Priority Evaluation test is flawed. ICANN has improved its way of testing since then. For example, the qualification test it created for the outreach aid was richer — you might be this or that kind of community, the test had different ways to meet the threshold. It was still points based, but the way you built up your points to reach the required threshold was not quite as punitive.

As it stands, communities have to prove it the hard way under trial by ordeal, rather than starting from a notion of trust. It's "You're gaming, how do we prove you are innocent by putting you through an ordeal?" We don't need to wait for things to go further to think that this emphasis is wrong. This notion that communities are not to be trusted isn't right.

It's the "hard cases make bad law" syndrome — we can find methods of catching gamers without ordeals — the questions for the applicant support program were more nuanced and included clauses to catch gamers, so its encouraging to see that some learning has been done already.

I believe one of the primary reasons for this gTLD round was communities, around the world, cultural, linguistic, etc. This program has failed at that. We will certainly learn from this program how to allow for more categories of gTLDs — more people wanted this. What categories have been developed, what's special about them? Brand, community, geographic, etc., not all are community but many of them touch on the concept in a lot of places.

Q. What are you most looking forward to in watching these new community gTLDs?

A. 1,000 flowers blooming. I'm looking for many different kinds of communities finding cool ways to express their identities and creating a safe, useful and meaningful environment for themselves. Each one of them should somehow develop differently, following their own logic for the kind of community it is. I'm hoping that the communities manage to make it through the ICANN community priority gauntlet.

We will see how communities develop these things. The .xxx and IFFOR (International Foundation for Online Responsibility) process is just a start for what we'll learn as we watch communities try to create a self-regulated environment for their stakeholder groups. Some of those new community gTLDs should be the most beautiful blooms.

Q. What risks do you see for community gTLDs?

A. Many of them have done hand waving about how they'll really be able to implement and enforce their commitments. It won't be clear how many of them are really doing their policy work until they've won. How can they give metrics to ICANN to serve the process in a bottom-up manner? What are the 'success' metrics for community-based gTLDs?

Living up to ICANN commitments and expectations about how a community gTLD should function will be harder than most are imagining. Living up to evolving community requirements will be even more so.

Written by Jacob Malthouse, Co-founder at Big Room Inc.

Follow CircleID on Twitter

More under: ICANN, Top-Level Domains

Categories: Net coverage

Closed Generics Debate Rages On

CircleID posts - Tue, 2013-03-05 18:25

The new gTLD program continues to throw up last-minute debates on what is acceptable as a TLD and what is not.

The latest such verbal joust centers around closed generics. These are generic terms being applied for by applicants who, should they be successful, will not open the TLD up to everyone on an equal access basis.

As an example, think .book being run by Amazon and only available to Amazon customers.

In order to understand the arguments for and against closed generics, ICANN has opened a public comment period. That period ends on March 7 and ICANN has so far received 80 emails/opinions on the matter.

Closed generics are striking fear into some people's hearts mainly because of Google and Amazon's bids to secure generic terms like "cloud", "book" or "blog". No one had ever expected the two Net giants to take such an interest in the new gTLD program in the first place, let alone show the foresight they have displayed in going for a bevy of generic terms. Many of those operating in the industries those terms describe have been taken by complete surprise.

To them, having a generic term managed according to one entity's rules is heresy. As an example, consider comments drafted by the Federation of European Publishers (FEP) on March 4. "Similarly to the case of trade marks (where generic terms may not be registered), reserving the use of generic terms as gTLDs for individual companies is not desirable," says the FEP, which represents 28 national publishers associations in Europe. "From the point of view of consumer choice, locating a class of goods and a choice of suppliers with the help of the gTLD is by far preferable to its leading to a single producer or retailer."

"At the very least, the winning applicant (for .book) must be obliged to make the gTLD available without discrimination for registrations by all eligible parties, including all commercial entities within the book industry," the FEP goes on to say in its statement which was handed to Nigel Hickson, ICANN VP of Stakeholder Engagement for Europe on March 4, and also posted as a reply to ICANN's call for public comment.

But not everyone is against closed generics. "ICANN should not be dictating business models," wrote a selection of members of ICANN's Non Commercial Stakeholder Group (NCSG) also on March 4. "There should be no intervention until and unless there is a well-documented problem related to monopoly power."

Once the current comment period has closed, ICANN staff will analyse the comments and provide a summary to the ICANN Board's New gTLD Program Committee. It will be up to this committee to determine whether closed generics should be shut down.

Written by Stéphane Van Gelder, Chairman, STEPHANE VAN GELDER CONSULTING

Follow CircleID on Twitter

More under: Domain Names, ICANN, Internet Governance, Top-Level Domains

Categories: Net coverage