CircleID posts


Russia Restricts U.S. Fiscal Sovereignty Using an ITU Treaty?

Thu, 2013-03-07 19:57

It seems outlandish. However, as incredible as it may seem — especially in these times of sequestration and dire Federal budget cuts — the U.S. has potentially fallen prey to a ploy hatched by Russia and its allies, artfully carried out at a 2010 ITU treaty conference, that relinquishes the nation's sovereign right to choose its own ITU membership contributions. Here is how it happened and what can be done about it.

In December, as just about everyone knows, Russia and its allies used the WCIT, a treaty conference of the U.N. specialized telecommunications agency known as the ITU, to attempt to impose far-reaching new regulatory controls on the Internet and just about everything else relating to new media and content. The gambit largely fell apart as the U.S. and 54 other nations walked out of the meeting.

Over the ensuing months since the WCIT, the mischief has continued, if not worsened. Almost every country except a handful has shunned the ITU's continuing telecommunications standards meetings. No one needs an intergovernmental body playing U.N. politics and cluelessly meddling in technical standards development that industry does elsewhere through numerous private-sector organizations and arrangements in a highly competitive global marketplace. The reality, however, is that these problems with the ITU-T had been growing steadily worse over the past decade. They have simply become much greater since the WCIT.

As a result of these continuing machinations, not surprisingly, some people in Washington are contemplating joining many of the other nations that walked out of the December treaty conference in scaling back the money they voluntarily elect to give to the ITU as membership fees. Even though each of the 193 nations in the ITU gets an equal vote on everything, their financial contributions range from a high of 30 Contributory Units to a low of 1/16 of a Unit. Indeed, 61% of ITU member nations pay a ¼ Unit or less, yet each of their votes equals that of the U.S. in the ITU. Nearly 25% of ITU Member States are only at the 1/16 level, and even then many are in arrears. The U.S. and Japan are the only two countries that, many years ago when the ITU was worth something, elected to give the greatest amount of money — 30 Contributory Units, currently equating to about ten million dollars a year.

Over the past six years, an increasing number of nations became fed up with the negative value proposition of the ITU and its leadership, and began significantly cutting back their funding of a broken organization. Twenty-nine countries have reduced their contributions. Some of the reductions were especially embarrassing. France — which gave birth to the organization and was its strongest supporter for 145 years — dropped 17%. Germany — which was a major contributor to its technical work for decades — also dropped 17%. Switzerland — which has been the host country for the organization since its inception — dropped back 33%! The UK — which like France and Germany played leadership roles in the organization for decades — also dropped 33%. The list goes on and on: Australia (-13%), Belgium (-20%), Italy (-25%), the Netherlands (-38%), Sweden (-38%), Finland (-40%), Hungary (-50%), Denmark (-60%), Lithuania (-75%), and more. Essentially all of these nations also save considerable money by not sending government delegations to the meetings — something the U.S. should consider emulating.

So about two years ago, in late 2010, at the ITU's quadrennial treaty conference to review its Constitution and elect its leadership, Russia and a number of other countries that contribute minor amounts to the ITU's budget but engage in most of its mischief quietly pulled a "fast one." Using the U.N. one-country, one-vote process, they managed to amend the ITU's Constitution so that any nation paying the larger contributions could not reduce its funding "by more than 15 per cent of the number of units chosen by the Member State for the period preceding the reduction, rounding down to the nearest lower number of units." It is known as the ITU Constitution Paragraph 165 Amendment, and apparently most people at the conference didn't even know it was adopted.

As a result of the Amendment, if the U.S., which currently pays the maximum of 30 Units, wants to cut that amount in half, it would have to do so over the next five ITU Plenipotentiary Conferences — a minimum period of 20 years! Thus, in these times of fiscal crisis in the United States, if the nation wants to significantly cut back funding of a broken U.N. body, as most allies have already done, it is constrained by the ITU Paragraph 165 Amendment. A nation that fails to take a reservation to the Amendment ends up signing away its fiscal sovereignty. Rather embarrassingly, this sovereignty straitjacket on the U.S. was put in place by the very nations working against U.S. interests in the organization and paying far less.

Fortunately, when the U.S. signed the treaty instrument that adopted this misbegotten amendment in November 2010, although it failed to take a reservation to the Paragraph 165 Amendment, it did state a right to make further declarations to limit its obligations. In addition, it appears that the ratification process has not yet gotten underway and that the matter still resides in the Legal Advisor's office of the State Department. So what needs to occur is for the U.S. to institute an additional declaration rejecting the Paragraph 165 Amendment. One would think the White House, if not Congress, might want to add this matter to the U.S. budget agenda and act quickly.

An additional desirable step would be for the U.S. to then join the long list of other nations in significantly cutting back their contributions — say, joining the UK and Switzerland in dropping back a third, from 30 to 20 Units. The ITU's small group of lobbyists and apologists in Washington — who love travel to Ville-Genève for the mind-numbing ITU meetings — will suffer angst over the reduction. However, this action is the measured response that many U.S. allies have already taken. A one-third reduction continues to recognize the residual value of the ITU's Radio Sector. It is also the right response for a nation seriously attempting to deal with its mounting debt, after the many billions already eliminated for U.S. domestic needs that are far greater than nonsensical U.N. activity that just about everyone agrees is not needed and is adverse to almost everyone's interests.

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC

More under: Internet Governance, Policy & Regulation

"Open" or "Closed" Generic TLDs: Let the Operators Decide

Thu, 2013-03-07 19:00

(The following is an edited version of comments I submitted to ICANN regarding "closed" generic TLDs.)

On February 5th, ICANN solicited comments on whether ICANN should determine the circumstances under which a generic TLD operator is permitted to adopt "open" or "closed" registration policies. Allowing gTLD operators to make these determinations, as opposed to ICANN, will promote innovation on the Internet to the benefit of consumers.

In order to bring the benefits of a competitive TLD market to consumers, ICANN should generally take as light-handed a regulatory stance as possible, as long as it meets its technical responsibilities. A light-handed regulatory approach is consistent with the policy of relatively open entry into the TLD space that ICANN has adopted.

A benefit of the new gTLD program, in addition to providing competition to incumbents, is the ability of the entrants to develop new business models, products and services. Historically, gTLDs have been open, and arguably that openness benefited the growth of the Internet. But at this stage of the Internet's development, adding new options to the status quo is more likely to unleash new forms of innovation. Closed gTLDs may be a promising source of innovations that have not thus far been possible to implement or even envision. Closed gTLDs may, for example, be a way to provide services with enhanced security. No one can know what innovations might be blocked if ICANN generally required gTLDs to be open. In short, adding new open gTLDs is likely to create benefits, but the returns to adding completely new types of gTLDs are potentially much larger.

New gTLDs are valuable economic assets. ICANN should adopt policies that assure that these assets are allocated to their most highly valued uses. ICANN's decision to use an auction when there are multiple applicants for the same gTLD will further that goal. The bidder who believes its business model will be the most profitable will win the auction and the right to operate the gTLD. When there is only a single applicant, that applicant presumably represents the highest-valued use of the gTLD.

The best use of a gTLD can change (e.g., from closed to open) if the initial business model isn't successful or if economic conditions change. This change can be effected either by the current operator or by a transfer of the gTLD to a new operator, subject to ICANN's review. In this way, gTLDs can continue to move to their highest-valued uses over time.

The dangers of ICANN dictating how gTLDs should be used are illustrated by the experience with radio spectrum. Historically, the U.S. Federal Communications Commission allocated blocks of spectrum to specific uses — e.g., broadcast radio and television. Over time, the costs associated with misallocation of spectrum under this "command-and-control" regime became very large. The process of reallocating spectrum to higher-valued uses has proven lengthy and difficult. Although the U.S. and other countries have moved toward a more market-based system, the costs of the legacy system are still reflected in the scarcity of spectrum for wireless broadband uses.

Several commentators have expressed concern that closed gTLDs are anticompetitive. No evidence supports this claim. First, we already have experience with generic second-level domain names — e.g., cars.com — which have provided useful services with no apparent anticompetitive effect. There is no reason to expect anything different from a .cars gTLD. If, for example, General Motors (or any other automobile company) were to operate .cars, it is not plausible to suggest it could thereby gain market power in the market for cars. Note also that both operators and ICANN are subject to the U.S. antitrust laws if they use the TLD system in an anticompetitive way. To the extent that ICANN allows synonyms as gTLDs — e.g., "autos", "automobiles", "motorvehicles", perhaps even "goodcars", etc. — the potential competitive problems become even more remote.

In sum, ICANN should provide maximum flexibility for operators to experiment with new business models. This is the best way to promote innovation on the Internet.

Written by Tom Lenard, President, Technology Policy Institute

More under: ICANN, Top-Level Domains

Moving the Multistakeholder Model Forward: Thoughts from The WSIS+10 Review Event

Thu, 2013-03-07 18:27

Ten years ago, global representatives assembled in Geneva in 2003, and again in Tunis in 2005, for the two founding phases of the World Summit on the Information Society (WSIS). At the heart of the proceedings, attended by representatives from across the spectrum of business, government, civil society, and the Internet technical community, was an acknowledgement that an inclusive approach was needed in all discussions pertaining to Internet governance and policy-making in order to overcome the primary challenges in building this 'Information Society.'

In the decade that's followed, we've witnessed marked progress in moving towards this vision of a people-centred and inclusive Internet, and the multistakeholder approach has been the backbone in strengthening WSIS principles and prompting action. But challenges still remain.

Last week, representatives from all the stakeholder groups from around the world converged in Paris for the first WSIS+10 Review event to evaluate progress made since those initial meetings, and to determine fresh approaches to better inform the full WSIS review in 2015. In my mind, the meeting is proof positive that the multistakeholder model for collaborative dialogue and consensus-building is working, and must continue to be adopted going forward.

Inspired by the open debate forum and strong support from diverse participants, the review event culminated in a statement of collaborative consensus on how to further progress and enhance the WSIS process. Indeed, the first item reinforces that multistakeholder processes have become an essential approach in addressing issues affecting the knowledge and information societies, and the statement itself was a testament to the inclusive approach. With participants from all stakeholder groups able to contribute to its drafting, the process offered further evidence of the value the model can deliver.

As with any form of governance, the multistakeholder approach must address principles such as representation, transparency, and efficiency. Indeed, multistakeholder organizations such as ICANN and multistakeholder platforms such as the Internet Governance Forum (IGF) are constantly seeking to improve their outreach to ensure active and meaningful participation by representatives from all groups. Such representation is essential for this model to be successful, and several of our ICC BASIS business representatives took part in sessions tackling core issues such as multistakeholder principles and enhanced cooperation.

These standing-room-only sessions made clear the interest in advancing the multistakeholder approach, and the review event in its entirety was an excellent example of the multistakeholder approach, and enhanced cooperation among stakeholders, in action. Key topics affecting the future of the information society — from freedom of expression, to the free-flow of information, multilingualism and gender inclusion — were addressed in conversations which will be on-going as we approach this year's Internet Governance Forum. Hosted in Indonesia, this IGF will address issues that have emerged in international discussions, and will seek to create outputs that participants can use at the national and regional levels.

The role and relevance of the business community in this ongoing debate is one which Jean-Guy Carrier, the International Chamber of Commerce's Secretary-General, was keen to underscore in his opening address. He called upon all stakeholders to do more to protect the transparency and openness of the Internet, and highlighted that governments must engage fully with the stakeholder process to develop policies and practices that enable trade and promote economic and social growth.

On this note, and to a round of applause, a representative from Google pointed out that this collaborative model must also extend beyond dialogue, to the advancement of resources and funding for the IGF by all stakeholders. Administered by the United Nations, the IGF is funded through donations from various stakeholders. But just as the discussion of Internet governance issues must be characterised by the multistakeholder approach and equal input, so it is in all stakeholders' interests to provide a bigger commitment to funding the IGF and ensure that Internet governance continues to be debated in a fair, open and inclusive forum in the years to come.

Written by Ayesha Hassan, Senior Policy Manager, Digital Economy, International Chamber of Commerce

More under: Internet Governance

Who Runs the Internet? ICANN Attempts to Clarify the Answer With This Map

Wed, 2013-03-06 19:46

ICANN has released a "living" graphic intended to provide a high-level view of how the Internet is run, aimed at those less familiar with the inner workings of the Internet infrastructure ecosystem. Quoting from the document:

No One Person, Company, Organization or Government Runs the Internet
The Internet itself is a globally distributed computer network comprised of many voluntarily interconnected autonomous networks. Similarly, its governance is conducted by a decentralized and international multi-stakeholder network of interconnected autonomous groups drawing from civil society, the private sector, governments, the academic and research communities, and national and international organizations. They work cooperatively from their respective roles to create shared policies and standards that maintain the Internet's global interoperability for the public good.

Who Runs the Internet? Graphic designed to provide a high-level view, from ICANN.

More under: ICANN, Internet Governance

Security and Reliability: A Closer Look at Vulnerability Assessments

Wed, 2013-03-06 18:01

Building on my last article about Network Assessments, let's take a closer look at vulnerability assessments. (Because entire books have been written on conducting vulnerability assessments, this article is only a high-level overview.)

What is a vulnerability assessment?

A vulnerability assessment can be viewed as a methodology for identifying, verifying and ranking vulnerabilities (faults that may be exploited by attackers) in a given system. That system could be a single application or an entire infrastructure, including routers, switches, firewalls, servers/applications, wireless, VoIP (voice over Internet protocol), DNS (domain name system), electronic mail systems, physical security systems, etc. The list of possible elements assessed could be much longer, but you get the idea.

Step One: Reconnaissance

Vulnerability assessments can be conducted with little to no information about the target system (black box) or with full information, including IP addresses, domain names, locations and more (white box). Of course, the less information you have about the system, the more reconnaissance you must do to conduct the assessment. Some of your reconnaissance might need to be done during the assessment itself, which could alter your attack profile.
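As a minimal illustration of the black-box case, the sketch below probes candidate subdomains with plain DNS lookups. It is a toy under stated assumptions: the target domain and wordlist are hypothetical placeholders, real engagements use far larger lists and purpose-built tools, and any probing must stay within the authorized scope of the engagement.

    # A black-box reconnaissance sketch: probe candidate subdomains via DNS.
    # The domain and wordlist are hypothetical placeholders.
    import socket

    TARGET_DOMAIN = "example.com"
    CANDIDATES = ["www", "mail", "vpn", "dev", "staging"]

    def enumerate_subdomains(domain, candidates):
        """Return a dict of resolvable subdomains and their IP addresses."""
        found = {}
        for name in candidates:
            host = "%s.%s" % (name, domain)
            try:
                found[host] = socket.gethostbyname(host)  # simple A-record lookup
            except socket.gaierror:
                pass  # name does not resolve; not part of the visible attack surface
        return found

    for host, ip in sorted(enumerate_subdomains(TARGET_DOMAIN, CANDIDATES).items()):
        print(host, "->", ip)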

Step Two: Attack Profile

Once an initial reconnaissance is complete, the next step involves developing an attack profile, which can be most easily compared to a military term: a "firing solution." Essentially, when a target has been identified, it is the adversary's responsibility to consider all the factors and options involved in attacking a target, including stealth, tools and evasion.

An attack profile should at least include the following elements:

  • Determine IP addresses to scan
  • Determine automated tools/scripts/modules to use for discovering vulnerabilities
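To make the idea concrete, here is one minimal way an attack profile could be captured as a structured record. This is a sketch only: the field names are invented for illustration, and the target addresses come from the RFC 5737 documentation range rather than any real network.

    # A hypothetical attack-profile record; field names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class AttackProfile:
        targets: list          # IP addresses/CIDR blocks determined during reconnaissance
        tools: list            # automated tools/scripts chosen for discovery
        stealth: bool = False  # whether to pace and randomize probes (evasion)
        notes: str = ""        # engagement constraints agreed with the customer

    profile = AttackProfile(
        targets=["192.0.2.0/24"],  # RFC 5737 documentation range, not a real network
        tools=["port scanner", "web scanner", "dns enumeration"],
        stealth=True,
        notes="External black-box assessment; no denial-of-service testing.",
    )
    print(profile)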

Step Three: Scans

After developing an attack profile, you must execute your scans using automated tools and manual processes to collect information, enumerate systems/services and identify potential vulnerabilities. As I mentioned, you might need to perform further reconnaissance during the attack, which may alter your profile. Being prepared to — and open to — adapting your profile as you gain additional information is vital during a vulnerability assessment.
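For a feel of what the automated side of this step does at its simplest, here is a toy TCP connect scan in Python. The host and port list are placeholder assumptions; production assessments rely on purpose-built scanners, and you should only ever scan systems you are authorized to test.

    # A minimal TCP connect scan: report which ports accept a connection.
    import socket

    def tcp_connect_scan(host, ports, timeout=1.0):
        """Return the subset of ports that accept a TCP connection."""
        open_ports = []
        for port in ports:
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    open_ports.append(port)  # connection succeeded: port is open
            except OSError:
                pass  # closed, filtered, or unreachable
        return open_ports

    print(tcp_connect_scan("192.0.2.10", [22, 25, 80, 443, 3306]))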

In general, your attack profile should assess the following elements of security (including but not limited to):

  • Authentication/authorization and session management
  • Transport-layer security (SSL, TLS, etc)
  • Susceptibility to Denial of Service (DoS)
  • Web-based cross-site scripting/cross-site request forgery
  • Security misconfiguration (inadequate access controls, firewall rules, etc)
  • Inadequate controls for SQL injection, web-based cookie injection, etc
  • Inadequate input validation for web, database or other applications
  • Remote code execution

In addition to traditional system scanning through automated or manual processes, many assessments also include social engineering scans, such as:

  • Posing via telephone as an employee of the organization to obtain password access to email, VPN (virtual private network), or web-based applications, etc
  • Phishing/spear-phishing attacks to validate corporate security policies and/or malware and anti-virus countermeasures, etc
  • Searching for leaks of credentials or intellectual property through publicly available information such as search engines, social networking sites, etc
  • On-site visits to pose as an employee and gain physical access to facilities, potentially dropping USB-based reconnaissance tools, etc

As you can see, vulnerability assessments can be very narrowly focused on a single system/application — or they can span an entire global infrastructure, including an organization's external and internal systems.

Step Four: Eliminating False Positives

Finally, I'd like to touch upon one of the most important aspects of performing vulnerability assessments: eliminating false positives and documenting remediation steps for your customers. Automated tools are only as good as the developers that create them. Security engineers must understand the applications, protocols, standards and best practices involved, and must recognize when an automated tool is flagging a vulnerability that doesn't actually exist (a false positive). Customers need to be confident that you are reporting real vulnerabilities, and they then need actionable steps for mitigation.
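As one small, hedged example of that verification work, a scanner's "open port" finding can be re-checked directly before it reaches the report. The host, port, and banner handling below are illustrative assumptions, not a general-purpose verifier.

    # Re-verify a reported "open port" finding by connecting directly and
    # grabbing any service banner the listener sends.
    import socket

    def verify_open_port(host, port, timeout=2.0):
        """Return the banner if the port is really open, or None if it is not."""
        try:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                sock.settimeout(timeout)
                try:
                    return sock.recv(128).decode(errors="replace").strip()
                except socket.timeout:
                    return ""  # port is open, but the service sent no banner
        except OSError:
            return None  # could not connect: the finding looks like a false positive

    banner = verify_open_port("192.0.2.10", 25)
    if banner is None:
        print("false positive: port not reachable")
    else:
        print("confirmed open; banner:", repr(banner))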

Of course, this isn't everything there is to know about vulnerability assessments, but hopefully this article offers up a good snapshot of what's important. Stay tuned for the next article in this series, where we will take a closer look at penetration testing.

See Neustar's Professional Services for additional helpful information and services on vulnerability assessments.

Written by Brett Watson, Senior Manager, Professional Services at Neustar

More under: Security

Interview with Avri Doria on the History of Community gTLDs

Wed, 2013-03-06 02:52

This article is published on CircleID by Jacob Malthouse on behalf of the Community gTLD Applicant Group (CTAG).

Community gTLDs play an interesting and even unique role in the ICANN new gTLD process. They reflect the community-driven nature of the Internet. Indeed, the story of how Community gTLDs came about is a fascinating example of how a bottom-up process can give rise to innovative policy outcomes.

It has been over six years since the community gTLD concept was first discussed. In the mists of time, it's easy to forget the deep foundations upon which this concept is based.

This February, Avri Doria joined The CTAG for a discussion and reflection on the role, history and background of community-based gTLDs. A summary of that discussion follows.

* * *

Q. What is your background?

A. I have spent most of my career working on Internet issues as a technologist. I was attracted to ICANN after participating in the Working Group on Internet Governance. One of the structures we reviewed as part of this work on multi-stakeholder governance was ICANN. The Nominating Committee brought me into the Generic Names Supporting Organisation (GNSO) council. I was on the GNSO council for five years and was chair for two and a half years. This period, coincidentally, covered the time when the new gTLD recommendations were made to ICANN from the GNSO.

Q. How long have you been involved with the Internet Community?

A. I have been involved with the Internet since before there was an Internet community. I worked on protocols starting in the eighties and attended Internet Engineering Task Force (IETF) meetings until recently, when ICANN began to take up my time. I still participate in the IETF on mailing lists and hope to get back there someday. I've been directly involved in various parts of Internet policy for a very long time. My last real job, until the bubble burst, was as Director of Advanced IP Routing at Nortel. Since then I've been doing technical research part-time, and research and consulting in Internet governance part-time.

Q. What have your key roles been within the Internet Community?

A. Protocol developer and policy analyst; I work as a systems architect, whether it's protocols or governance structures. These are the things I focus on and that interest me the most: the technical and governance systems of the Internet and how they fit together.

Q. What are you up to now?

A. At the moment I am working part-time for dotGay LLC, on their policies and community information. I also teach Internet governance in a few places, and am doing some research on Delay and Disruption Tolerant Networking (DTN) when I can find the funding; something I have been researching for ten years now.

Q. How did the 'community-based' TLD concept arise?

A. I came to ICANN in 2005, and the 'sponsored' gTLD policy making was over by then. All that was left were a few finalizations and the .xxx saga, which of course continued until last year.

In looking at it now, the sponsored gTLD concept was certainly part of the history that we looked at in terms of designing community gTLDs for this program and was part of the whole notion of how the community gTLD concept evolved. Those who had worked on sponsored gTLDs were part of the discussion in developing the current new gTLD recommendations.

The concept was part of the overall discussions that the GNSO was having. We were doing it as a committee of the whole — that is, all of the GNSO council members were involved in the process. There was the notion that we have to defend communities that may want a gTLD. This encompassed preventing gTLDs from being 'grabbed', but it also involved engaging more communities — a broader notion of support — that would help spread awareness further about the possibilities of a community using its name for a gTLD. We almost always put something up against the 'community' test when we were discussing policy (e.g., how would this work in the case of .bank?).

It's very important for the Internet community to go back to the policy recommendations that formed this program*. It's what we are rooted in.

* The ICANN Generic Names Supporting Organisation Final Report on the Introduction of New Generic Top-Level Domains was released on 8 August 2007. It is available for download in two parts here as: Part A and Part B (PDF).

One of the recommendations was about substantial opposition from a significant portion of the community. Implementation guidelines F, H & P explain how one follows and understands the support of the community. What is defined there is a very broad notion of community. It was the recommendation of the GNSO that 'community' should be interpreted broadly and include cultural, economic and other communities. The recommendations are quite specific about what community meant in the ICANN sense. For example, recommendation H — community contention for a name — calls out the guidelines and definitions and P explains how the objection and application and evaluation all use the same notion of community that is explicit in the recommendations.

Indeed, one of the things we learned from the sponsored gTLD round is that we needed to be a little broader in our definition of community. That is reflected in the GNSO's report.

Q. Who was involved in that process?

A. I think everyone — there were people who brought it up in terms of the .bank notion, it was one of the favourite examples. The GNSO looked at banking and the need to protect it from being used in an abusive manner.

Another example that was often raised was .Cherokee (e.g., a minority that is also a brand). I used to have some involvement in theatre, so I reckoned that a .Cherokee application would cost Ford Motor Company less than the catering lunch on a commercial shoot. We knew that brands were going to come in, and we talked about Ford grabbing Cherokee to put on the sides of buses, so we wanted to protect the Cherokee nation. We also looked at the example of Lakota as a community that isn't associated with a particular brand.

We engaged a range of voices from people who thought community gTLDs were good, to people who thought community gTLDs were bad, to people who thought that free speech would be a victim of community objections. Everyone engaged in the discussion and many stakeholders had different views. Eventually we came to ICANN consensus on encouraging and protecting communities.

Q. Were there any challenges in developing the concept?

A. Yes, there was a whole range of issues. We came up with questions such as, "If it's a community but I have a trademark on it, then who has rights?" Potentially both of them could preclude the other through an objection process. If you have a community and trademark you can try to stop a non-community bid through the objection process. There was general acceptance — rough consensus — that we could never create explicit lists of things. Any kind of controls had to be objection based. The world is too big and broad for policy to say "this is the list" that's why objections figure in everywhere as an alternative to lists. Everyone should be able to see the names that have been applied for and objected to. Our view was: no explicit lists and no expanding the reserved names list. That was explicit — if anything, people wanted to remove names from the reserved list.

Another challenge we faced was — what's the final step? Auction or lottery or ICANN Board evaluation? We were basically split on this issue and we all sent our opinions to the Board. Auction as a contention resolution solution was stronger with those who have deep pockets or who believe the market can solve all problems. In the end we left it to the Board and ICANN on how to resolve contention after community priority and such. We did spend a few years talking about it and getting feedback, so it was a thoughtful process, but we could not reach consensus.

We also went through a very strict process with ICANN staff, going through several readings and going through exercises with them. We would say "we think we would do it this way" and staff would respond with comments. It was a really interactive process in terms of coming up with our recommendations, and it took a year or two longer than many would have liked.

When I got to the GNSO, we started working on this. I was chair when we approved the recommendations and I'm now chairing the At-large group that is reviewing the process, so it's been quite an undertaking.

Q. What is your impression of the community applications?

A. I was disappointed in the number of community applications overall. I was hoping for more from around the world, especially from cultural and linguistic communities. I see this as part of the whole issue with the expense and style of outreach of the application round to other parts of the world. For example, we got assistance for applicants so late in the process that there was no time for outreach. By the time we told them about the money, they would already have been working on the assumption that there was no money. Or rather, they would have long since given up the idea because there was no money.

The At-large working group is looking at the failure of that pre-application round outreach and also at the failure on the community applicant front.

ICANN didn't make it easy. There was no specific outreach to communities. Many of us in the GNSO thought it would be a good thing but some in ICANN think communities are a bad thing — that people are cheating when they claim to be a community — but if you read the guidelines they were meant to be broad.

You are community because you say you are — and you only need to prove it if challenged by a standard application and/or an objection. If you apply for a standard string, you are implicitly saying there is no community that needs to be served. A standard application for a community's name is the same as telling a community that they do not exist, or at least that they do not matter. The other way to attack communities is the direct objection.

Q. Would you make any changes for the next round?

A. Review is one of the requirements of the program. We knew we could not figure everything out a priori, and that the process was going to teach us. I certainly believe that we have to have a remedial round to pick up communities from other parts of the world.

We failed on community, on diversity, on international scope. Most of the IDN applications are the same old incumbents just getting an IDN. That failed and needs to be fixed.

Regarding the community test, I am of two minds. I think the testing idea is good, but I think the Community Priority Evaluation test is flawed. ICANN has improved its way of testing since then. For example, the qualification test it created for the outreach aid was richer — you might be this or that kind of community, the test had different ways to meet the threshold. It was still points based, but the way you built up your points to reach the required threshold was not quite as punitive.

As it stands, communities have to prove it the hard way under trial by ordeal, rather than starting from a notion of trust. It's "You're gaming; how do we prove you are innocent by putting you through an ordeal?" We don't need to wait for things to go further to see that this emphasis is wrong. This notion that communities are not to be trusted isn't right.

It's the "hard cases make bad law" syndrome — we can find methods of catching gamers without ordeals — the questions for the applicant support program were more nuanced and included clauses to catch gamers, so its encouraging to see that some learning has been done already.

I believe one of the primary reasons for this gTLD round was communities, around the world, cultural, linguistic, etc. This program has failed at that. We will certainly learn from this program how to allow for more categories of gTLDs — more people wanted this. What categories have been developed, what's special about them? Brand, community, geographic, etc., not all are community but many of them touch on the concept in a lot of places.

Q. What are you most looking forward to in watching these new community gTLDs?

A. 1,000 flowers blooming. I'm looking for many different kinds of communities finding cool ways to express their identities and creating a safe, useful and meaningful environment for themselves. Each one of them should somehow develop differently, following their own logic for the kind of community it is. I'm hoping that the communities manage to make it through the ICANN community priority gauntlet.

We will see how communities develop these things. The .xxx and IFFOR (International Foundation for Online Responsibility) process is just a start for what we'll learn as we watch communities try to create a self-regulated environment for their stakeholder groups. Some of those new community gTLDs should be the most beautiful blooms.

Q. What risks do you see for community gTLDs?

A. Many of them have done hand waving about how they'll really be able to implement and enforce their commitments. It won't be clear how many of them are really doing their policy work until they've won. How can they give metrics to ICANN to serve the process in a bottom-up manner? What are the 'success' metrics for community-based gTLDs?

Living up to ICANN commitments and expectations about how a community gTLD should function will be harder than most are imagining. Living up to evolving community requirements will be even more so.

Written by Jacob Malthouse, Co-founder at Big Room Inc.

More under: ICANN, Top-Level Domains

Closed Generics Debate Rages On

Tue, 2013-03-05 18:25

The new gTLD program continues to throw up last-minute debates on what is acceptable as a TLD and what is not.

The latest such verbal joust centers on closed generics. These are generic terms being applied for by applicants who, should they be successful, will not open the TLD up to everyone on an equal-access basis.

As an example, think .book being run by Amazon and only available to Amazon customers.

In order to understand the arguments for and against closed generics, ICANN has opened a public comment period. That period ends on March 7 and ICANN has so far received 80 emails/opinions on the matter.

Closed generics are striking fear into some people's hearts mainly because of Google and Amazon's bids to secure generic terms like "cloud", "book" or "blog". No one had expected the two Net giants to take such an interest in the new gTLD program in the first place, let alone show the foresight they have displayed in going for a bevy of generic terms. Many of those operating in the industries those terms describe have been taken completely by surprise.

To them, having a generic term managed according to one entity's rules is heresy. As an example, consider comments drafted by the Federation of European Publishers (FEP) on March 4. "Similarly to the case of trade marks (where generic terms may not be registered), reserving the use of generic terms as gTLDs for individual companies is not desirable," says the FEP, which represents 28 national publishers associations in Europe. "From the point of view of consumer choice, locating a class of goods and a choice of suppliers with the help of the gTLD is by far preferable to its leading to a single producer or retailer."

"At the very least, the winning applicant (for .book) must be obliged to make the gTLD available without discrimination for registrations by all eligible parties, including all commercial entities within the book industry," the FEP goes on to say in its statement which was handed to Nigel Hickson, ICANN VP of Stakeholder Engagement for Europe on March 4, and also posted as a reply to ICANN's call for public comment.

But not everyone is against closed generics. "ICANN should not be dictating business models," wrote a selection of members of ICANN's Non Commercial Stakeholder Group (NCSG) also on March 4. "There should be no intervention until and unless there is a well-documented problem related to monopoly power."

Once the current comment period has closed, ICANN staff will analyse them and provide a summary to the ICANN Board's New gTLD Program Committee. It will be up to this committee to determine whether closed generics should be shut down.

Written by Stéphane Van Gelder, Chairman, STEPHANE VAN GELDER CONSULTING

More under: Domain Names, ICANN, Internet Governance, Top-Level Domains

Civil Society Hung Out To Dry in Global Cyber Espionage

Mon, 2013-03-04 21:38

This post was co-authored by Sarah McKune, a senior researcher at the Citizen Lab.

Public attention to the secretive world of cyber espionage has risen to a new level in the wake of the APT1: Exposing One of China's Cyber Espionage Units report by security company Mandiant. By specifically naming China as the culprit and linking cyber espionage efforts to the People's Liberation Army, Mandiant has taken steps that few policymakers have been willing to take publicly, given the significant diplomatic implications. The report has brought to the forefront US-China disagreements over cyberspace, igniting a furious response from the Chinese government.

Also cast in stark relief by this incident, however, are the priorities of the United States in securing the cyber domain: threats to critical infrastructure, and the theft of intellectual property, trade secrets and confidential strategy documents from key industry players and Fortune 500 companies. General Keith Alexander, the head of US Cyber Command and the National Security Agency, raised the profile of the theft issue last year in asserting that widescale cyber espionage had resulted in "the greatest transfer of wealth in history." The issue was highlighted again in the newly-released Administration Strategy on Mitigating the Theft of U.S. Trade Secrets.

Certainly, threats against critical infrastructure and theft of intellectual property and trade secrets are important. However, they are not the only targets of cyber intrusion and espionage that should merit public attention and government concern.

An often-overlooked dimension of cyber espionage is the targeting of civil society actors. NGOs, exile organizations, political movements, and other public interest coalitions have for many years encountered serious and persistent cyber assaults. Such threats — politically motivated and often with strong links to authoritarian regimes — include website defacements, denial-of-service attacks, targeted malware attacks, and cyber espionage. For every Fortune 500 company that's breached, for every blueprint or confidential trade secret stolen, it's a safe bet that at least one NGO or activist has been compromised in a similar fashion, with highly sensitive information such as networks of contacts exfiltrated. Yet civil society entities typically lack the resources of large industry players to defend against or mitigate such threats; you won't see them hiring information security companies like Mandiant to conduct expensive investigations. Nor will you likely see Mandiant paying much attention to their concerns, either: if antivirus companies do encounter attacks related to civil society groups, they may simply discard that information as there is no revenue in it.

While cyber espionage against a company may result in the loss of a blueprint, an attack on an NGO could result in a loss of individual life or liberty. Yet civil society is largely on its own as it goes about its work to advance human rights and other public policy goals while struggling to stay ahead of debilitating cyber threats.

In Citizen Lab's research on cyber espionage against civil society, going back to the Tracking GhostNet and Shadows in the Cloud reports, we've routinely encountered the very same malware families, social engineering tactics, and advanced persistent threats experienced by the private sector, governments, and international organizations. Our research indicates that the important details uncovered by Mandiant are just one slice of a much bigger picture of cyber espionage linked to China. For example, Citizen Lab's Seth Hardy has found that certain malware targeting a Tibetan organization incorporates much of the same code and uses one of the same command-and-control servers as the APT1 attacks documented by Mandiant. This suggests that APT1 is also targeting civil society groups alongside the "higher profile" companies and organizations on its roster.

Our findings confirm there's more to China's motivations than just industrial and government espionage. The Chinese government appears to view cyber espionage as a component of much broader efforts to defend against and control the influence of a variety of "foreign hostile forces" — considered to include not only Western government entities, but also foreign media and civil society — that could undermine the grip of the Communist Party of China.

The solutions presented by US policymakers, however, have left civil society out of the equation altogether, focusing on industry and government only, as if these are all that matter. Notably, a February 12, 2013 executive order on improving cybersecurity provides that US policy is to "increase the volume, timeliness, and quality of cyber threat information shared with U.S. private sector entities so that these entities may better protect and defend themselves against cyber threats." No similar initiative exists for outreach and information sharing with civil society. Without these considerations, we leave civil society hung out to dry and lose sight of that which we are aiming to protect in the first place — a vibrant democratic society.

As we consider what to do about mitigating cyber attacks, and the bleeding of our industrial base from unabashed cyber espionage, we would do well to remind ourselves of a fact that may be easily overlooked: China's domestic problems in the human rights arena are a major factor driving cyber insecurity abroad. China's aggressive targeting of "foreign hostile forces" in cyberspace includes groups simply exercising their basic human rights. We may well soften China's malfeasance around corporate and diplomatic espionage, but without dealing with the often-overlooked civil society dimension, we will not eradicate it entirely.

Written by Ron Deibert, Director, The Citizen Lab, Munk School of Global Affairs, University of Toronto

More under: Cyberattack, Internet Governance, Malware, Security

Security and Reliability: A Deeper Dive into Network Assessments

Mon, 2013-03-04 21:05

As noted in the first part of this series, Security and Reliability encompasses holistic network assessments, vulnerability assessments, and penetration testing. In this post I'd like to go deeper into network assessments. I stated last time that the phrase "network assessment" is broad. Assessments may be categorized as "internal" (behind the firewall, corporate infrastructure) or "external" (outside the firewall, Internet infrastructure). Regardless of the scope and areas of technology assessed, the goals are to assess the current state of your infrastructure with respect to industry best practices, to provide a gap analysis that shows where best practices are not met, and finally to provide remediation steps to fill those gaps.

Internal network assessments may be highly customized and should evaluate a wide range of network infrastructure or specific areas of technology, including but not limited to:

  • Network switching/routing
  • Firewall and IDS/IPS
  • Wireless (Wi-Fi, microwave, satellite, etc.)
  • VoIP
  • DNS/DHCP/IPAM
  • Server infrastructure
    • Application
  • Client/desktop
    • System builds
    • Anti-virus/anti-malware
  • Physical security

External network assessments may also be customized and should examine areas including but not limited to:

  • IP address registration and routing policy
  • DNS and domain name registration
  • Electronic Mail
  • Internet gateways (border router, access controls, filtering, firewalls, etc)
  • VPN access to corporate network
  • Site-Site interconnections

You may also wish to assess information security policies and procedures, access controls (logical or physical), readiness for SSAE 16, ISO 27000 series, or PCI compliance, and disaster recovery procedures or business continuity plans, for both internal and external assessments.

The benefits of a network assessment include documentation to help you understand your current security and reliability posture in terms of best practices, and steps to remediate gaps in best practices. This type of assessment can form the basis for system-wide documentation and further policy development if needed. In addition, once you remediate any gaps in the assessment, you can begin to document best practices with respect to network/system architecture, security, change management, disaster recovery and business continuity.
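Purely as an illustrative sketch (the areas, best-practice statements, and statuses below are hypothetical), the resulting gap analysis can be tracked as a simple list of findings, each mapping an assessed area to the best practice it was measured against and a remediation step:

    # A hypothetical gap-analysis structure: each finding maps an assessed area
    # to a best practice and, where a gap exists, a remediation step.
    findings = [
        {"area": "Firewall", "best_practice": "Default-deny inbound policy",
         "status": "gap", "remediation": "Replace permit-any rules with explicit allows"},
        {"area": "DNS", "best_practice": "Separate authoritative and recursive service",
         "status": "met", "remediation": None},
    ]

    for f in findings:
        if f["status"] == "gap":
            print("%s: GAP -> %s" % (f["area"], f["remediation"]))
        else:
            print("%s: meets best practice" % f["area"])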

The next logical steps to enhancing your security and reliability posture are to execute periodic vulnerability assessments and penetration testing, which I will delve into in the following posts.

Written by Brett Watson, Senior Manager, Professional Services at Neustar

More under: Security

72 Hours left on the Buzzer: Closed Generic TLDs

Mon, 2013-03-04 03:52

Recently, I sent in my submission to the current call for public comments on closed generic TLDs, which closes on 7 March 2013. I thought I would share it here, as well as encourage people to post their own comments on the public forum.

A "Closed Generic" is a TLD that is a generic term, but domains within that TLD will not be sold to the public.

Today, there are 22 generic TLDs. These include .COM, .BIZ, .INFO and .NET. Domain names within today's generic TLDs are available for purchase by the general public. Generic TLDs that are available for purchase by the general public are NOT closed generic TLDs.

When ICANN held its open application process in June 2012, there were many applicants for Top Level Domains for both branded and generic terms. For example, there were applications filed to create the .BMW Top Level Domain, the .DOT Top Level Domain, the .SEARCH Top Level Domain, and the .SHOP Top Level Domain. Some of the applicants intend to sell domain names within their proposed new Top Level Domains to the public, while others do not.

The litmus test in my mind is: what is the impact on the global public interest? The Affirmation of Commitments (AoC) between the United States Department of Commerce (DOC) and ICANN clearly specifies the promotion of competition, consumer trust and consumer choice. There are two ways of examining the situation: one is to look at the closed generic applications themselves, and the other is to look at it from the standpoint of ICANN, which is bound by the AoC.

The issues that arise are as follows:

  1. Would the endorsement of "Closed Generic" Applications create a situation or a series of situations whether now or in the future that will restrict competition?
  2. Would the endorsement of "Closed Generic" Applications create a situation where there is a dominant position within the market?
  3. Would the endorsement of the "Closed Generic" Applications create a restraint in trade of a particular market?
  4. Would ICANN be immune from anti-trust liability?

Traditionally, the prohibition and control provisions laid out in competition rules aim to prevent cartelization and monopolization in markets for goods and services. Such developments in markets inevitably harm consumer welfare, which competition rules aim to protect. By the same token, there are instances where an agreement may limit competition while allowing social and economic benefits to pass to consumers. In order to ensure that such agreements with a net effect of increasing competition can be made, competition law regulates an exemption regime: agreements between undertakings at the same level (horizontal) or at different levels (vertical) of the market may be exempted from the prohibitions of the competition rules, provided they are not cartel agreements, which are by nature outside the scope of exemption.

The Sherman Antitrust Act, also referred to as the Sherman Act, prohibits certain business activities that federal government regulators deem to be anticompetitive, and requires the federal government to investigate and pursue trusts, companies, and organizations suspected of being in violation.

On 4 August 2012, the Honorable Philip S. Gutierrez, United States District Judge, ruled in Manwin Licensing International S.A.R.L., et al. v. ICM Registry, LLC, et al., that "anti-trust" claims could be filed over the controversial .xxx TLD. See the section of the ruling headed "ICANN's Involvement in Trade or Commerce":

By its terms, the Sherman Act applies to monopolies or restraints of "trade or commerce." 15 U.S.C. §§ 1, 2. The identity of a defendant as a nonprofit or charitable organization does not immunize that organization from antitrust liability. NCAA v. Bd. of Regents of Univ. of Okla., 468 U.S. 85, 101 n.22 (1984) ("There is no doubt that the sweeping language of § 1 [of the Sherman Act] applies to nonprofit entities."). On the contrary, nonprofit organizations that act in trade or commerce may be subject to the Sherman Act. Big Bear Lodging Ass'n v. Snow Summit, Inc., 182 F.3d 1096, 1103 n.5 (9th Cir. 1999) ("A nonprofit organization that engages in commercial activity . . . is subject to federal antitrust laws."). Rather than focusing on the legal character of an organization, an antitrust inquiry focuses on whether the transactions at issue are commercial in nature. Virginia Vermiculite, Ltd. v. W.R. Grace & Co. — Conn., 156 F.3d 535, 541 (4th Cir. 1998) ("We emphasize that the dispositive inquiry is whether the transaction is commercial, not whether the entity engaging in the transaction is commercial."). "Courts classify a transaction as commercial or noncommercial based on the nature of the conduct in light of the totality of surrounding circumstances." United States v. Brown Univ. in Providence in State of R.I., 5 F.3d 658, 666 (3rd Cir. 1993). In any circumstance, "[t]he exchange of money for services . . . is a quintessential commercial transaction." Id.

Each of the generic TLDs represents a market, and there are generic strings like .blog which, if closed, could pose serious threats to freedom of expression for those who wish to register under .blog. Article 19 of the International Covenant on Civil and Political Rights (ICCPR) clearly provides for freedom of expression. The threat of limiting or restricting the ability of persons wishing to acquire .blog poses serious harm to the global blogging community and to individuals.

For the purposes of assessing whether closed generic TLDs should be permitted, it is essential to engage in identifying the market for the TLD and whether there is likelihood that a monopoly or oligopoly would be created that could distort the market and prejudice public interest.

Under the Sherman Act § 2, 15 U.S.C. § 2, monopolizing trade is a felony. Where this trade involves foreign nations, as with generic TLD applications made from countries outside the US, Sherman Act § 7 (Foreign Trade Antitrust Improvements Act of 1982), 15 U.S.C. § 6a, applies to conduct involving trade or commerce with foreign nations.

There is the possibility that something which is declared open can later be declared closed, depending on market dynamics and how competition is controlled. The other issue is: who regulates competition in the gTLD market? Is this supposed to be self-regulatory, with market forces left to determine how the pendulum swings, or is ICANN, or the applicant for the gTLD, given discretionary rights to control its respective gTLD market?

However complex these questions, the litmus test for advocates of an open and free Internet is the impact on the global public interest, and I would propose the following considerations:

  • Is there a visible threat to the global public interest?
  • What is the nature of the threat/challenge?
  • Is there need to "seal off a market" to preserve competition?
  • Are there generic terms where it is in the public interest to be closed?

There are no easy answers to the debate on closed generic TLDs. There is room however for discussion, dialogue and sharing perspectives to help policy makers in the decision making process. It is critical that people have their say and respond to the call for public comments on Closed Generic gTLD applications at http://www.icann.org/en/news/public-comment/closed-generic-05feb13-en.htm before it closes on March 7th, 2013. Have your say today!!!

Disclaimer: These are some reflections on closed generic TLDs. The views expressed are solely my own and are not the views of the At Large Advisory Committee (ALAC) nor the Civil Society Internet Governance Caucus (IGC). They are offered in my personal capacity as an individual.

Written by Salanieta Tamanikaiwaimaro, Director of Pasifika Nexus

More under: ICANN, Internet Governance, Policy & Regulation, Top-Level Domains

An Update on the Closed Generics Debate

Sun, 2013-03-03 21:38

ICANN is currently seeking public comment on the subject of "closed generic" Top Level Domain (TLD) applications. A "Closed Generic" is a TLD that is a generic term, but domains within that TLD will not be sold to the public.

There are those who object to generic terms such as .book being operated as closed registries. Under Amazon's proposal, for example, domain names within the .book Top Level Domain would not be sold to the public; instead, Amazon.com would own and operate all domain names within the TLD. (e.g., you might soon be able to navigate to Amazon.book.) Similarly, Google, Inc. has applied to create the .search TLD to allow it to improve its search functionality, and Amazon.com has applied for the .book TLD to allow it to segregate its book product offerings onto a separate TLD. Many oppose these projects because, it is said, these TLDs would give companies like Google, Inc. and Amazon.com an "unfair competitive advantage".

On the other hand, there are those who believe that Closed Generics should be permitted because they do not represent an unfair competitive advantage. By way of comparison, the leading online bookseller is Amazon.com, not Book.com. Those who support Closed Generics favor innovation and competitive freedom, with no restrictions on the types of services that can be provided through a Top Level Domain.

To date, 42 comments have been submitted to ICANN's public comment forum:

2 Commenters Support Allowing Closed Generic Top Level Domains
39 Commenters are Opposed to Closed Generic Top Level Domains
1 Commenter is Opposed to the current gTLD Application Process altogether

View a summary of the comments at:
http://www.getnewtlds.com/news/UpdateOnClosedGenericsDebate.aspx

Written by Mary Iqbal, Founder of Get New TLDs Inc.

Follow CircleID on Twitter

More under: ICANN, Internet Governance, Top-Level Domains

Categories: Net coverage

Google: Not All ccTLDs Are Created Equal in Generic Search Rankings

Fri, 2013-03-01 22:26

There is a very interesting video posted on YouTube from Matt Cutts of Google, answering a question about how ccTLDs are viewed by Google, especially when they are used as domain hacks.

Here is the question:

"We have a vanity domain (http://ran.ge) that unfortunately isn't one of the generic TLDs, which means we can't set our geographic target in Webmaster Tools. Is there any way to still target our proper location?"

In the two-and-a-half-minute video, Matt Cutts makes it clear that not all ccTLDs are going to be treated the same by Google:

As the .com space gets more exhausted, people are getting more creative, using domain names like Ma.tt, which is owned by Matt Mullenweg of WordPress.com. It is a very cool domain, but .tt is the country code for Trinidad and Tobago.

Many others are using words that end in .es, and we see a lot of startups that have been using .io (British Indian Ocean Territory).

When using these ccTLDs as domain hacks, or just because they make a cool domain name or brand, Matt Cutts says you have to be very careful; otherwise the domain is going to be treated as a ccTLD and assumed by Google to target only residents of the country it represents.

"You have to think hard, if its going to be thought of as an international domain or a country code."

Matt calls out .co specifically as one which is treated as generic by Google and not as the ccTLD of Colombia.

"In some sense it comes down to a little bit of a call when a domain becomes truly generic, appropriate to the entire world."

"So like .Co, which I think used to be for Colombia. has become a generic like another .com"

"But if you're using an .es for a word that ends in .es or .li domain, which I understand is being used by a lot of businesses located in Long Island, because it's really a cool address, you have to be careful because in the case of .es we are going to think its related to Spain and in the case of .li we are going to associate it as targeting residents of Lichtenstein because 99% of the domain in use are related to those countries".

"Otherwise everyone starts to use crazy random domain names and they lose the sense of what they were originally intended for and that could be a bad experience for everyone".

A must-see video for anyone using, or considering using, a ccTLD, especially as a domain hack.

Written by Michael Berkens, President of Worldwide Media, Inc.

Follow CircleID on Twitter

More under: Domain Names, Top-Level Domains, Web

Categories: Net coverage

The Internet Access Gap Survey: Right Conclusion, Wrong Numbers

Fri, 2013-03-01 20:23

A colleague sent me a story by Cecilia Kang in the Washington Post: Survey finds gap in Internet access between rich, poor students. Given my interest in programs to get connected computers into low-income households, my colleague knew I would be interested in the article, which discusses a survey released Thursday by the Pew Research Center.

Indeed, I would commend the Washington Post article and the survey itself to you for reading.

I want to highlight a problem with how the survey results were represented in the Washington Post. The fifth paragraph says:

Half of all students in higher income families have access to the Internet at home through a computer or mobile device. The figure drops to 20 percent for middle income children and just 3 percent of students from poor homes, according to the survey of 2,462 teachers by the Pew Internet & American Life Project in cooperation with the College Board and National Writing Project.

Something seemed off with those figures.

After all, I recalled that two weeks ago I wrote about digital literacy programs trying to deal with the one-third of American households that aren't online. How could it be that half of wealthy households with kids were without internet access, when national figures show that two-thirds of all households are online?
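
A quick back-of-the-envelope check makes the inconsistency obvious. If you read the Washington Post figures as home-access rates and combine them with rough, assumed shares of households in each income tier (the shares below are my illustrative guesses, not Pew data), the implied national rate lands nowhere near two-thirds:

    # Hypothetical sanity check. The access rates are the Washington Post's
    # figures read literally; the household shares per tier are assumed for
    # illustration and are NOT from the Pew survey.
    tiers = {
        "higher income": (0.30, 0.50),  # (share of households, access rate)
        "middle income": (0.40, 0.20),
        "low income":    (0.30, 0.03),
    }

    national = sum(share * rate for share, rate in tiers.values())
    print(f"Implied national home-access rate: {national:.0%}")  # about 24%

Under any plausible weighting, the implied national figure comes out around a quarter of households, which is impossible to square with two-thirds of American households being online.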

Something wasn't right.

In going to the actual Pew report, I found the likely source of the Washington Post numbers. But Pew didn't actually report on the availability of home internet by income. It was a different question.

The survey reported "% of teachers who say ALL or ALMOST ALL of their students have sufficient access to the digital tools they need [at home / at school] to effectively complete school assignments, by student socioeconomic status".

This question may point to whether teachers have to adapt homework assignments; can the teachers assume that digital tools will be available?

The Washington Post appears to have treated these numbers as though the question asked "percentage of students who have home internet, by income."

The question the Washington Post thought it was answering is important to understand and address, but it was not addressed in the survey. As a result, the numbers were just plain wrong.

Written by Mark Goldberg, Telecommunications Consultant

Follow CircleID on Twitter

More under: Access Providers, Broadband

Categories: Net coverage

Can Plural and Singular New gTLDs Both Be Successful?

Fri, 2013-03-01 19:11

Now that ICANN has stuck to its guns and placed only 4 new gTLD strings that look confusingly similar into contention sets, rather than those that sound identical (such as .inc and .ink), those that have the same meaning (like .law and .lawyer), or those that are the singular and plural of the same word (like .deal and .deals), we know that many new gTLDs are going to have a very tough marketing road and face a lot of consumer confusion.

Not only will the new gTLD strings have to sell themselves to the public as alternatives to incumbent TLDs and ccTLDs, but they will also have to separate themselves from other new gTLDs fighting in the same vertical for seemingly the same customers with almost the same string.

Here are some new gTLD strings that will not only have to compete in the same vertical, but will also face the very real problem of separating themselves from a very close alternative:

  • .Law / .Lawyers
  • .Game / .Games
  • .New / .News
  • .Hotel / .Hotels
  • .Gift / .Gifts
  • .Realestate / .Realty / .Realtor
  • .Car / .Cars
  • .Host / .Hosting
  • .Secure / .Security
  • .Coupon / .Coupons
  • .Insure / .Insurance
  • .Sport / .Sports
  • .Deal / .Deals
  • .Kid / .Kids
  • .Shop / .Shopping
  • .Fish / .Fishing
  • .Loan / .Loans
  • .Tech / .Technology
  • .Film / .Movies
  • .Photo / .Photography
  • .Web / .Webs
  • .Site / .Website

The question is whether two or more gTLDs, in many cases separated only by being the singular or plural of the same generic word, can both be successful in the marketplace.

Beyond the challenge of selling, say, a .deal against a .deals, what will end users' reactions be? How much confusion has ICANN allowed to be created down the line? Will consumers really be able to get it right when they see or hear an ad for sale.deal, without confusing it with sale.deals or deal.sales?

Of course, there is still the objection period, which doesn't close until March 13th and under which applicants can object to other applications; the Initial Evaluation results, which should start to be released this month; and the possibility of GAC objections.

ICANN certainly followed its own guidelines for setting contention sets as laid out in the Guidebook, but it should have defined contention sets differently, placing such closely similar strings, many of them separated simply by an "s", into the same contention set so that only one string would survive.

Written by Michael Berkens, President of Worldwide Media, Inc.

Follow CircleID on Twitter

More under: ICANN, Policy & Regulation, Top-Level Domains

Categories: Net coverage

The International Space Station's Canadian Music Video Collaboration - and Google+ Hangout

Fri, 2013-03-01 18:26

As much as we talk here about the inner workings of the Internet's infrastructure, there are times when you have to just sit back and look at how incredibly cool some of the things are that are enabled by the Internet. For example, last week I was delighted to stumble across (via Google+) this excellent music video collaboration between the International Space Station's Canadian commander Chris Hadfield, the Canadian band Barenaked Ladies, and a Canadian student choir, all coordinated by the Canadian Space Agency, the Canadian Broadcasting Corporation and The Coalition for Music Education.

While I was sitting there very much enjoying the music, I was also thinking about the technology that enabled a space station to participate as it did, and the role the Internet's infrastructure played in enabling the creation and subsequent sharing of this music video. Naturally several of us were immediately wondering about latency and how much post-production was done… but regardless, it was great to see and enjoy! Listen for yourself:

Not to be outdone by the Canadians, of course, NASA had its own Google+ Hangout with the I.S.S. last week, too, and if you watch the replay, the connection with the station occurs about 30 minutes into the hangout. (Prior to that, questions are handled by astronauts on the ground.) The I.S.S. crew take questions from the moderator and from videos submitted through YouTube. One of the questions was about social media, and the crew spoke about how the technology enabled them to collaborate with people all around the world.

On one level this is all mundane, "normal" collaboration that perhaps doesn't warrant being called out… I mean, it's just an IP network, right? But it's an IP network that includes a space station and, at least to me, that's very cool and something to celebrate!

P.S. And as an added bonus, the music video and Google+ Hangout are both available to me over IPv6, as they should be.
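
If you want to check whether a favorite site is similarly reachable over IPv6, looking up its AAAA records is a decent first approximation. Here is a minimal sketch in Python using only the standard library; the hostnames are just examples, and a published AAAA record doesn't strictly guarantee end-to-end IPv6 reachability from your network:

    import socket

    def ipv6_addresses(hostname: str, port: int = 443) -> list:
        """Return the hostname's IPv6 addresses (AAAA records), if any."""
        try:
            infos = socket.getaddrinfo(hostname, port, family=socket.AF_INET6)
        except socket.gaierror:
            return []  # no AAAA records published (or the name didn't resolve)
        # getaddrinfo returns one tuple per socket type; deduplicate addresses
        return sorted({info[4][0] for info in infos})

    for name in ("www.youtube.com", "www.example.com"):  # example hostnames
        addrs = ipv6_addresses(name)
        print(f"{name}: {', '.join(addrs) if addrs else 'no IPv6 found'}")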

Written by Dan York, Author and Speaker on Internet technologies

Follow CircleID on Twitter

More under: Internet Protocol

Categories: Net coverage

Booksellers Weigh In On Amazon's New TLDs

Fri, 2013-03-01 01:54

Some pretty big companies are beginning to show an active interest in ICANN's new TLD project. The most recent of them is bookseller Barnes & Noble.

The letter, which is available on the ICANN website, is quite narrow and pointed in its scope, and focuses on the perceived competition issues with Amazon's bids for several "closed generics".

Excerpt from the letter:

Barnes & Noble, Inc. submits this letter to urge ICANN to deny Amazon.com's application to purchase several top level domains (TLDs), most notably .book, .read and .author (collectively the "Book TLDs").1 Amazon, the dominant player in the book industry, should not be allowed to control the Book TLDs, which would enable them to control generic industry terms in a closed fashion with disastrous consequences not only for bookselling but for the American public. If Amazon, which controls approximately 60% of the market for eBooks and 25% of the physical book market2, were granted the exclusive use of .book, .read and .author, Amazon would use the control of these TLDs to stifle competition in the bookselling and publishing industries, which are critical to the future of copyrighted expression in the United States.

Amazon's ownership would also threaten the openness and freedom of the internet and would have harmful consequences for internet users worldwide. When ICANN announced its plan to increase the number of TLDs available on the Domain Name System, one of its stated goals was to enhance competition and consumer choice. However, if the Book TLD applications are granted to Amazon, no bookseller or publisher other than Amazon will be able to register second-level domain names in .book, .read and .author without Amazon's approval, leaving Amazon free to exclude competitors and exploit the generic Book TLDs for its sole benefit.

The Booksellers Association of Switzerland is also opposing Amazon:

The Booksellers Association of Switzerland is of the strong opinion that closed generic gTLD applications have to be invalidated when submitted by commercial entities operating in a sector of activity related to the closed generic gTLD.

In the case of a closed generic TLD like .books, the exclusivity granted to the winning applicant would de facto strengthen the position of a single big operator in the book industry and would be detrimental to the industry as a whole.

Think I see a trend!

Written by Michele Neylon, MD of Blacknight Solutions

Follow CircleID on Twitter

More under: ICANN, Internet Governance, Policy & Regulation, Top-Level Domains

Categories: Net coverage

An Introduction to Security and Reliability - What Does It Really Mean?

Thu, 2013-02-28 21:31

I co-authored a book in 2005, titled "Extreme Exploits: Advanced Defenses Against Hardcore Hacks." My chapters focused on securing routing protocols such as BGP, and securing systems related to DMZs, firewalls, and network connectivity.

As I look back over those chapters, I realize that the basic fundamentals of network security really haven't changed much even though technology has advanced at an incredible pace. "Defense in depth" was a hot catch phrase seven years ago, and it still applies today. I believe there are three broad steps any organization can take with respect to security and reliability to get a handle on their current security posture, whether internal (corporate, inside the firewall) or external (Internet, outside the firewall).

Network Assessment

Begin with a "network assessment." This is a broad term that might encompass a holistic view of an organization's Internet security posture, including Internet gateways, firewalls, DNS and email services, and B2B partner connectivity. In addition or alternatively, a network assessment may focus on an organization's internal network, including employee intranets, VPN, electronic mail, DNS, VoIP, vulnerability management and anti-virus services, change management, and business continuity planning and disaster recovery. A network assessment can be tailored to specific security requirements for any organization, but ultimately the assessment will provide a baseline gap analysis and remediation steps to fill those gaps.

Vulnerability Assessments

Once a baseline network assessment is completed, an organization may wish to perform periodic vulnerability assessments. Traditional vulnerability assessments tend to cover application services and nothing more. However, an organization's security posture must include Internet gateway switches/routers, firewalls, DNS servers, mail servers, and other network infrastructure not directly related to providing service for a specific application. Whether internal or external, vulnerability assessments can uncover critical gaps in security that may lead to credential leaks, intellectual property theft, or denial of service to employees or customers. A well-planned and well-executed vulnerability assessment should eliminate false positives, but can never give an organization 100% confidence that a specific vulnerability can be exploited. Vulnerability assessments should be executed on at least a quarterly basis, but it's not uncommon for larger organizations to execute them monthly.
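
For a feel of what the lowest layer of such an assessment automates, here is a deliberately tiny sketch in Python: a TCP connect scan of a few common service ports. The target host is hypothetical, real assessments rely on dedicated and authorized tooling, and you should only ever point even a toy like this at hosts you own:

    import socket

    # A few common service ports -- real scanners cover thousands.
    COMMON_PORTS = {22: "ssh", 25: "smtp", 53: "dns", 80: "http", 443: "https"}

    def tcp_connect_scan(host: str, ports=COMMON_PORTS, timeout: float = 1.0):
        """Attempt a TCP connection to each port; report the ones that answer."""
        open_ports = []
        for port, service in sorted(ports.items()):
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    open_ports.append((port, service))
            except OSError:
                pass  # closed, filtered, or host unreachable
        return open_ports

    if __name__ == "__main__":
        host = "test.example.internal"  # hypothetical -- scan only with permission
        for port, service in tcp_connect_scan(host):
            print(f"{host}:{port} ({service}) is accepting connections")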

Penetration Testing

The next step in assessing your organization's security and reliability is penetration testing. While I typically say that vulnerability assessments give you a "95% confidence level" that a vulnerability exists, penetration testing can give you 100% confidence that a specific vulnerability can be exploited and show you how it can be exploited by attackers. Alternatively, a penetration test may show you that you have proper compensating controls in place to prevent a vulnerability from being exploited. That is to say, the vulnerability exists, but a compensating control is in place that prevents attackers from succeeding.

One only needs to read the news to know that every organization, whether large or small, is susceptible to intrusions across their networks or exploits in their applications and services. It's prudent to execute a network assessment in order to understand your current security posture, and then follow up with periodic vulnerability assessments and penetration tests. These will give you a higher level of confidence that your architecture is sound, and that your staff is adhering to security policies and procedures. Ultimately, your customers trust you to secure your resources and their information, and your brand and market identity are at stake if you don't.

Written by Brett Watson, Senior Manager, Professional Services at Neustar

Follow CircleID on Twitter

More under: DNS, Security, VoIP

Categories: Net coverage

ICANN's Trademark Clearinghouse to Provide Unprecedented Protections in the Domain Name Space - HUH?

Thu, 2013-02-28 21:12

Really, ICANN? The Trademark Clearinghouse provides unprecedented protection? According to your recent announcement, it does.

Do tell, ICANN — in what way does the Trademark Clearinghouse protect anything?

• Does it block others from registering trademarks for which they have no legitimate right?
• Does it notify trademark owners in advance of a pending registration?
• Does it provide warnings of infringing names beyond exact match?
• Does it even provide notifications of exact-match registrations beyond the first 60/90 days of the general registration period?

Of course, the answer to all of these questions is a resounding NO.

Now, I am not saying that the Trademark Clearinghouse is without value. It will undoubtedly streamline the validation process for trademarks so that they can qualify for Sunrise Registrations. Having to validate trademarks with each individual new gTLD registry would have been extremely time-consuming, and possibly much more expensive.

That said, the Trademark Clearinghouse is not, and has never been, a Rights Protection Mechanism, and trying to classify it as one only makes trademark owners even more frustrated.

Written by Elisa Cooper, Director of Product Marketing at MarkMonitor

Follow CircleID on Twitter

More under: Cybersquatting, Domain Names, ICANN

Categories: Net coverage

10.1 Million .ORG Domains and Counting

Thu, 2013-02-28 19:29

PIR released the results of its semi-annual domain name report, "The Dashboard," which outlines the growth of .ORG in the second half of 2012. Overall, we had a remarkable year. Most notably, we hit a major milestone in June with the registration of the 10 millionth .ORG domain!

Our team compiles this report every six months and each time I'm impressed by the results. It's encouraging to see that year after year individuals and organizations continue to turn to .ORG to promote their cause and educate their audiences.

Some of the key findings of "The Dashboard" include the following:

  • New .ORG registrations increased by 11.9 percent in the second half of 2012.
  • The number of .ORG domains under management (DUM) grew by 4.3 percent in 2012.
  • .ORG experienced a net gain of 416,301 registrations in 2012.
  • .ORG DUM have more than doubled during the past seven years, increasing from 3.9 million in 2005 to more than 10.1 million in 2012.

We are particularly proud of our international growth in the past two years. From 2010 to 2012, new .ORG domain names created abroad increased in the following regions:

  • Asia, Australia and the Pacific region grew by 47 percent.
  • Africa grew by 23 percent.
  • Latin America grew by 25 percent.

This increased demand for .ORG in international markets further solidifies our commitment to non-profits and non-governmental organizations (NGOs) worldwide. To that end, our applications with the Internet Corporation for Assigned Names and Numbers (ICANN) to operate the .NGO and .ONG domains are currently under evaluation, and ICANN expects to delegate new top-level domains later this year. These proposed domain extensions are aimed specifically at meeting the needs of the global NGO community, providing a secure and trusted venue that enables them to increase engagement, awareness and funding opportunities. For more than a decade, Public Interest Registry has served non-profit organizations, and we look forward to growing our mission and global capabilities in 2013.

To see the full results of The Dashboard, download a PDF of the report here.

Written by Brian Cute, Chief Executive Officer, .ORG, The Public Interest Registry

Follow CircleID on Twitter

More under: Domain Names, Registry Services, ICANN, Top-Level Domains

Categories: Net coverage

Are .Brand Applications Being Scared Off by Financial CQs?

Thu, 2013-02-28 18:13

There has been an upsurge in brands withdrawing their applications. The timing is undoubtedly due to the deadline for a 70% refund of the $185k application fee. But why are so many of the withdrawals .brands/closed generics?

Having been involved in drafting financial projections for over 50 applications, and having answered a number of financial Clarification Questions (CQs), I believe the major reason for the acceleration in withdrawals of .brands, especially closed ones, is that they are receiving a large number of financial CQs and are deciding to cut their losses. In my opinion, there are two main reasons why these types of gTLDs receive an inordinate proportion of financial CQs.

In general, .brand applications were defensive in nature, hastily prepared, and involved a lot of "cut and paste" in answering the questions of the application.

Many of the financial Clarification Questions received by brands seem to be geared towards open rather than closed systems. The applicants did not effectively consider how to write their answers to comply with the ICANN evaluation criteria.

It is very likely that .brand/closed generic applicants are receiving a relatively large number of financial CQs and are deciding to opt out because:

  • They may be defensive registrations from the legal department, with little or no business, financial, or marketing participation
  • They have failed to identify a real and tangible business value
  • The risk they were mitigating against just was not worth the trouble
  • They cannot figure out sufficient answers to the CQs, or do not have the confidence that they can put forward a reasonable business use case to answer the financial CQs.

The number of applicant withdrawals reached 22 during the week, and I think it's a shame to see such global brands depart the gTLD round, especially if the reason is the difficulty of answering CQs. ICANN was always clear: either answer the questions per the AGB criteria in your initial application, or you will get clarifying questions. Pay me now, or pay me later.

One concern is that for several brand applicants, awareness of the new gTLD application never got outside the legal/trademark protection department. If the legal departments are making the call to abandon without fully exploring the future opportunities with the marketing department, they have done their brands a disservice. A .brand application can provide clear financial answers that meet the Guidebook criteria. It will require additional speculative investment to secure undeveloped internet real estate, as well as urgent engagement with the marketing and branding functions, to fully realize the risks and potential benefits of your own TLD.

I hope that other brand applicants fully consider the long game before hastily pushing the "withdraw" button.

Written by Norbert Grey, CFO of Architelos

Follow CircleID on Twitter

More under: ICANN, Top-Level Domains

Categories: Net coverage