CircleID posts

Latest posts on CircleID

Could You Go for a Year Without Internet Access? Paul Miller Reports on His Experiment…

Fri, 2013-05-03 23:29

Could you sign off of the Internet today — right now, in fact — and not come back online for 12 months? If you are a reader of CircleID, odds are pretty good that the answer is an emphatic "No!" This is, after all, a site for "Internet Infrastructure" and for most of us visiting the site (or writing here) the "Internet" is completely woven into the fabric of our lives… and we have a hard time imagining life without it.

Paul Miller did, though. Paul, a writer at The Verge, signed off on April 30, 2012, and just rejoined the rest of us this week… and, being a writer, naturally wrote a long piece about it: "I'm still here: back online after a year without the internet."

As he says in the article and the accompanying video, he was thinking of an escape:

I thought the internet might be an unnatural state for us humans, or at least for me. Maybe I was too ADD to handle it, or too impulsive to restrain my usage. I'd used the internet constantly since I was twelve, and as my livelihood since I was fourteen. I'd gone from paperboy, to web designer, to technology writer in under a decade. I didn't know myself apart from a sense of ubiquitous connection and endless information. I wondered what else there was to life. "Real life," perhaps, was waiting for me on the other side of the web browser.

I won't quote much more of the article because I think it's worth a read in its entirety. I did, though, find this an interesting quote:

My plan was to leave the internet and therefore find the "real" Paul and get in touch with the "real" world, but the real Paul and the real world are already inextricably linked to the internet. Not to say that my life wasn't different without the internet, just that it wasn't real life.

and this:

But the internet isn't an individual pursuit, it's something we do with each other. The internet is where people are.

I think of my own life, and the connections that I have, and the connectedness I have with so many people and with so many different facets of my life. Sure, I could go without the Internet for a year… but would I want to?

Written by Dan York, Author and Speaker on Internet technologies

Follow CircleID on Twitter

More under: Web

Categories: Net coverage

Tom Wheeler - New FCC Chairman

Fri, 2013-05-03 19:13

After a political and administrative process of more than a month, Tom Wheeler has finally been nominated by President Obama as the new chairman of the FCC, with the full support of Congress. Unlike other regulators around the world, the FCC is directly accountable to the American Congress, making it a far more political body than most other regulators.

I have known Tom since 1983. He is an enormously energetic person and has been involved in the ICT industry for most of his working life, holding very senior positions within the American industry.

Currently he is the managing director at the Washington DC venture capital firm Core Capital Partners. Earlier in his career he served as president of the National Cable Television Association (NCTA) from 1979 to 1984, and as CEO of the mobile carrier trade group CTIA from 1992 to 2004.

During all those years we have remained in touch, and this connection was further strengthened when Barack Obama was elected President in 2008. As long as I have known Tom he has played a very active role in the Democratic Party, and on one occasion I was invited to attend one of their events, which was quite an experience.

After the Obama win Tom became part of the Transition Team, overseeing the broad scale of technology, science and media. Before the election I had already discussed with Tom the idea that, if Obama were to win, I would be interested in sharing my views on telecoms with him. He took me up on that and put me in contact with Professor Susan Crawford who became the President's advisor on telecommunications. Together with an elite group of telecoms experts from America and Europe we produced several reports on telecoms infrastructure, structural separation, digital innovation and productivity.

There was also great interest in America in the developments around the Australian NBN and in 2009 I was invited to do a presentation on my views on this at a meeting in the White House. And our reports were used by the people within the FCC who wrote the American National Broadband Plan in 2010. It is interesting to see that many of the suggestions we made appeared in their plan.

The fact that Tom was part of the Transition Team, and the fact that he has shown great interest in different approaches to telecommunications, gives me a positive feeling about his appointment. Obviously an appointment like this is eliciting very strong comment in the USA — there are some who don't like the fact that Tom has such close links with the industry, while others see that as an advantage.

It is obvious that America is America, and that the political situation and the attitude to private and government investment are rather different from those in Europe and Australia. There will not be an NBN along the lines of the one developed in Australia, not even the toned-down version proposed by the Coalition.

As an American, Tom is also a very strong proponent of reduced government involvement and of commercial investment. While I do not always agree with his views on telecoms issues I have always been able to have very open discussions with him. My views are sometimes slightly more radical than his, but I have learned that the American way of thinking is indeed different, and I can understand and respect that.

Tom's involvement in the mobile industry also gave him insight into spectrum issues, currently a hot topic in America. In the past he has challenged the broadcasters to become more active in the digital media and more innovative in using their spectrum for, among other things, mobile TV. So we can expect some fireworks there.

Of course, the really big issue in telecoms in the USA, as elsewhere, is the dominance of the vested interests and, particularly in America, their enormous influence in government policies (plutocracy). It will be interesting to see how Tom will handle these tricky issues. He will need all his diplomatic and negotiation skills to navigate a straightforward course through them.

I would like to take this opportunity to wish Tom wisdom and success in his new role.

Written by Paul Budde, Managing Director of Paul Budde Communication


More under: Broadband, Policy & Regulation, Telecom


New Registry Agreement, All Good?

Fri, 2013-05-03 18:09

In the run-up to the launch of new gTLDs, ICANN has been negotiating both of its main supplier contracts. The registrar contract (Registrar Accreditation Agreement or RAA) negotiations are now all but complete. A new contract draft has been posted for public comment and it now seems likely that in little over a month, this will become the official new 2013 RAA.

The registry contract (Registry Agreement or RA) negotiations have been going on for much less time and really only picked up in earnest after several registries made outspoken, sometimes angry, comments at the way they felt ICANN was handling the negotiations.

Subsequently, a registry negotiating team was set up to work with ICANN in a similar fashion to the registrars (who have been locked in negotiations with ICANN for almost two years now). For ICANN and new gTLD applicants, time is of the essence, as the program obviously cannot launch without proper contracts in place to cover the whole domain name registration, management and distribution chain.

This impacts registries as well of course, as many of them are either applicants themselves, or working for applicants.

On April 29, ICANN's VP for DNS Industry Engagement Cyrus Namazi posted an upbeat report on the negotiations on the ICANN blog. "I am delighted to report that we have now posted a proposed final draft of the New gTLD Registry Agreement," Namazi wrote. "Similar to the proposed 2013 Registrar Accreditation Agreement (RAA) that was posted for public comment on 22 April 2013, the ICANN community is now able to review and comment on this final draft before it is approved and adopted."

Namazi's comments are clearly drafted to get the message across that all is well and that the registries and ICANN left the negotiating room as BFFs. "A new and highly spirited sense of mutual trust has catapulted us into a fresh atmosphere of collaboration," he added. "The spirit of teamwork, productive dialogue and partnership that has underpinned this negotiation process is tremendously heartwarming, as it has allowed us to bring to fruition a robust contractual framework for the New gTLD Program."

Really? In a letter sent to ICANN, senior managers at Verisign, the most powerful registry by market share, are extremely critical of the way ICANN has handled the negotiations and of the end result.

Issues appear to center around a clause which would give the ICANN Board a unilateral right to amend the contract. This has been strongly criticized by both registries and registrars, and Verisign is not happy with what it sees as a tool to allow ICANN to change the rules of engagement for its contracted parties at will.

The letter is as strongly worded as Namazi's post is lovey-dovey. So who is right? The proposed new RA was posted for public comment on April 29 for 42 days. Comments will then be collated and summarised for the ICANN Board, so that it can decide whether or not to approve the contract.

This is a major test for today's ICANN. On the one hand, it needs to show that it can control its supplier chain and provide Internet users with a safe and stable environment. But it also needs to show that it can provide the businesses in the domain industry with such an environment, especially with an expected 1,200 new TLDs coming online in the next few years. And lastly, ICANN needs to show that the bottom-up policy development process that gives it its unique position in the world of Internet governance is sacrosanct. Right now, the registries seem to think that ICANN is ready to throw the model under the bus whenever it suits its own purposes.

Written by Stéphane Van Gelder, Chairman, STEPHANE VAN GELDER CONSULTING


More under: Registry Services, ICANN, Policy & Regulation, Top-Level Domains


Noncommercial Users Ask ICANN Board to Review Decision to Expand Trademark Rights in New Domains

Thu, 2013-05-02 07:38

ICANN's Non-Commercial Stakeholders Group (NCSG) has filed a Request for Reconsideration with ICANN's Board of Directors regarding the staff's decision to expand the scope of the trademark claims service beyond that provided by community consensus policy and in contradiction to ICANN Bylaws.

Specifically at issue is ICANN staff's unilateral decision to adopt the "trademark +50" proposal for new domains, which would provide trademark holders who have previously won a UDRP or court decision with rights to 50 additional derivations of their trademark in ICANN's Trademark Clearinghouse (TMCH). Under staff's plan, large trademark holders that register in the clearinghouse will be provided thousands of derivations of their trademarks since each separate country's registration of the same trademark provides the brand owner with an additional 50 entries in the TMCH.1 Entries in the TMCH trigger infringement warning notices to domain name registrants which can lead to increased liability for registrants, discourage lawful registrations, and chill speech on the Internet.
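The scale involved is easy to see with a back-of-the-envelope calculation. The sketch below is purely illustrative: the 40-country registration count is invented, and the only rule assumed is the 50-derivations-per-registration scheme described above.

```python
# Hypothetical sketch of the scale of the "trademark +50" proposal:
# each separate national registration of the same mark contributes its
# own exact-match entry plus 50 derivations in the Clearinghouse.
DERIVATIONS_PER_REGISTRATION = 50

def tmch_entries(national_registrations: int) -> int:
    # One exact-match entry plus 50 derivations per national registration.
    return national_registrations * (1 + DERIVATIONS_PER_REGISTRATION)

# A mark registered in 40 countries would already yield thousands of entries.
print(tmch_entries(40))  # 2040
```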

ICANN's bottom-up community-developed process for creating policy had approved of a TMCH model that allowed "exact matches" of trademarks only to be placed in the TMCH. In 2007, ICANN's GNSO Policy Council, including representatives from the Intellectual Property and Business Constituencies, approved the GNSO recommendations that created special protections for trademark rights by a supermajority vote.2 As part of the multi-year consensus process, both the subsequent Special Trademarks Implementation (STI) Team and the Implementation Review Team (IRT) considered the issue of providing rights to exact matches or additional derivations, and both community-developed teams specifically opted for exact matches only to be placed into the TMCH. ICANN's CEO testified before U.S. Congress in 2012 that expanding the scope of the TMCH further would be inappropriate since it would create new rights that do not exist in law and ICANN should not be creating unprecedented rights.3

Many months after the final TMCH model of exact matches only was published in ICANN's Applicant Guidebook and new domain businesses relied on it when filing their applications, ICANN's Intellectual Property and Business Constituencies lobbied ICANN's new CEO to make drastic changes to the community-developed policy and grant additional trademark rights in the TMCH.

After the October 2012 Toronto ICANN Meeting, a "strawman solution" was proposed by ICANN's new CEO which included a number of IPC/BC's substantive policy proposals to give trademark holders additional privileges in the domain name system, including changing the exact matches only standard approved of by the community.

Yet ICANN's CEO recognized that expanding the scope of the trademark claims service was a policy matter requiring GNSO Council guidance, as he stated on his blog4 in December 2012; and the CEO did write to the GNSO Council to request guidance on this policy proposal. Under ICANN's Bylaws, staff may not change GNSO-approved policy, except under a strict process that involves consulting with the GNSO and a 2/3 vote of the Board of Directors.
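The voting thresholds at stake can be sketched in a few lines. This is an illustration of the rule as described here, not ICANN's actual procedure in code, and the 16-member Board size in the example is hypothetical:

```python
# Illustrative sketch of the Bylaws thresholds described above: a
# Supermajority GNSO recommendation stands unless more than two-thirds
# of the Board votes against it; for a non-Supermajority recommendation,
# a simple Board majority suffices to reject.
def board_may_reject(gnso_supermajority: bool,
                     votes_against: int, board_size: int) -> bool:
    if gnso_supermajority:
        return votes_against > 2 * board_size / 3
    return votes_against > board_size / 2

# On a hypothetical 16-member Board, 11 votes (not 10) clear the 2/3 bar.
print(board_may_reject(True, 10, 16))  # False
print(board_may_reject(True, 11, 16))  # True
```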

NCSG filed comments on the proposed policy changes and warned against re-opening previously closed consensus agreements and circumventing ICANN's stated bottom-up policy development process.5 In addition to the flawed process for adopting this policy, NCSG also detailed substantive concerns with staff's proposal to expand trademark rights beyond anything that exists in trademark law. It came as no surprise that only members of the IPC and BC supported the strawman proposals in ICANN's comment period.6

In the GNSO Council's February 2013 response to the CEO regarding the proposal to expand the scope of trademark claims, the GNSO Chair wrote, "the majority of the council feels that proposal is best addressed as a policy concern, where the interest of all stakeholders can be considered."7 Thus the GNSO Council also determined this specific proposal to be a policy matter, requiring consultation with the entire community before such a change could be made to existing GNSO Council approved policy.

Yet with only an email sent on 20 March 2013, ICANN staff announced in an attached memorandum that it would expand the scope of the trademark claims service to give trademark holders rights to 50 additional derivations of their trademark, in contradiction to GNSO developed policy of exact matches only and the subsequent requested GNSO Council guidance on the matter.8

Staff's only explanation for such a drastic shift in the creation of new rights: "this proposal appears to be a reasonable add on to an existing service, rather than a proposed new service". Thus with a single line of evasive text, years of hard-fought community consensus policy was brushed under the rug and the new era of policy development via ICANN staff edict was solidified.

On 19 April 2013 NCSG filed this Request for Reconsideration of the staff decision because ICANN did not follow its stated process for changing GNSO-approved policy. If ICANN wants to deviate from Supermajority GNSO-approved policy, it must follow the process outlined in the organization's Bylaws, Annex A Section 9.9 As an organization that holds itself out as a champion of the bottom-up policy development process, ICANN is obligated to comply with community-developed policies, unless the Board of Directors can muster the necessary two-thirds vote to overturn the community decision. That mandatory process was not followed by ICANN's staff or Board in overturning the community-approved policy in favor of staff's policy to expand the scope of the TMCH.

ICANN's Board Governance Committee has thirty days in which to make a recommendation to ICANN's Board of Directors regarding the NCSG's Request for Reconsideration, or to report to the Board on why no final recommendation is available and provide a timeframe for making one. ICANN's entire Board should consider the recommendation of the Board Governance Committee at its next regularly-scheduled Board meeting.

Under Article IV Section 2 of ICANN's Bylaws, the Request for Reconsideration process is a mechanism intended to reinforce ICANN's accountability to the community for operating in a manner consistent with its Bylaws.10 Because the staff's unilateral decision to change GNSO-approved policy was not consistent with ICANN's Bylaws and contradicted ICANN stated policy, NCSG filed the Request to correct the error and bring ICANN into compliance with its Bylaws and stated policies.

NCSG requests that the Board reinstate the community-developed policy of giving trademark holders rights to include exact matches of their trademark only in the TMCH, which was the policy stated in ICANN's Applicant Guidebook when ICANN accepted applications for new domains.

NCSG's Request for Reconsideration (PDF)
Attachments to NCSG's Request for Reconsideration (PDF)
ICANN Website on Requests for Reconsideration

1 http://domainincite.com/...

2 http://gnso.icann.org/en/issues/new-gtlds/...

3 http://www.internetcommerce.org/ICANN_Amnesia

4 http://blog.icann.org/2012/11/a-follow-up-to-our-trademark-clearinghouse-meetings/

5 http://ipjustice.org/wp/2013/01/14/...

6 http://forum.icann.org/lists/tmch-strawman/msg00096.html / See also:
Comments of Registrar Stakeholder Group
Comments from New TLD Applicant Group
Comments of Non-Commercial Stakeholder Group
Comments of the Internet Service Provider Constituency
Comments of Public Interest Registry

7 http://gnso.icann.org/bitcache/...

8 http://newgtlds.icann.org/en/about/trademark-clearinghouse/...

9 http://www.icann.org/en/about/governance/bylaws#AnnexA

GNSO Policy Development Process

Section 9. Board Approval Processes. a. Any PDP Recommendations approved by a GNSO Supermajority Vote shall be adopted by the Board unless, by a vote of more than two-thirds (2/3) of the Board, the Board determines that such policy is not in the best interests of the ICANN community or ICANN. If the GNSO Council recommendation was approved by less than a GNSO Supermajority Vote, a majority vote of the Board will be sufficient to determine that such policy is not in the best interests of the ICANN community or ICANN.

b. In the event that the Board determines, in accordance with paragraph a above, that the policy recommended by a GNSO Supermajority Vote or less than a GNSO Supermajority vote is not in the best interests of the ICANN community or ICANN (the Corporation), the Board shall (i) articulate the reasons for its determination in a report to the Council (the "Board Statement"); and (ii) submit the Board Statement to the Council.

c. The Council shall review the Board Statement for discussion with the Board as soon as feasible after the Council's receipt of the Board Statement. The Board shall determine the method (e.g., by teleconference, e-mail, or otherwise) by which the Council and Board will discuss the Board Statement.

d. At the conclusion of the Council and Board discussions, the Council shall meet to affirm or modify its recommendation, and communicate that conclusion (the "Supplemental Recommendation") to the Board, including an explanation for the then-current recommendation. In the event that the Council is able to reach a GNSO Supermajority Vote on the Supplemental Recommendation, the Board shall adopt the recommendation unless more than two-thirds (2/3) of the Board determines that such policy is not in the interests of the ICANN community or ICANN. For any Supplemental Recommendation approved by less than a GNSO Supermajority Vote, a majority vote of the Board shall be sufficient to determine that the policy in the Supplemental Recommendation is not in the best interest of the ICANN community or ICANN.

10 http://www.icann.org/en/about/governance/bylaws#IV

Written by Robin Gross, Founder and Executive Director of IP Justice


More under: Domain Names, Registry Services, ICANN, Internet Governance, Policy & Regulation, Top-Level Domains


Reframing the Infrastructure Debate

Wed, 2013-05-01 23:02

Fast and reliable infrastructure of any kind is good for business. That it's debatable for the Internet shows we still don't understand what the Internet is — or how, compared to what it costs to build and maintain other forms of infrastructure, it's damned cheap, with economic and social leverage in the extreme.

Here's a thought exercise… Imagine no Internet: no data on phones, no ethernet or wi-fi connections at home — or anywhere. No email, no Google, no Facebook, no Amazon, no Skype.

That's what we would have if designing the Internet had been left up to phone and cable companies, and not to geeks whose names most people don't know. What those geeks came up with was something no business or government would ever contemplate: a base infrastructure of protocols that nobody owns, everybody can use and anybody can improve. For all three of those reasons the Internet supports positive economic externalities beyond calculation.

The only reason we have the carriers in the Net's picture is that we needed their wires. They got into the Internet service business only because demand for Internet access was huge, and they couldn't avoid it. Yet, because we still rely on their wires, and we get billed for their services every month, we think and talk inside their conceptual boxes.

Remember that the telco and cableco business models are based on routing everything through billable checkpoints. Is this what we want for the rest of the Net's future?

We have to remember that the Internet isn't just a service. It's the platform for everything we connect. And the number of things we will connect over the next few years will rise to the trillions.

To understand how the Internet ought to grow, try this: cities are networks, and networks are cities.† Every business, every person, every government agency and employee, every institution, is a node in a network whose value increases as a high multiple of all the opportunities there are for those nodes to connect — and to do anything. This is why every city should care about pure connectivity, and not just about billable phone and cable company services.
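The arithmetic behind that claim is the familiar pairwise-connection count: link opportunities grow quadratically with the number of nodes. A minimal illustration:

```python
# Minimal illustration of why connectivity value scales so steeply:
# the number of possible node-to-node links grows quadratically.
def pairwise_links(nodes: int) -> int:
    # Each of n nodes can link to n-1 others; halve for unordered pairs.
    return nodes * (nodes - 1) // 2

print(pairwise_links(10))    # 45
print(pairwise_links(1000))  # 499500
```

A hundredfold increase in connected nodes yields roughly a ten-thousandfold increase in potential connections, which is why pure connectivity, rather than any single billable service, carries the economic leverage.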

We should be building a network infrastructure that is as neutral to purpose as water, electricity, roads and sewage treatment — and that anybody, including ordinary citizens, can improve. We can't do that if we're wearing blinders supplied by AT&T, Comcast, Time Warner and Verizon.

† I came to the realization that networks are cities, and vice versa, via Geoffrey West — first in Jonah Lehrer's "A Physicist Solves The City," in the New York Times, and then in West's TED talk, "The Surprising Math of Cities and Corporations." West is the physicist in Lehrer's piece. Both are highly recommended.

Written by Doc Searls, Author


More under: Access Providers, Broadband, Telecom, Web


US Smart Grid Networks Exploiting Infrastructure to Provide Wireless Broadband

Wed, 2013-05-01 19:20

The USDA Rural Development's Rural Utilities Service (RUS) has now spent the $250 million committed for smart grid technologies. To this has been added an additional $201 million in funding approved by the Agriculture Secretary to electricity utilities in eight states to install smart grid technologies and improve their generation and transmission facilities. The beneficiaries are spread among a large number of states.

This investment is helping smart grids to become the norm across the country. A side benefit is that utilities are also developing their smart grids for telecoms services over and above the capacity used by meters to send data to network controllers.

As an example, earlier this year the utility serving Santa Clara began using its smart grid technology and infrastructure to deliver free citywide outdoor WiFi. While meters send data via an existing wireless network, a separate channel is used to provide outdoor internet access. The WiFi network is growing in scope and reach as more premises are equipped with smart meters.

The potential for expanding WiFi coverage is huge. There are about 120 municipalities with citywide WiFi networks accessible to the general public. In addition, there are about 60 cities with citywide or near citywide coverage though these networks are now limited to government applications, such as public safety. There are also about 80 or more cities with large outdoor WiFi areas, mostly located in parks and downtown zones.

A hindrance to cities aiming to develop comprehensive WiFi networks has come from the powerful telecoms industry, which employs its lobbying clout to push for laws blocking or preventing municipalities from offering WiFi or fixed broadband services.

The use of smart meters to provide WiFi using existing (and expanding) infrastructure presents a separate challenge, since the telcos would have to battle utilities rather than municipal governments.

Written by Henry Lancaster, Senior Analyst at Paul Budde Communication


More under: Access Providers, Broadband, Telecom


Will the Trademark Clearinghouse Fulfill its Potential?

Wed, 2013-05-01 17:40

ICANN created the Trademark Clearinghouse (TMCH) as a way to streamline the repetitive process forced on trademark owners during the launch of new top-level domains. With the expected tsunami of hundreds of new TLDs starting later this year, the TMCH should generate a clear benefit for trademark owners who elect to participate in Sunrise and Claims Periods.

The side effect of introducing new TLDs is that the legacy TLDs will be making changes to make sure they are competitive against the new TLDs. This means they will be relaxing restrictions and opening up unused namespaces at the second and third-levels. Many of these will follow a Sunrise or Grandfathering process as a way to implement the changes.

Already three existing TLDs (one sTLD and two ccTLDs) have announced such policy changes and decided they would like to utilize the TMCH Sunrise tokens for their Sunrise Periods. These are .Jobs, Radio.AM and Radio.FM. Donuts, the largest applicant with over 300 TLD applications, has also indicated it will use the Sunrise token from the TMCH for a universal blocking service called the Domain Protected Marks List (DPML).

All this is happening before the TMCH has even supported its first new TLD. While ICANN has welcomed the use of TMCH by .Jobs, it remains to be seen if ICANN will also welcome use of the TMCH by ccTLDs.

The eventual benefits and viability of the TMCH will hinge on a few factors:

• Will trademark owners even use it?
• Will the main driver be participation in Sunrise or Claims?
• Will other existing TLDs want to use it?

Will Trademark Owners Even Use it?

It is a given that trying to participate in every future Sunrise Period would overwhelm the budgets of nearly every trademark owner. Every sage legal advisor is counseling that the trademark owner must be ultra-selective about which Sunrise Periods they engage in.

On the other hand, a review of the Trademark Agents published on the TMCH website shows that a good number of law firms have already advanced the TMCH the minimum $15,000 required to become an Agent. If this trend continues, it is a clear indicator that law firms will aggressively market the TMCH to their clients. (Disclosure: My firm, TM.Biz, is offering a portal for these Trademark Agents.)

Will the Main Driver Be Participation in Sunrise or Claims?

Trademark Claims provides some protection in every new TLD. But it is for exact matches only, and only for the first 90 days. This forces trademark owners to also subscribe to a watching service that catches confusingly similar registrations missed by the Claims service. I predict trademark owners will elect to do both Claims and watching to ensure they catch domains that might confuse their customers.
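The gap between the two services can be sketched in a few lines. The similarity scoring below is a stand-in: difflib's ratio and the 0.8 threshold are illustrative assumptions, not how any real watching service scores labels.

```python
import difflib

def claims_notice(label: str, marks: set[str]) -> bool:
    # The Claims service, as described above, fires on exact matches only.
    return label in marks

def watch_hits(label: str, marks: set[str],
               threshold: float = 0.8) -> list[str]:
    # Stand-in for a commercial watching service: flag labels that are
    # confusingly similar to a mark. Scoring here is purely illustrative.
    return [m for m in marks
            if difflib.SequenceMatcher(None, label, m).ratio() >= threshold]

marks = {"example"}
print(claims_notice("example", marks))  # True: exact match triggers a notice
print(claims_notice("examp1e", marks))  # False: Claims misses the near-match
print(watch_hits("examp1e", marks))     # ['example']
```

The third call shows why a watching service remains necessary: a one-character substitution sails past an exact-match check but is exactly the kind of registration that confuses customers.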

Will other existing TLDs want to use it?

There are actually two parts to the TMCH. The validation service is performed by Deloitte and CHIP. They are issuing Sunrise tokens called Signed-Mark-Data (SMD) files to trademark owners as proof that a trademark has satisfied the requirements for the typical Sunrise Period. The Database Administrator for the TMCH is IBM. They actually help Registries and Registrars operate the Sunrise and Trademark Claims Periods. The validation service initially launched on March 26. The database part is expected to launch in July.

But there are applications for just the TMCH Sunrise tokens that do not require IBM's involvement. This is because the SMD file is portable. For example, any country-code TLD that decides to change its policies and wants to conduct a Sunrise Period first could accept SMD files from trademark owners.

Likewise, any TLD that wants to accept SMD files for a new Rights Protection Mechanism, as Donuts is planning, does not need IBM in the process.

The .Jobs Sunrise Period

The .Jobs TLD has decided to eliminate the current restriction that .Jobs domain names must match company names. This means that product and division names will be eligible for .Jobs. Before this change takes effect, .Jobs will first conduct the Sunrise Period that is designed for new TLDs. .Jobs will utilize both parts of the TMCH, and thus needs to wait for IBM, its back-end Registry and its Registrars all to be operational before it can conduct its Sunrise Period.

The Radio Global Domains

The .AM and .FM ccTLDs have long been repurposed for the radio industry. They are now introducing new namespaces, called Radio Global Domains, which are designed to target new market segments within the radio industry. These will be radio.am and radio.fm. Before these changes take place, they will also undergo a Sunrise Period starting May 28. Validation for the Radio Global Domains Sunrise Period will be performed on either trademark data or the Sunrise tokens called Signed-Mark-Data (SMD) files issued by the TMCH. All this is happening without the need for the involvement of IBM, or even for Registrars to support the new protocols required for the new TLD Sunrise Periods. (Disclosure: My firm, TM.Biz, will be handling the trademark validation, SMD validation and direct submission of Sunrise registrations to the Registry.)

It is still an open issue whether the TMCH will be capable of issuing SMD files by May 28 for use by the Radio Global Domains, and, if it is, whether ICANN will allow the TMCH to release the SMD files so that the ccTLDs can use them.

There are no doubt other ccTLDs that are interested in changing their registration rules and restrictions that might consider holding a Sunrise Period first. I predict that these ccTLDs would be interested in using the SMD files as well, if allowed by ICANN.

Additional Rights Protection Mechanisms

The largest TLD applicant, Donuts, is also planning to accept SMD files for its universal blocking service called the Domain Protected Marks List, or DPML. With over 300 TLD applications, half of them uncontested, Donuts' DPML represents a good value for trademark owners. There may be other applicants that decide to offer new Rights Protection Mechanisms that utilize the SMD file.

Hopelessly Optimistic

The Trademark Clearinghouse has enormous potential to support the domain name industry. The portability of the SMD files enables many uses that were not originally envisioned by its creators. Certainly, the days of a TLD manually checking trademark databases should be coming to an end with SMD files becoming the new de facto standard for trademark validation. It will be interesting to see how this evolves over time.

Written by Thomas Barrett, President - EnCirca, Inc

Follow CircleID on Twitter

More under: Domain Names, ICANN, Law, Top-Level Domains

Categories: Net coverage

CERN Celebrates 20 Years of The Free And Open Web

Wed, 2013-05-01 05:06

Of all the many applications and services that run on top of the Internet, arguably none has been more successful than that of the World Wide Web. Invented by Tim Berners-Lee back in 1989 while he was a physicist at CERN, the "Web" has fundamentally changed almost every aspect of our life… and become a part of basically every aspect of our life. Think of a part of your life… and then think of the websites that are part of that. Whether it is social networks, banking, shopping, dating, news, reading, publishing, writing, gaming, sports and now even communicating in real-time… all are aspects that somehow involve the "Web".

Today, April 30, is a special day in the history of the Web because, as recounted on the newly redesigned info.cern.ch (famous as the first website), it was twenty years ago today that CERN published a statement that put the WWW technology into the public domain for all to use. Building on the Internet's long history of openness, CERN stated very clearly that "CERN relinquishes all intellectual property rights to this code, both source and binary form and permission is granted for anyone to use, duplicate, modify and redistribute it".

And thus was born the wider Web ... anyone could download, use and modify the W3 server software and the W3 client and start creating new sites. And people did! By the tens… and hundreds… and on and on… changing and modifying the code to satisfy their own dreams and ideas. Keep in mind, this was before Mosaic and other graphical clients changed the Web again by introducing images along with text. The original Web was one of text. I remember telnetting to info.cern.ch back in the early '90s to see what this "World Wide Web" thing was all about - and pressing numbers to follow links. It was a very different world.

Still, from those early days - and more importantly from the openness of those early days - came everything else about the Web that we use today. Those early adopters didn't need to ask anyone for permission to innovate… they just downloaded the code and started hacking away.

Thank you, CERN, for the reminder of the importance of today - and of the incredible importance of an open Web… riding on top of an open Internet.

P.S. Vint Cerf has a great retrospective out today as well: The open internet and the web

Written by Dan York, Author and Speaker on Internet technologies


More under: Web

Categories: Net coverage

CERN Recreating First Webpage to Commemorate 20th Anniversary

Tue, 2013-04-30 20:50

A team at the European Organisation for Nuclear Research (CERN) has launched a project to re-create the first web page. The aim is to preserve the original hardware and software associated with the birth of the web. The initiative coincides with the 20th anniversary of the research centre giving the web to the world.

Read full story: BBC


More under: Web

Categories: Net coverage

Announcing the Final Terms of the First Applicant Auction for Contested gTLDs

Tue, 2013-04-30 02:25

We received several emails and phone calls with thoughtful comments on the proposed plan for the first Applicant Auction and have made several small changes to the plan. The final terms will be sent to applicants who requested the RFC, and can also be requested on our website.

Here is a quick summary of the changes:

Auction prices:
Several bidders wanted a bit more certainty about how prices will be set. In response, we commit to setting the first round prices exactly as proposed (min price is 0, max price is $50,000 * number of bidders). Further, if all bidders are still bidding in the second round, then second round prices will be exactly as proposed (start price + 10% * number of bidders).
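The stated price schedule can be sketched in a few lines. This is a simplified model: the function names, and our reading of the 10% increment as 10% of the start price per bidder, are assumptions for illustration, not the auctioneer's actual code.

```python
def first_round_prices(num_bidders):
    """First-round price range as stated: minimum 0, maximum $50,000
    per participating bidder."""
    return 0, 50_000 * num_bidders

def second_round_start(start_price, num_bidders):
    """Second-round price if all bidders are still bidding: the start
    price plus a 10%-of-start-price increment per bidder (our reading
    of "start price + 10% * number of bidders")."""
    return start_price + 0.10 * start_price * num_bidders
```

For example, with four bidders the first-round maximum price would be $200,000.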

Information transparency:
We received balanced comments on the new information policy, which suggests that we found a decent compromise. In particular, most applicants accepted that preserving winning bidders' privacy by not disclosing exact winning prices was a worthwhile goal. To make this provision meaningful, we will comply with one applicant's request to contractually obligate applicants to keep the winning price confidential.

Timeline:
Finally, applicants felt that a 2-day auction was too short and that more time should be allotted to ensure that all bidders have time to familiarize themselves with the system and the process, and to think through how to bid. To accommodate this, we will make two changes:

a) As originally planned, we will offer a voluntary mock auction 3 days before the auction, scheduled for Thursday, May 23rd. This will be run exactly like the real auction, except that the results have no meaning and the schedule will be heavily accelerated. We encourage all bidders to participate; the mock auction is a good test to make sure that you have the right login credentials and know how to place bids.

b) We will plan for the auction to take 3 or 4 days. The first auction round will be on May 28th and will last for 24 hours, as before. For subsequent rounds, we will do our best to set a schedule that reflects actual bidders' time zones. Rounds will initially last 2 hours, but if bidding activity indicates that less time is needed, we may shorten them. In no case will rounds be shorter than 30 minutes.

Deposits:
We will not allow bidders to increase their deposit during this auction, because the auction will be relatively short (4 days) and we would like to keep the first auction simple. We are open to changing this for future auctions.

A small change to mitigate order-of-magnitude errors:
Finally, a small addition to the terms: in any round after Round 1, bidders may bid up to 9 times the Start Price of the round or the Minimum Price to Continue of the round, whichever is higher. This helps protect bidders from accidentally adding an extra "0" when typing in their bid.
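The 9× cap amounts to a simple bid-validation rule, sketched below. The function names and the ValueError are our own illustration of how such a check might look, not the auction platform's implementation.

```python
def bid_cap(start_price, min_price_to_continue):
    """After Round 1, a bid may be at most 9 times the higher of the
    round's Start Price and Minimum Price to Continue."""
    return 9 * max(start_price, min_price_to_continue)

def validate_bid(bid, start_price, min_price_to_continue):
    """Reject bids over the cap, catching an accidental extra zero."""
    cap = bid_cap(start_price, min_price_to_continue)
    if bid > cap:
        raise ValueError(f"bid {bid} exceeds the round cap of {cap}")
    return bid
```

A $50,000 bid in a round with a $12,000 Minimum Price to Continue passes; the same bid with an accidental extra zero ($500,000) is rejected.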

We hope that these rules will be acceptable to all interested bidders and maximize participation. Any bidders who do not find these terms workable for them are invited to comment and participate in one of our future auctions. Those interested in participating in the first auction and receiving legal documentation and login credentials for the mock auction should register their interest on our website.

Written by Sheel Mohnot, Consultant


More under: ICANN, Top-Level Domains

Categories: Net coverage

Can't Sell Your IPv4 Numbers? Try Leasing Them

Mon, 2013-04-29 23:40

In a "policy implementation and experience report” presented at ARIN 31 in Barbados, ARIN's staff noted that they are seeing "circumstances" related to the leasing of IPv4 number blocks. At the recent INET in Denver, ARIN's Director John Curran alleged that there is a "correlation" between address leasing activity and organizations that have been unable to complete specified transfers through the ARIN process, which requires needs-based justification.

The issue of leasing — or rather sub-leasing, because ARIN is already leasing the addresses to its members — is yet another symptom of the growing scarcity of IPv4 addresses. Subleasing is interesting, however, as another example of the way RIR's bureaucratic control of transactions between willing sellers and buyers can lead to workarounds that make the Whois directory less accurate.

It's unclear exactly how ARIN is aware of this nominally private activity. Perhaps someone involved is tipping ARIN off, or maybe its staff is observing instances where the ASN information associated with a routed block is changing while the contact information in the ARIN Whois directory remains the same. In either case, a greater degree of transparency about refused transfers and the basis for ARIN's determination would be welcome. On a related note, we sought to shed some light on the emerging transfer market in a paper last year.
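The second possibility can be expressed as a simple heuristic over registry-side data: flag blocks whose announced origin ASN changes while the Whois contact stays constant. The sketch below uses made-up snapshot records; the data layout and the heuristic are purely our own illustration, not anything ARIN has described.

```python
def flag_possible_subleases(snapshots):
    """Flag address blocks whose routing origin ASN has changed over
    time while the Whois contact organisation stayed the same -- one
    possible (purely illustrative) signal of a sub-leased block."""
    flagged = []
    for block, history in snapshots.items():
        origin_asns = {asn for asn, _ in history}
        whois_orgs = {org for _, org in history}
        if len(origin_asns) > 1 and len(whois_orgs) == 1:
            flagged.append(block)
    return flagged

# Hypothetical snapshots: (origin ASN, Whois org) observed per block.
snapshots = {
    "192.0.2.0/24": [("AS64500", "ExampleCo"), ("AS64511", "ExampleCo")],
    "198.51.100.0/24": [("AS64501", "OldOrg"), ("AS64502", "NewOrg")],
}
```

Here only 192.0.2.0/24 would be flagged: its announced origin changed while its registered contact did not.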

What is troubling, for ARIN at least, is that the subleasing of addresses is taking place outside of the RIR address governance regime. It is understandable that ARIN would react to something that might undermine its control over address space. Part of ARIN's power stems from its ability to identify who is allocated or assigned what address block(s) via its Whois Directory Service. Practically, the Whois has also been used to identify the party actually routing an address block, although technically this is a distinct activity over which ARIN claims no control.

From an operational perspective, if the organization actually routing an address block is unable to be contacted this could be detrimental to administrators attempting to resolve networking issues, and to parties seeking to use the Whois for law enforcement or related policy matters. However, at this point it is unclear if lessees are actually unreachable. In fact, one could argue that lessors are in a better position to keep accurate lessee contact records than the address registry — they are invoicing their lessees, we assume! Whether, and under what conditions, they would release contact information is basically unknown at this point.

For now, ARIN does not seem to be too alarmed. It suggests three potential policy solutions:

  1. Decide this is not an issue for ARIN to deal with
  2. Create new policy requiring that the actual party using the addresses be listed as an operational contact in Whois
  3. Create new policy that would prevent leasing of address space without needs-based justification.

Again, absent any data on leasing, it is hard to say which way ARIN or its membership might go, although the third option seems increasingly unlikely as ARIN moves closer to IPv4 exhaustion and the RIPE region contemplates eliminating needs-based justification entirely.

It may just turn out that private subleasing transforms the address transfer market. As Addrex's Charles Lee pointed out at INET in Denver, all kinds of parties lease assets (including ARIN leasing addresses to its own customers). It serves a useful business purpose and is not a bad thing per se. The entry of large subleasing companies without any Internet operations, Lee noted, might transform the address market. It could create entirely new ways of allocating addresses and provisioning post allocation services. It might lead to innovative product offerings such as providing means to mitigate the technological obsolescence of IPv4. We just don't know. What we know for sure is that it will create governance dilemmas.

Written by Brenden Kuerbis, Fellow in Internet Security Governance, Citizen Lab, Univ of Toronto


More under: IP Addressing

Categories: Net coverage

Typosquatting Claims Against Security Researcher Are Legally Complicated - Gioconda v. Kenzie

Mon, 2013-04-29 22:35

Kenzie is a security researcher who has registered numerous domain names that are typographic errors of well-known trademarks (e.g., rnastercard, rncdonalds, nevvscorp, rncafee, macvvorld, rnonster, pcvvorld). He points the domain names to the actual sites in question (e.g., rncdonalds points to mcdonalds.com), but he is looking to demonstrate how these typo domains are used for "social engineering" attacks.
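The substitutions involved ("rn" for "m", "vv" for "w") are simple character homoglyphs, and candidate typo domains can be generated mechanically. A minimal sketch follows; the function and the substitution table are ours, for illustration only, and cover just the pairs seen above.

```python
# Homoglyph pairs from the examples above: "rn" resembles "m" and
# "vv" resembles "w" in many fonts.
SUBSTITUTIONS = [("m", "rn"), ("w", "vv")]

def homoglyph_variants(domain):
    """Return single-substitution lookalikes of a domain name."""
    variants = set()
    for original, lookalike in SUBSTITUTIONS:
        start = 0
        # Replace each occurrence in turn, one substitution per variant.
        while (i := domain.find(original, start)) != -1:
            variants.add(domain[:i] + lookalike + domain[i + len(original):])
            start = i + 1
    return sorted(variants)
```

Running this on mastercard.com yields both rnastercard.com and the mastercard.corn lookalike, since the ".com" suffix itself contains an "m".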

Kenzie did not offer the domain names for sale, did not read the emails intended for the subject organization, and generally kept his whole scheme out of the public eye. Upon demand, he also offered to transfer the domain names to the organizations in question.

Nevertheless he was sued by Gioconda Law Group for registering Giocondolaw.com — with "o" instead of "a" [see: Gioconda Law Group v. Kenzie, 2012 US Dist LEXIS 187801 (S.D.N.Y. Apr. 23, 2013)]. In response to Gioconda's complaint, Kenzie, proceeding pro se, asserted a variety of defenses, including a critique of American privacy law. Gioconda moved for judgment on the pleadings.

The court struggles with the application of the Anticybersquatting Consumer Protection Act (ACPA) factors to this case. On the one hand, this is clearly not a case where the registrant is trying to profit by selling back the domain name. On the other hand, the court says, all non-commercial uses are not necessarily exempt from the ACPA. [Not a particularly speech friendly position.]

Ultimately, the court says that it's not a case that can be resolved on the pleadings:

Defendant's alleged ideological, scholarly, and personal motives for squatting on the [domain name], while perhaps idiosyncratic, do not fall within the sphere of conduct targeted by the ACPA's bad faith requirement. If anything, given that defendant aims to both influence plaintiff's behavior and shape public understanding of what he perceives to be an important vulnerability in cyber security systems, this case arguably falls closer to cases involving parody and consumer complaint sites designed to draw public attention to various social, political, or economic issues.

It's possible plaintiff can prevail, but it would have to do so under a more fact-specific, totality-of-the-circumstances inquiry.

This is an interesting case that highlights the problems faced by security researchers generally. While the risk of liability here is less than what security researchers generally face (e.g., liability under the Computer Fraud and Abuse Act), it still shows a judge reluctant to grant the researcher's conduct full protection as a non-commercial, First Amendment-protected venture.

Written by Venkat Balasubramani, Tech-Internet Lawyer at Focal PLLC


More under: Cybersquatting, Domain Names, Law, Security

Categories: Net coverage

Arrest Made in Connection to Spamhaus DDoS Case

Mon, 2013-04-29 22:15

According to a press release by the Openbaar Ministerie (the Dutch Public Prosecution Service), a Dutch man with the initials SK has been arrested in Spain for the DDoS attacks on Spamhaus.

Brian Krebs reports: "A 35-year-old Dutchman thought to be responsible for launching what's been called 'the largest publicly announced online attack in the history of the Internet' was arrested in Barcelona on Thursday by Spanish authorities. The man, identified by Dutch prosecutors only as 'SK,' was being held after a European warrant was issued for his arrest in connection with a series of massive online attacks last month against Spamhaus, an anti-spam organization."


More under: Cyberattack, Cybercrime, DDoS, Law, Security, Spam

Categories: Net coverage

Why Most Discussions for Fibre Optic Infrastructure Take Place from the Wrong Perspective

Mon, 2013-04-29 21:44

Fibre-based infrastructure requires vision, and recognition of the fact that many of today's social, economic and sustainability problems can only be solved with the assistance of information and communications technology (ICT). In many situations, the capacity, robustness, security and quality this demands call for fibre optic infrastructure. This need will increase dramatically over the next 5 to 10 years as industries and whole sectors (healthcare, energy, media, retail) transform themselves in order to much better address the challenges ahead.

Most discussions regarding the need for fibre optic infrastructure take place from the wrong perspective — based on how fast people need the internet to be when they download their emails, web information, games and movies. Fibre optic technology has very little to do with this — ultimately all of that 'residential' traffic will account for less than 50% of all the traffic that will eventually flow over fibre optic networks.

The real reason this type of network is needed relates to the social and economic needs of our societies, and there are many clear examples that indicate that we are running out of steam trying to solve some of our fundamental problems in traditional ways.

For instance, at this moment discussions are taking place in every single developed country in the world about the fact that the cost of healthcare is unsustainable. These costs will grow — over the next 20 years — to 40%-50% of total government budgets — clearly impossible. So we face a dilemma. Do we lower the standard of healthcare services, at the same time making them more costly for the end-user?

If we want to maintain our current lifestyle the only solution is to make the healthcare system more effective, efficient and productive. And this can only be done with the help of ICT. To make it more productive, health needs to be brought to the people rather than the other way around, as is the case at present. Similar examples apply to the education system, the energy systems and the management of cities and countries in general. We need to create smart cities, smart businesses and smart countries, with high-speed infrastructure, smart grids, intelligent buildings, etc.

In order to manage our societies and economies better we need to have much better information about what is happening within all of the individual ecosystems, and in particular information about how these different systems interact. Currently they all operate within silos and there is little or no cooperation or coordination between them. ICT can be the bridge to bring them together; to collect data from them and process it in real time. Information can then be fed back to those who are managing the systems, and those who operate within them, such as doctors, teachers, business people, bureaucrats, politicians — and, of course, to you and me.

Some of these data interactions are already happening around smartphones, social media, traffic and crowd control and weather information. This is only the start of what is known as the Internet of Things (IoT) or machine-to-machine communication (M2M).

ICT cannot solve world hunger, but without ICT world hunger cannot be solved, and this applies to all the important social and economic problems that societies around the world are now facing.

None of this can be done overnight; it requires massive transformations of industries and sectors. There is no instant business model available that will supply an immediate return on the investment that is needed to create these smart systems. All of these investments need to be looked at over a period of 10, 20 years and even longer. No private business will take such a business risk. To make it happen government leadership and government policies are needed.

This is also the message from the UN Broadband Commission for Digital Development, and it applies to countries all over the world. More than 120 countries have now developed broadband policies, recognising that such infrastructure is critical to their development. The challenge now is to put these policies into practice, and at a time when government leadership around the world is at an all-time low.

Ultimately all of these developments will require national fibre optic networks. There simply is no other technology that can handle the capacity of data and applications that will be needed to run the cities and countries from today onwards. This infrastructure needs to be robust. It has to have enormous capacity. It needs to be secure and to be able to protect privacy. There is simply no other infrastructure technology that is up to that job.

So those business and government leaders who are in charge of looking towards the future do have an obligation to ask themselves, based on the above, whether we can afford not to have a fibre optic network.

Written by Paul Budde, Managing Director of Paul Budde Communication


More under: Access Providers, Broadband, Telecom

Categories: Net coverage

Join Uniregistry

Mon, 2013-04-29 20:42

What happens when you take a team of experts, at the top of the naming industry, and unite them behind a single, high-minded purpose? You get the most service-based and holistic approach to registry operations that the industry has ever seen: something we call "Uniregistry."

Software Developers – We are looking for full-stack developers who are comfortable working at any level of web development and have the initiative to see a project through from start to finish. Our technology is currently built on top of PHP, MySQL, and JavaScript, but we are looking for anyone who feels at the top of their game developing for web and mobile in any technology. If you fit the bill we will fly you here, interview you confidentially and deliver you a lifetime opportunity to work on things that millions of people a day will use.

Systems Specialist – A successful technology company doesn't exist without a robust and scalable foundation. Do you have what it takes to build infrastructure to handle millions of visitors a day? Then we are looking for you. Candidates should have multiple years' experience managing Linux-based systems and popular open-source databases, as well as a sound understanding of networking and the services that operate over them. Being well versed in systems automation, virtualization, and mass hosting is an asset as well. Big things lie ahead for the fortunate candidate who chooses the red pill.

Front-end Developers – Someone with a keen sense of aesthetics and human behavior who can turn sketches and ideas into web reality. The right individual needs to understand the consequences of their choices in code and execution: not just nice-looking pages, but the ability to turn designs into functioning websites. HTML, CSS, and JavaScript will be your primary tools. Our facility in Cayman is world-class and right across from the beach. Swim to work and shower here. Work with people like you and live tax-free.

Marketing People – Help us find the right programmer/developers in your organization and join them here in Cayman as we grow our existing registry operations business. We will need to promote the new namespaces we're charged with operating. We are going to have all kinds of fun doing that, but first we need to finish the critical infrastructure we've started. We need a great team for that and we want you to be a part of it.

Send your resumé today: careers@uniregistry.com


More under: Domain Names, ICANN, Top-Level Domains

Categories: Net coverage

Wrap-up: ICANN 46 in Beijing

Fri, 2013-04-26 18:24

Earlier this April, the largest ICANN meeting ever — more than 2,500 attendees — kicked off in Beijing. Given the imminent addition of hundreds of "dot Brands" to the Internet, the topic of new gTLDs was at the top of the discussion list for all attendees. So far, well over 100 new gTLD applications have passed the Initial Evaluation stage, meaning they're on their way to becoming live domains.

At the meeting, ICANN's Government Advisory Committee (GAC) released its formal advice on new gTLDs. The GAC made a number of points to the ICANN Board including:

  • A request to further review a specified list of strings and present advice at ICANN 47
  • Six specific contractual safeguards that should be placed on all gTLDs, including WHOIS verification and abuse mitigation
  • Contractual safeguards that should be placed on particular categories of TLDs, including consumer protection, sensitive strings, regulated markets and those with restricted registration policies
  • Urging the ICANN board to reconsider its decision to allow singular and plural versions of the same strings

GAC advice is becoming the single biggest area of uncertainty for new TLD applicants. It not only appears to adjust requirements approved by the community in the Applicant Guidebook, it also is evolving with each new communique.

One reporter noted, "It looks like at least 517 new gTLD applications [may] be affected by the GAC's advice." I'm sure there will be many more discussions about this topic.

Registrar Accreditation Agreement (RAA) and Registry Agreement

ICANN CEO Fadi Chehade announced newly revised versions of both the 2013 Registrar Accreditation Agreement and Registry Agreement, which are now posted for public comment. ICANN is looking at ways to keep the debate over these contracts from delaying the overall application process.

Trademark Clearinghouse

Earlier, in March, the Trademark Clearinghouse (TMCH) opened. TMCH allows brand owners to submit their trademark data into one centralized database, prior to and during the launch of new gTLDs. Since opening, the pace of sign-ups by both individual mark owners and agents has been rapid, ensuring the long-term success of the TMCH project.

With ICANN 47 in Durban, South Africa coming up in mid-July, many of these subjects will continue to be discussed and, hopefully, resolved in the weeks ahead.

Written by Roland LaPlante, Senior Vice President and CMO at Afilias


More under: ICANN, Internet Governance, Top-Level Domains

Categories: Net coverage

Will LTE Steal the Broadband Revolution?

Fri, 2013-04-26 09:12

There is no doubt that LTE is going to take a prime position in broadband developments. With competitively priced services, innovative smartphones and an increasing range of very innovative apps this market is set to continue to boom. So how will all this impact the overall broadband market?

First of all, this is not an 'us or them' issue between fixed and mobile broadband. As a matter of fact, the companies that are rolling out LTE are increasingly dependent on deep fibre rollouts as they need to handle massive amounts of data, to which the mobile infrastructure technology is not well-suited. So the quicker they can offload their mobile traffic onto a fixed network the better. As I've said before, one of the key drivers of fibre deployment will be the growth in mobile broadband.

A similar situation will occur in the home. More and more, people are using their mobile devices rather than PCs and laptops; and more people within the home are using more, and different, mobile devices, which will significantly increase the need for capacity within the home. The reality of mobile broadband is that 60%-80% of smartphone and tablet usage takes place in the home, where these devices are connected to the fixed network through the WiFi modem. People are becoming accustomed to the quality of the LTE network, so they will want a similar quality of service over the fixed network; and over the next 3-5 years the current network will start to run out of steam. And, with at least one-third of all fixed broadband connections already of inferior quality, those households are facing quality problems now.

So, while access to the internet and broadband is moving quickly towards smartphones and tablets as the preferred access devices, at the same time the majority of broadband capacity required through these devices will still need to be provided by the fixed network.

While the capacity of the mobile network is greatly improved by LTE — as well as by the upcoming extra capacity through new spectrum allocation — the physics of mobile technology is such that it will be impossible to handle all the traffic of these mobile devices over the mobile network.

Obviously the mobile operators are not sitting still. They are improving their network infrastructure in order to capture as much of the traffic as possible, and they are increasingly looking at WiFi technologies as another way to off-load traffic and/or add extra access points for users in high-traffic areas such as shopping centres, entertainment venues, transport stations, etc. But again, these WiFi access points need to be connected to the fixed network, and in their case you virtually need fibre to the premises or business for them to be of any use.

So, while LTE will greatly increase the use of broadband and broadband applications, this will at the same time put increased pressure on the fixed network.

On the end-user side of the fixed broadband market — we don't have the same dynamics as in the mobile market. Few, if any, fixed network devices capture the users' attention in the way the new smartphones do. Also, there is a clear lack of exciting fixed broadband applications. Entertainment is largely captured by content providers who want to protect their existing business models, and applications in healthcare, education, energy, etc are going to take a long time to reach maturity and mass market penetration levels. So all attention is clearly on mobile and this is creating a skewed perspective on what is needed overall to ensure that these mobile developments can be used to their full potential.

The developments in mobile and LTE will generally stimulate the need for better fixed networks, but at the same time there will be a significant group of users who — at this point in time — do not have high capacity requirements, and for whom a $30 or $40 monthly mobile connection will cater for all their comms needs. This group will actually lead to stagnation, and even a decline, in fixed broadband connections. We already see this happening in the Hong Kong market. The situation will only be exacerbated if LTE becomes available in areas that have very poor fixed broadband coverage. BuddeComm estimates that up to 25% of users could simply abandon their unsatisfactory fixed broadband connection in favour of LTE. Most will eventually re-connect in 3-5 years' time, but only when important applications are becoming available over the fixed network.

These short-term developments could be interpreted by some who don't have a good understanding of the total picture as an indication that fixed broadband is not needed, and this could potentially undermine the build-out of the fixed broadband networks that are so desperately needed for the longer-term social and economic developments in the country.

If we look at the very latest smartphones (e.g. the Galaxy S4) we see an increase in what are called machine-to-machine (M2M) or Internet of Things (IoT) applications, often linked to location-based services (LBS). Behind the scenes, these applications gather data, often from a variety of sources, and process that information in real time, giving users interesting services relating to healthcare, sporting achievement, calorie intake, weather, transport and traffic information, and so on.

It is these M2M and IoT applications that are finally going to stimulate the sort of killer apps that are needed to drag some of the lagging sectors into the digital age — such as healthcare, education, utilities, government and business, who are at present trying to limit the impact of the digital economy, rather than embracing it. This, in turn, will start stimulating the sort of applications that require the capacity, robustness and security that can only be delivered by fibre optic networks.

All of this will come together in 5 to 10 years' time when the requirements from the mobile-based developments, the rapid growth of M2M applications, and the somewhat slower growth from the requirements following the industry and sector transformations, combined, make the need for a fibre-based infrastructure essential for the economic development and social wellbeing of any developed economy.

What is required from business leaders and politicians is that they recognise this need and start planning for it at the earliest possible opportunity. Doing this on the run is not the ideal way to make infrastructure investments that will have to last for 25-50 years.

Written by Paul Budde, Managing Director of Paul Budde Communication

Follow CircleID on Twitter

More under: Access Providers, Broadband, Mobile, Telecom, Wireless

Categories: Net coverage

Different Focus on Spam Needed

Thu, 2013-04-25 20:26

It is surprisingly difficult to get accurate figures for the amount of spam that is sent globally, yet everyone agrees that the global volume of spam has come down a lot since its peak in late 2008. At the same time, despite some recent small decreases, the catch rates of spam filters remain generally high.

Spam still accounts for a significant majority of all the emails that are sent. A world in which email can be used without spam filters is a distant utopia. Yet, the decline of spam volumes and the continuing success (recent glitches aside) of filters have two important consequences.

The first is that we don't have to fix email. There is a commonly held belief that the existence of spam demonstrates that email (which was initially designed for a much smaller Internet) is somehow 'broken' and that it needs to be replaced by something more robust against spam.

Setting aside the Sisyphean task of replacing a tool used by billions, proposals for a new form of email tend either to set the bar for sending messages so high that many legitimate senders are shut out, or to break significant properties of email (usually the ability to send messages to someone one hasn't had prior contact with).

Still, if spam volumes had continued to grow, we would have had little choice but to introduce a sub-optimal replacement. The decline in spam volumes means we don't have to settle for such a compromise.

Secondly, current levels of spam mean there is little threat of a constant flow of spam causing mail servers to fall over.

At the same time, one would be hard-pressed to find a user whose email is not filtered somewhere — whether by their employer, their provider, or their mail client.

Thus, looking at the spam that is sent isn't particularly interesting as it provides us with little insight into the actual problem. What matters is that small minority of emails that do make it to the user — whether because their spam filter missed it, or because they found it in quarantine and assumed it had been blocked by mistake.
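The point about the small minority of missed spam can be illustrated with some simple arithmetic (the figures below are purely hypothetical, chosen for illustration): even when spam dominates total mail volume, the spam a user actually sees depends almost entirely on the filter's miss rate.

```python
# Illustrative arithmetic with hypothetical figures, not measurements:
# the spam a user actually sees is driven by the small fraction
# that the filter misses, not by the total volume sent.

def inbox_spam_per_day(total_mail, spam_share, catch_rate):
    """Spam messages reaching the inbox, given total mail received,
    the fraction of it that is spam, and the filter's catch rate."""
    spam = total_mail * spam_share
    return spam * (1 - catch_rate)

# A mailbox receiving 200 messages a day, 70% of them spam:
print(round(inbox_spam_per_day(200, 0.70, 0.99), 1))  # 99% catch rate: one or two slip through
print(round(inbox_spam_per_day(200, 0.70, 0.95), 1))  # 95% catch rate: several per day
```

A small drop in catch rate multiplies what the user sees, which is why the missed fraction, rather than the volume sent, is the more interesting quantity.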

Equally important is the question of which legitimate emails are blocked, and why — and what can be done to prevent it from happening again.

It is tempting to look at all the spam received by a spam trap, or by a mail server, and draw conclusions from that. Such figures certainly help paint a picture, but in the end they say as much about what users see as the number of shots on target in a football match says about the final result.

Despite the doom predicted by some a decade ago, email is still with us — and we have won a number of important battles against spam. But if we want to win the war, we need to shift our focus.

Written by Martijn Grooten, Email and web security tester

More under: Email, Spam

Breaking Down Silos Doesn't Come Easy

Wed, 2013-04-24 19:51

"We need to break down silos" is a phrase often heard in national and international meetings on cyber security and fighting cyber crime. So it is no coincidence that the upcoming NLIGF (Netherlands Internet Governance Forum), the IGF, and even an EU-driven event like ICT 2013 have "Breaking down silos" and "Building bridges" on the agenda. But what does it mean? And how do we do it?

The internet and borders

People often describe the internet as borderless and say that police agencies and other bodies regulating or policing the internet need to cooperate across borders. Such remarks fall under the category of "this needs a global solution", or the "this is cross-border, we cannot do anything!" type of comment.

Breaking down silos goes way beyond this. It is a national and organisational problem as well as an international one. Specific organisations work within their own remit and have difficulty, in some cases extreme difficulty, reaching out to other organisations. Others are not aware of each other's capabilities. This discussion is about mental borders as well as legal, organisational and state ones.

The worst example

The police are usually pointed to as a hard partner to work with. "We never hear anything back" or "We never receive information from them" are often-heard comments. It is my impression that police organisations (and prosecutors) could have a better understanding of the capabilities of other enforcement agencies, so that actions can be coordinated more effectively. (What happens when two or three different organisations investigate the same botnet at the same time?!)

Law enforcement is more than enforcing the law from a penal-code perspective. Other agencies may be better equipped than the police to solve a specific cyber crime by enforcing their "own" law. A "serious" crime could also be dealt with under, for example, a Consumer Protection Act. Or, working together, there is a higher chance of success. These are important lessons. Break down your silos!

Cyber security

Cyber security organisations like Computer Emergency Response Teams (CERTs) and Computer Security Incident Response Teams (CSIRTs) secure and monitor governmental and industry ICT systems, and alert on and respond to breaches such as DDoS attacks or hacks. They have a lot of information and evidence that could assist enforcement agencies in doing their work. At the same time they can act on certain breaches in ways that law enforcement never could.

Cooperation between the two does not come easily, for dozens of reasons. Hence the need to break down silos and create understanding.

Industry

And what about industry? What information does it have on cyber crimes? If industry does not see the incentive to report all relevant breaches to the proper authority, enforcement and security will never get the priority they deserve. Hence another reason to break down silos.

Who needs to act?

The report by De Natris Consult (click here to view), "National cyber crime and online threats reporting centres. A study into national and international cooperation", clearly shows that it is nearly impossible for an individual organisation to break a silo down, simply because it is too difficult and not part of the organisation's primary task. So despite the fact that the ability to cooperate is in a single organisation's direct interest, it is nearly impossible to break through on your own when no one hears you knocking. It is important, however, to report the obstacles you face to those who can make a difference. How else will the people who can actually make a difference ever know? Start breaking down your own silo in the right places.

So who needs to act then?

There are a few options. (My apologies for non-EU readers. I'm a bit EU-centric here, but please allow your imagination to run to your corner of the world and the options it provides.)

1. National government
This would help at the national level, e.g. where a national cyber security strategy foresees a coordinating body instituted by the national government. The Netherlands, for example, created the National Cyber Security Centre, and the developments there are very interesting to watch: embedded officers from different agencies, industry and vital infrastructure work part time within the centre.

Some questions could be asked that can make a difference over time. How does the centre change knowledge and perceptions with time? Does it make a solid inventory of skills, complementary powers and different possibilities that different laws supply to fight cyber crimes? Does it take a closer look at whether present laws supply the needed powers to fight the different forms of cyber crime?

2. International bodies
ENISA currently plays a role in bringing CERTs and police agencies together. Could it play that role in a broader sense, i.e. for other law enforcement agencies as well as police and CERTs?

EC3 could open itself to more enforcement entities, e.g. by providing common trainings, coordinating cyber actions, etc. It does not do so at present, but it would be a good thing if EC3 looked into this option in the very near future. Who invites them to break down their silo?

Fill in your option here .....

3. International projects
What will a project like ACDC (Advanced Cyber Defense Centre) do for international cooperation? In this case it is about fighting botnets: from disinfecting end users' computers to gathering, analysing and sharing data on botnets, botnet traffic and command-and-control servers in and through the central clearing house. What will aggregated data do in the fight against cyber crime and, more importantly, what will it do for cooperation and understanding between different entities, both public and private?

Conclusion

Why are all these questions so relevant? Because my bet is that all these agencies, from the military to secret services and from police to consumer fraud, spam and privacy agencies, are all looking for the same people who make the internet an unsafe place for business and pleasure today. There is, or at least there should be, a strong need to cooperate and coordinate.

Breaking down silos will not come easily, for many a reason. Still, if the people responsible for this task are to take it seriously, it is important to start asking the right questions. Let's do so at NLIGF this June, in Bali in October (where I will moderate), in Vilnius in November, and in all places where you think it is possible and necessary. I'm always happy to discuss further or to help create strategies or programs. The time seems right.

Written by Wout de Natris, consultant in international cooperation on cyber crime and trainer in spam enforcement

More under: Cybercrime, DDoS, Internet Governance, Law, Malware, Policy & Regulation, Spam

Spanish Joint-Network Investment in FttH Seeing Returns

Wed, 2013-04-24 04:34

Spain's economic anguish has had a number of repercussions for the country's telcos, with stable or declining revenue causing much nervousness as operators struggle to fund essential investment in spectrum and both fixed-line and mobile networks. Earlier this year Vodafone felt the pinch, announcing plans to cut its Spanish workforce by up to 1,000. Though general economic conditions have not helped, the move partly resulted from its own decisions. The company saw revenue drop for several quarters and so decided to save money by cutting handset subsidies. The ploy backfired: by the end of 2012 the company had lost 2.29 million mobile subscribers in the year, and as a result revenue dropped from £5 billion to £4.2 billion.

Yet Vodafone is one of the key players in Spain's surging fibre market, where investment in networks is a precondition of customer growth and financial reward. In common with development elsewhere (not least in the mobile sector), Vodafone is not going it alone, but is sharing the cost with other parties. In Spain, it has partnered with Orange. Unlike many other European markets, where operators have tended to concentrate on high-density towns (Paris, Milan, Amsterdam), in Spain FttH is more widely available in smaller towns and rural areas, often guided by the policies of regional governments. In this market there is plenty of room for smaller players to co-exist with the incumbent.

Orange launched an FttH pilot in Madrid as early as 2010, and earlier this year teamed up with Vodafone to invest up to €1 billion in a joint fibre network covering 50 of the largest cities. The footprints are complementary: each company owns its fibre independently, though they share technical specifications so that the two builds operate as a single compatible network. Each operator provides access to its own footprint, making the entire network available to the other. Orange recently switched on its fibre for commercial services, initially in Madrid, and planned to have some 800,000 premises connected to the network by March 2014, rising to three million by September 2015 and six million by 2017. In Madrid alone, up to 40,000 homes could be connected to the network.

The Orange/Vodafone joint network is open to co-investing third parties, which could dramatically extend the availability of fibre to Catalonia and Asturias, where there are already extensive deployments through existing projects.

These developments are encouraging, and show that telcos operating through long-term economic doldrums can be reassured that sensible investment strategies will pay dividends down the track.

Written by Henry Lancaster, Senior Analyst at Paul Budde Communication

More under: Access Providers, Broadband, Mobile, Telecom
