Noncommercial Users Ask ICANN Board to Review Decision to Expand Trademark Rights in New Domains

ICANN's Non-Commercial Stakeholders Group (NCSG) has filed a Request for Reconsideration with ICANN's Board of Directors regarding the staff's decision to expand the scope of the trademark claims service beyond that provided by community consensus policy and in contradiction of ICANN's Bylaws. Specifically at issue is ICANN staff's unilateral decision to adopt the "trademark +50" proposal for new domains, which would grant trademark holders who have previously won a UDRP or court decision rights to 50 additional derivations of their trademark in ICANN's Trademark Clearinghouse (TMCH). Under staff's plan, large trademark holders that register in the clearinghouse will receive thousands of derivations of their trademarks, since each separate country's registration of the same trademark provides the brand owner with an additional 50 entries in the TMCH.1 Entries in the TMCH trigger infringement warning notices to domain name registrants, which can increase liability for registrants, discourage lawful registrations, and chill speech on the Internet.

ICANN's bottom-up, community-developed policy process had approved a TMCH model that allowed only "exact matches" of trademarks to be placed in the TMCH. In 2007, ICANN's GNSO Policy Council, including representatives from the Intellectual Property and Business Constituencies, approved by a supermajority vote the GNSO recommendations that created special protections for trademark rights.2 As part of the multi-year consensus process, both the Implementation Review Team (IRT) and the subsequent Special Trademarks Implementation (STI) Team considered whether to provide rights to exact matches or to additional derivations, and both community-developed teams specifically opted for exact matches only to be placed into the TMCH. ICANN's CEO testified before the U.S. Congress in 2012 that expanding the scope of the TMCH further would be inappropriate, since it would create new rights that do not exist in law, and that ICANN should not be creating unprecedented rights.3

Many months after the final TMCH model of exact matches only was published in ICANN's Applicant Guidebook, and after new domain businesses had relied on it when filing their applications, ICANN's Intellectual Property and Business Constituencies lobbied ICANN's new CEO to make drastic changes to the community-developed policy and grant additional trademark rights in the TMCH. After the October 2012 Toronto ICANN Meeting, the new CEO proposed a "strawman solution" that included a number of the IPC/BC's substantive policy proposals to give trademark holders additional privileges in the domain name system, including changing the exact-matches-only standard approved by the community. Yet ICANN's CEO recognized that expanding the scope of the trademark claims service was a policy matter requiring GNSO Council guidance, as he stated on his blog in December 2012,4 and he did write to the GNSO Council to request guidance on this policy proposal. Under ICANN's Bylaws, staff may not change GNSO-approved policy except under a strict process that involves consulting with the GNSO and a two-thirds vote of the Board of Directors.
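To make the policy difference concrete, here is a minimal, purely illustrative sketch (in Python, with hypothetical names and data; the real TMCH matching rules for spaces, punctuation and IDN labels are more involved) of how a claims-notice lookup changes when the clearinghouse holds only exact matches versus when each mark also carries previously-litigated variants:

```python
# Illustrative sketch only: a toy model of how a claims-notice check might
# differ under "exact match" vs. the "trademark +50" expansion. All names
# and data are hypothetical.

def normalize(label: str) -> str:
    """Crude label normalization: lowercase, strip dots and hyphens."""
    return label.lower().replace(".", "").replace("-", "")

# Exact-match TMCH: one entry per registered mark.
tmch_exact = {normalize(m) for m in ["example", "examplebrand"]}

# "Trademark +50": each mark may also carry up to 50 previously-abused
# variants (e.g., strings from past UDRP wins), per trademark registration.
tmch_plus50 = tmch_exact | {normalize(v) for v in ["examp1e", "exxample"]}

def triggers_claims_notice(domain_label: str, entries: set[str]) -> bool:
    """A registration triggers a warning notice iff its label is in the set."""
    return normalize(domain_label) in entries

print(triggers_claims_notice("examp1e", tmch_exact))   # False under exact match
print(triggers_claims_notice("examp1e", tmch_plus50))  # True under +50
```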
NCSG filed comments on the proposed policy changes and warned against re-opening previously closed consensus agreements and circumventing ICANN's stated bottom-up policy development process.5 In addition to the flawed process for adopting this policy, NCSG also detailed substantive concerns with staff's proposal to expand trademark rights beyond anything that exists in trademark law. It came as no surprise that only members of the IPC and BC supported the strawman proposals in ICANN's comment period.6 In the GNSO Council's February 2013 response to the CEO regarding the proposal to expand the scope of trademark claims, the GNSO Chair wrote, "the majority of the council feels that proposal is best addressed as a policy concern, where the interest of all stakeholders can be considered."7 Thus the GNSO Council also determined this specific proposal to be a policy matter, requiring consultation with the entire community before such a change could be made to existing GNSO Council approved policy.

Yet with only an email sent on 20 March 2013, ICANN staff announced in an attached memorandum that it would expand the scope of the trademark claims service to give trademark holders rights to 50 additional derivations of their trademark, in contradiction of the GNSO-developed policy of exact matches only and of the GNSO Council guidance it had requested on the matter.8 Staff's only explanation for such a drastic shift in the creation of new rights: "this proposal appears to be a reasonable add on to an existing service, rather than a proposed new service". Thus, with a single line of evasive text, years of hard-fought community consensus policy were swept under the rug and a new era of policy development by ICANN staff edict was solidified.

On 19 April 2013 NCSG filed this Request for Reconsideration of the staff decision because ICANN did not follow its stated process for changing GNSO-approved policy. If ICANN wants to deviate from supermajority GNSO-approved policy, it must follow the process outlined in Annex A, Section 9 of the organization's Bylaws.9 As an organization that holds itself out as a champion of the bottom-up policy development process, ICANN is obligated to comply with community-developed policies unless the Board of Directors can muster the necessary two-thirds vote to overturn the community decision. That mandatory process was not followed by ICANN's staff or Board in overturning the community-approved policy in favor of staff's policy to expand the scope of the TMCH.

ICANN's Board Governance Committee has thirty days in which to make a recommendation to ICANN's Board of Directors regarding the NCSG's Request for Reconsideration, or to report to the Board on why no final recommendation is available and provide a timeframe for making one. ICANN's entire Board should consider the recommendation of the Board Governance Committee at its next regularly scheduled Board meeting. Under Article IV, Section 2 of ICANN's Bylaws, the Request for Reconsideration process is a mechanism intended to reinforce ICANN's accountability to the community for operating in a manner consistent with its Bylaws.10 Because the staff's unilateral decision to change GNSO-approved policy was not consistent with ICANN's Bylaws and contradicted ICANN's stated policy, NCSG filed the Request to correct the error and bring ICANN into compliance with its Bylaws and stated policies.
NCSG requests that the Board reinstate the community-developed policy of giving trademark holders rights to include only exact matches of their trademarks in the TMCH, which was the policy stated in ICANN's Applicant Guidebook when ICANN accepted applications for new domains.
• NCSG's Request for Reconsideration (PDF)
1 http://domainincite.com/...
Written by Robin Gross, Founder and Executive Director of IP Justice
Reframing the Infrastructure Debate

Fast and reliable infrastructure of any kind is good for business. That this is debatable for the Internet shows we still don't understand what the Internet is — or how, compared to what it costs to build and maintain other forms of infrastructure, it's damned cheap, with economic and social leverage in the extreme.

Here's a thought exercise… Imagine no Internet: no data on phones, no Ethernet or Wi-Fi connections at home — or anywhere. No email, no Google, no Facebook, no Amazon, no Skype. That's what we would have if designing the Internet had been left up to phone and cable companies, and not to geeks whose names most people don't know. What those geeks came up with was something no business or government would ever contemplate: a base infrastructure of protocols that nobody owns, everybody can use and anybody can improve. For all three of those reasons the Internet supports positive economic externalities beyond calculation.

The only reason we have the carriers in the Net's picture is that we needed their wires. They got into the Internet service business only because demand for Internet access was huge, and they couldn't avoid it. Yet, because we still rely on their wires, and we get billed for their services every month, we think and talk inside their conceptual boxes. Remember that the telco and cableco business models are based on routing everything through billable checkpoints. Is this what we want for the rest of the Net's future?

We have to remember that the Internet isn't just a service. It's the platform for everything we connect. And the number of things we will connect over the next few years will rise to the trillions. To understand how the Internet ought to grow, try this: cities are networks, and networks are cities.† Every business, every person, every government agency and employee, every institution, is a node in a network whose value increases as a high multiple of all the opportunities there are for those nodes to connect — and to do anything.

This is why every city should care about pure connectivity, and not just about billable phone and cable company services. We should be building a network infrastructure that is as neutral to purpose as water, electricity, roads and sewage treatment — and that anybody, including ordinary citizens, can improve. We can't do that if we're wearing blinders supplied by AT&T, Comcast, Time Warner and Verizon.

† I came to the realization that networks are cities, and vice versa, via Geoffrey West — first in Jonah Lehrer's "A Physicist Solves The City" in the New York Times, and then in West's TED talk, "The Surprising Math of Cities and Corporations." West is the physicist in Lehrer's piece. Both are highly recommended.

Written by Doc Searls, Author
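An aside on the arithmetic behind that "value increases as a high multiple" claim: counting the possible pairwise connections among n nodes gives a feel for the scale. A minimal sketch (illustrative only; Metcalfe-style counting is just one of several proposed network scaling laws):

```python
# Illustrative only: count the possible pairwise connections among n nodes.
# Metcalfe-style reasoning values a network roughly in proportion to this
# count, n*(n-1)/2, which grows much faster than the node count itself.

def pairwise_links(n: int) -> int:
    return n * (n - 1) // 2

for n in [10, 100, 1_000, 1_000_000]:
    print(f"{n:>9} nodes -> {pairwise_links(n):>15,} possible connections")
```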
US Smart Grid Networks Exploiting Infrastructure to Provide Wireless Broadband

The USDA Rural Development's Rural Utilities Service (RUS) has now spent the $250 million committed for smart grid technologies. To this has been added an additional $201 million in funding approved by the Agriculture Secretary for electricity utilities in eight states to install smart grid technologies and improve their generation and transmission facilities. The beneficiaries are spread among a large number of states, and this investment is helping smart grids become the norm across the country.

A side benefit is that utilities are also developing their smart grids for telecoms over and above the capacity used by meters to send data to network controllers. As an example, earlier this year the utility serving Santa Clara began using its smart grid technology and infrastructure to deliver free citywide outdoor WiFi. While meters send data via an existing wireless network, a separate channel is used to provide outdoor internet access. The WiFi network is growing in scope and reach as more premises are equipped with smart meters.

The potential for expanding WiFi coverage is huge. There are about 120 municipalities with citywide WiFi networks accessible to the general public. In addition, there are about 60 cities with citywide or near-citywide coverage, though these networks are now limited to government applications such as public safety. There are also 80 or more cities with large outdoor WiFi areas, mostly located in parks and downtown zones. A hindrance to cities aiming to develop comprehensive WiFi networks has come from the powerful telecoms industry, which employs its lobbying clout to push for laws blocking or preventing municipalities from offering WiFi or fixed broadband services. The use of smart meters to provide WiFi over existing (and expanding) infrastructure presents a separate challenge, since the telcos would have to battle utilities rather than municipal governments.

Written by Henry Lancaster, Senior Analyst at Paul Budde Communication
Will the Trademark Clearinghouse Fulfill its Potential?

ICANN created the Trademark Clearinghouse (TMCH) as a way to streamline the repetitive process forced on trademark owners during the launch of new top-level domains. With the expected tsunami of hundreds of new TLDs starting later this year, the TMCH should generate a clear benefit for trademark owners who elect to participate in Sunrise and Claims Periods.

A side effect of introducing new TLDs is that the legacy TLDs will be making changes to ensure they remain competitive against the new TLDs. This means they will be relaxing restrictions and opening up unused namespaces at the second and third levels. Many of these will follow a Sunrise or grandfathering process as a way to implement the changes. Already three existing TLDs (one sTLD and two ccTLDs) have announced such policy changes and decided they would like to utilize the TMCH Sunrise tokens for their Sunrise Periods: .Jobs, Radio.AM and Radio.FM. Donuts, the largest applicant with over 300 TLD applications, has also indicated it will use the Sunrise token from the TMCH for a universal blocking service called the Domain Protected Marks List (DPML). All this is happening before the TMCH has even supported its first new TLD. While ICANN has welcomed the use of the TMCH by .Jobs, it remains to be seen whether ICANN will also welcome use of the TMCH by ccTLDs.

The eventual benefits and viability of the TMCH will hinge on a few factors:
• Will trademark owners even use it?
• Will the main driver be participation in Sunrise or Claims?
• Will other existing TLDs want to use it?
Will Trademark Owners Even Use it?

It is a given that trying to participate in every future Sunrise Period would overwhelm the budget of nearly every trademark owner. Every sage legal advisor is counseling that trademark owners must be ultra-selective about which Sunrise Periods they engage in. On the other hand, a review of the Trademark Agents published on the TMCH website shows that a good number of law firms have already advanced the TMCH the minimum $15,000 required to become an Agent. If this trend continues, it is a clear indicator that law firms will aggressively market the TMCH to their clients. (Disclosure: My firm, TM.Biz, is offering a portal for these Trademark Agents.)

Will the Main Driver Be Participation in Sunrise or Claims?

Trademark Claims provides some protection in every new TLD, but it covers exact matches only and only for the first 90 days. This forces trademark owners to also subscribe to a watching service that catches confusingly similar registrations missed by the Claims service. I predict trademark owners will elect to do both Claims and watching to ensure they catch domains that might confuse their customers.

Will Other Existing TLDs Want to Use it?

There are actually two parts to the TMCH. The validation service is performed by Deloitte and CHIP. They issue Sunrise tokens called Signed Mark Data (SMD) files to trademark owners as proof that a trademark has satisfied the requirements of the typical Sunrise Period. The Database Administrator for the TMCH is IBM, which helps Registries and Registrars operate the Sunrise and Trademark Claims Periods. The validation service launched on March 26; the database part is expected to launch in July. But there are applications for just the TMCH Sunrise tokens that do not require IBM, because the SMD file is portable. For example, any country-code TLD that decides to change its policies and wants to conduct a Sunrise Period first could accept SMD files from trademark owners. Likewise, any TLD that wants to accept SMD files for a new Rights Protection Mechanism, as Donuts is planning, does not need IBM in the process either.

The .Jobs Sunrise Period

The .Jobs TLD has decided to eliminate the current restriction that .Jobs domain names must match company names. This means that product and division names will become eligible for .Jobs. Before this change takes effect, .Jobs will first conduct the Sunrise Period that is designed for new TLDs. .Jobs will utilize both parts of the TMCH, so it must wait for IBM, its back-end Registry, and Registrars all to be operational before it can conduct its Sunrise Period.

The Radio Global Domains

The .AM and .FM ccTLDs have long been re-purposed for the radio industry. They are now introducing new namespaces, called Radio Global Domains, designed to target new market segments within the radio industry. These will be radio.am and radio.fm. Before these changes take place, they will also undergo a Sunrise Period starting May 28. Validation for the Radio Global Domains Sunrise Period will be performed on either trademark data or the SMD files issued by the TMCH. All this is happening without the need for IBM to be involved, or even for Registrars to support the new protocols required for the new TLD Sunrise Periods. (Disclosure: My firm, TM.Biz, will be handling the trademark validation, SMD validation and direct submission of Sunrise registrations to the Registry.)
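Because so much here leans on the portability of SMD files, a sketch may help: an SMD file is a short text document with human-readable header lines plus a base64 payload that decodes to signed XML. The following minimal Python reader is illustrative only; the sample data is hypothetical, and signature verification (which real consumers must perform) is deliberately omitted:

```python
# A minimal, illustrative reader for the portable SMD ("Signed Mark Data")
# file format: header lines, then a base64 payload between BEGIN/END markers
# that decodes to signed XML. Real-world use must also verify the XML digital
# signature against the TMCH certificate chain and check the validity window.
import base64

def read_smd(text: str) -> tuple[dict, bytes]:
    headers, b64_lines, in_payload = {}, [], False
    for line in text.splitlines():
        line = line.strip()
        if line == "-----BEGIN ENCODED SMD-----":
            in_payload = True
        elif line == "-----END ENCODED SMD-----":
            in_payload = False
        elif in_payload:
            b64_lines.append(line)
        elif ":" in line:
            key, _, value = line.partition(":")
            headers[key.strip()] = value.strip()
    # The decoded bytes are XML (a <smd:signedMark> element with a signature).
    return headers, base64.b64decode("".join(b64_lines))

# Hypothetical input for demonstration only:
sample = """Marks: Example One
U-labels: example-one
-----BEGIN ENCODED SMD-----
PHNtZDpzaWduZWRNYXJrLz4=
-----END ENCODED SMD-----"""
headers, xml_bytes = read_smd(sample)
print(headers["Marks"], xml_bytes)  # Example One b'<smd:signedMark/>'
```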
It is still an open issue whether the TMCH will be capable of issuing SMD files by May 28 for use by the Radio Global Domains and, if it is, whether ICANN will allow the TMCH to release the SMD files so that the ccTLDs can use them. There are no doubt other ccTLDs interested in changing their registration rules and restrictions that might consider holding a Sunrise Period first. I predict that these ccTLDs would be interested in using the SMD files as well, if allowed by ICANN.

Additional Rights Protection Mechanisms
The largest TLD applicant, Donuts, is also planning to accept SMD files for its universal blocking service, the Domain Protected Marks List (DPML). With over 300 TLD applications, about half of them uncontested, Donuts' DPML represents good value for trademark owners.
Hopelessly Optimistic

The Trademark Clearinghouse has enormous potential to support the domain name industry. The portability of the SMD files enables many uses that were not originally envisioned by its creators. Certainly, the days of a TLD manually checking trademark databases should be coming to an end, with SMD files becoming the new de facto standard for trademark validation. It will be interesting to see how this evolves over time.

Written by Thomas Barrett, President, EnCirca, Inc.
CERN Celebrates 20 Years of The Free And Open Web

Of all the many applications and services that run on top of the Internet, arguably none has been more successful than the World Wide Web. Invented by Tim Berners-Lee back in 1989 while he was working at CERN, the "Web" has fundamentally changed almost every aspect of our life… and become a part of basically every aspect of our life. Think of a part of your life… and then think of the websites that are part of it. Whether it is social networks, banking, shopping, dating, news, reading, publishing, writing, gaming, sports and now even communicating in real-time… all somehow involve the "Web".

Today, April 30, is a special day in the history of the Web because, as recounted on that newly redesigned famous website (famous because it was the first website), info.cern.ch, it was twenty years ago today that CERN published a statement that put the WWW technology into the public domain for all to use. Building on the long history of openness of the Internet, CERN stated very clearly that "CERN relinquishes all intellectual property rights to this code, both source and binary form and permission is granted for anyone to use, duplicate, modify and redistribute it".

And thus was born the wider Web… anyone could download, use and modify the W3 server software and the W3 client and start creating new sites. And people did! By the tens… and hundreds… and on and on… changing and modifying the code to satisfy their own dreams and ideas. Keep in mind, this was before Mosaic and other graphical clients changed the Web again by introducing images along with text. The original Web was one of text. I remember telnetting to info.cern.ch back in the early '90s to see what this "World Wide Web" thing was all about - and pressing numbers to follow links. It was a very different world.

Still, from those early days - and more importantly from the openness of those early days - came everything else about the Web that we use today. Those early adopters didn't need to ask anyone for permission to innovate… they just downloaded the code and started hacking away. Thank you, CERN, for the reminder of the importance of today - and of the incredible importance of an open Web… riding on top of an open Internet.

P.S. Vint Cerf has a great retrospective out today as well: The open internet and the web

Written by Dan York, Author and Speaker on Internet technologies
CERN Recreating First Webpage to Commemorate 20th Anniversary

A team at the European Organisation for Nuclear Research (CERN) has launched a project to re-create the first web page. The aim is to preserve the original hardware and software associated with the birth of the web. The initiative coincides with the 20th anniversary of the research centre giving the web to the world. Read full story: BBC
Announcing the Final Terms of the First Applicant Auction for Contested gTLDs

We received several emails and phone calls with thoughtful comments on the proposed plan for the first Applicant Auction and have made several small changes to the plan. The final terms will be sent to applicants who requested the RFC, and can also be requested on our website. Here is a quick summary of the changes:
Auction prices:
Information transparency:
Timeline:
a) As originally planned, we will offer a voluntary mock auction 3 days before the real auction, scheduled for Thursday, May 23rd. This will be run exactly like the real auction except that the results have no meaning and the schedule will be heavily accelerated. We encourage all bidders to participate - the mock auction is a good test to make sure that you have the right login credentials and know how to place bids.

b) We will plan for the auction to take 3 or 4 days. The first auction round will be on May 28th and will last for 24 hours, as before. For subsequent rounds, we will do our best to set a schedule that reflects actual bidders' time zones. Rounds will last 2 hours initially, but if bidding activity during the auction indicates that less time is needed, we may shorten the rounds. In no case will rounds be shorter than 30 minutes.
Deposits:
A small change to mitigate order-of-magnitude errors:
We hope that these rules will be acceptable to all interested bidders and will maximize participation. Any bidders who do not find these terms workable are invited to comment and to participate in one of our future auctions. Those interested in participating in the first auction and receiving legal documentation and login credentials for the mock auction should register their interest on our website.

Written by Sheel Mohnot, Consultant
Can't Sell Your IPv4 Numbers? Try Leasing Them

In a "policy implementation and experience report" presented at ARIN 31 in Barbados, ARIN's staff noted that they are seeing "circumstances" related to the leasing of IPv4 number blocks. At the recent INET in Denver, ARIN's Director John Curran alleged that there is a "correlation" between address leasing activity and organizations that have been unable to complete specified transfers through the ARIN process, which requires needs-based justification.

The issue of leasing — or rather sub-leasing, because ARIN is already leasing the addresses to its members — is yet another symptom of the growing scarcity of IPv4 addresses. Subleasing is interesting, however, as another example of the way the RIRs' bureaucratic control of transactions between willing sellers and buyers can lead to workarounds that make the Whois directory less accurate.

It's unclear exactly how ARIN is aware of this nominally private activity. Perhaps someone involved is tipping ARIN off, or maybe its staff is observing instances where the ASN information associated with a routed block changes while the contact information in the ARIN Whois directory remains the same. In either case, a greater degree of transparency about refused transfers and the basis for ARIN's determinations would be welcome. On a related note, we sought to shed some light on the emerging transfer market in a paper last year.

What is troubling, for ARIN at least, is that the subleasing of addresses is taking place outside the RIR address governance regime. It is understandable that ARIN would react to something that might undermine its control over address space. Part of ARIN's power stems from its ability to identify who is allocated or assigned what address block(s) via its Whois Directory Service. In practice, the Whois has also been used to identify the party actually routing an address block, although technically this is a distinct activity over which ARIN claims no control. From an operational perspective, if the organization actually routing an address block cannot be contacted, this could be detrimental to administrators attempting to resolve networking issues, and to parties seeking to use the Whois for law enforcement or related policy matters. However, at this point it is unclear whether lessees are actually unreachable. In fact, one could argue that lessors are in a better position to keep accurate lessee contact records than the address registry — they are invoicing their lessees, we assume! Whether, and under what conditions, they would release contact information is basically unknown at this point. For now, ARIN does not seem to be too alarmed. It suggests three potential policy solutions:
Again, absent any data on leasing, it is hard to say which way ARIN or its membership might go, although the third option seems increasingly unlikely as ARIN moves closer to IPv4 exhaustion and the RIPE region contemplates eliminating needs-based justification entirely. It may just turn out that private subleasing transforms the address transfer market. As Addrex's Charles Lee pointed out at INET in Denver, all kinds of parties lease assets (including ARIN, which leases addresses to its own customers). Leasing serves a useful business purpose and is not a bad thing per se. The entry of large subleasing companies without any Internet operations, Lee noted, might transform the address market. It could create entirely new ways of allocating addresses and provisioning post-allocation services. It might lead to innovative product offerings, such as means to mitigate the technological obsolescence of IPv4. We just don't know. What we know for sure is that it will create governance dilemmas.

Written by Brenden Kuerbis, Fellow in Internet Security Governance, Citizen Lab, University of Toronto
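The detection heuristic suggested above, a block whose routing origin changes while its registry contact does not, is simple to sketch. Here is a toy Python illustration; all prefixes, ASNs and organizations are made up:

```python
# Illustrative only: flag prefixes whose BGP origin ASN changed between two
# snapshots while the Whois-registered organization stayed the same, one
# plausible signal of an address block being subleased.

# (prefix -> origin ASN) observed in routing tables at two points in time
routes_before = {"192.0.2.0/24": 64500, "198.51.100.0/24": 64501}
routes_after  = {"192.0.2.0/24": 64511, "198.51.100.0/24": 64501}

# (prefix -> registered organization) per the registry's Whois directory
whois_org = {"192.0.2.0/24": "Example Net LLC", "198.51.100.0/24": "Acme ISP"}

for prefix, asn_before in routes_before.items():
    asn_after = routes_after.get(prefix)
    if asn_after is not None and asn_after != asn_before:
        # Origin moved; if Whois still shows the same org, no transfer was
        # recorded, which is consistent with a private lease rather than a sale.
        print(f"{prefix}: origin AS{asn_before} -> AS{asn_after}, "
              f"Whois org unchanged ({whois_org[prefix]!r})")
```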
Typosquatting Claims Against Security Researcher Are Legally Complicated - Gioconda v. Kenzie

Kenzie is a security researcher who has registered numerous domain names that are typographic errors of well-known trademarks (e.g., rnastercard, rncdonalds, nevvscorp, rncafee, macvvorld, rnonster, pcvvorld). He points the domain names to the actual sites in question (e.g., rncdonalds points to mcdonalds.com), but he is looking to demonstrate how these typo domains can be used for "social engineering" attacks. Kenzie did not offer the domain names for sale, did not read the emails intended for the target organizations, and generally kept the whole scheme out of the public eye. Upon demand, he also offered to transfer the domain names to the organizations in question. Nevertheless he was sued by the Gioconda Law Group for registering Giocondolaw.com — with "o" instead of "a" [see: Gioconda Law Group v. Kenzie, 2012 US Dist LEXIS 187801 (S.D.N.Y. Apr. 23, 2013)].

In response to Gioconda's complaint, Kenzie, proceeding pro se, asserted a variety of defenses, including a critique of American privacy law. Gioconda moved for judgment on the pleadings. The court struggles with the application of the Anticybersquatting Consumer Protection Act (ACPA) factors to this case. On the one hand, this is clearly not a case where the registrant is trying to profit by selling back the domain name. On the other hand, the court says, not all non-commercial uses are necessarily exempt from the ACPA. [Not a particularly speech-friendly position.] Ultimately, the court says that it's not a case that can be resolved on the pleadings:

Defendant's alleged ideological, scholarly, and personal motives for squatting on the [domain name], while perhaps idiosyncratic, do not fall within the sphere of conduct targeted by the ACPA's bad faith requirement. If anything, given that defendant aims to both influence plaintiff's behavior and shape public understanding of what he perceives to be an important vulnerability in cyber security systems, this case arguably falls closer to cases involving parody and consumer complaint sites designed to draw public attention to various social, political, or economic issues.

It's possible the plaintiff can prevail, but it would have to do so under a more fact-specific totality-of-the-circumstances inquiry. This is an interesting case that highlights the problems faced by security researchers generally. While the risk of liability here is less than what security researchers typically face (e.g., liability under the Computer Fraud and Abuse Act), it still shows a judge reluctant to grant the researcher's conduct full protection as a non-commercial, First Amendment-protected venture.

Written by Venkat Balasubramani, Tech-Internet Lawyer at Focal PLLC
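The visual substitutions Kenzie relied on, "rn" for "m" and "vv" for "w", are easy to enumerate mechanically, which is part of his point about social-engineering risk. A small illustrative Python sketch, intended for defensive use such as monitoring look-alikes of one's own domain:

```python
# Illustrative sketch: enumerate look-alike labels built from the same
# multi-character homoglyph tricks described above ("rn" renders like "m",
# "vv" like "w" in many fonts). Defensive monitoring only, not a tool for
# registering typo domains.

SUBSTITUTIONS = {"m": "rn", "w": "vv"}

def homoglyph_variants(label: str) -> set[str]:
    """Return labels with one character swapped for its look-alike pair."""
    variants = set()
    for i, ch in enumerate(label):
        if ch in SUBSTITUTIONS:
            variants.add(label[:i] + SUBSTITUTIONS[ch] + label[i + 1:])
    return variants

print(homoglyph_variants("mcdonalds"))  # {'rncdonalds'}
print(homoglyph_variants("macworld"))   # {'macvvorld'}
```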
Arrest Made in Connection to Spamhaus DDoS Case

According to a press release by the Openbaar Ministerie (the Dutch Public Prosecution Service), a Dutch man with the initials SK has been arrested in Spain for the DDoS attacks on Spamhaus. Brian Krebs reports: "A 35-year-old Dutchman thought to be responsible for launching what's been called 'the largest publicly announced online attack in the history of the Internet' was arrested in Barcelona on Thursday by Spanish authorities. The man, identified by Dutch prosecutors only as 'SK,' was being held after a European warrant was issued for his arrest in connection with a series of massive online attacks last month against Spamhaus, an anti-spam organization."
Why Most Discussions for Fibre Optic Infrastructure Take Place from the Wrong Perspective

Fibre-based infrastructure requires vision and recognition of the fact that many of today's social, economic and sustainability problems can only be solved with the assistance of information and communications technology (ICT). In many situations the capacity, robustness, security and quality necessary for this calls for fibre optic infrastructure. This need will increase dramatically over the next 5 to 10 years as industries and whole sectors (healthcare, energy, media, retail) transform themselves in order to better address the challenges ahead.

Most discussions regarding the need for fibre optic infrastructure take place from the wrong perspective — based on how fast people need the internet to be when they download their emails, web information, games and movies. Fibre optic technology has very little to do with this — ultimately all of that 'residential' traffic will account for less than 50% of the traffic that will eventually flow over fibre optic networks. The real reason this type of network is needed relates to the social and economic needs of our societies, and there are many clear signs that we are running out of steam trying to solve some of our fundamental problems in traditional ways.

For instance, at this moment discussions are taking place in every developed country in the world about the fact that the cost of healthcare is unsustainable. These costs will grow — over the next 20 years — to 40%-50% of total government budgets, which is clearly impossible. So we face a dilemma: do we lower the standard of healthcare services while making them more costly for the end-user? If we want to maintain our current lifestyle, the only solution is to make the healthcare system more effective, efficient and productive. And this can only be done with the help of ICT. To make healthcare more productive, it needs to be brought to the people, rather than the other way around, as is the case at present. Similar examples apply to the education system, the energy systems and the management of cities and countries in general. We need to create smart cities, smart businesses and smart countries, with high-speed infrastructure, smart grids, intelligent buildings, etc.

In order to manage our societies and economies better we need much better information about what is happening within all of the individual ecosystems, and in particular about how these different systems interact. Currently they all operate within silos and there is little or no cooperation or coordination between them. ICT can be the bridge to bring them together; to collect data from them and process it in real time. Information can then be fed back to those who manage the systems, and to those who operate within them, such as doctors, teachers, business people, bureaucrats, politicians — and, of course, to you and me. Some of these data interactions are already happening around smartphones, social media, traffic and crowd control, and weather information. This is only the start of what is known as the Internet of Things (IoT) or machine-to-machine (M2M) communication.

ICT cannot solve world hunger, but without ICT world hunger cannot be solved, and this applies to all the important social and economic problems that societies around the world are now facing. None of this can be done overnight; it requires massive transformations of industries and sectors.
There is no instant business model available that will supply an immediate return on the investment needed to create these smart systems. All of these investments need to be looked at over a period of 10 or 20 years, and even longer. No private business will take such a risk, so government leadership and government policies are needed to make it happen. This is also the message from the UN Broadband Commission for Digital Development, and it applies to countries all over the world. More than 120 countries worldwide have now developed broadband policies, recognising that such infrastructure is critical to their development. The challenge now is to put these policies into practice, at a time when government leadership around the world is at an all-time low.

Ultimately all of these developments will require national fibre optic networks. There simply is no other technology that can handle the capacity of data and applications that will be needed to run cities and countries from today onwards. This infrastructure needs to be robust. It has to have enormous capacity. It needs to be secure and able to protect privacy. There is simply no other infrastructure technology that is up to the job. So those business and government leaders who are in charge of looking towards the future have an obligation to ask themselves, based on the above, whether we can afford not to have a fibre optic network.

Written by Paul Budde, Managing Director of Paul Budde Communication
Join Uniregistry

What happens when you take a team of experts, at the top of the naming industry, and unite them behind a single, high-minded purpose? You get the most service-based and holistic approach to registry operations that the industry has ever seen — something we call "Uniregistry."
Software Developers – We are looking for full-stack developers who are comfortable working at any level of web development and have the initiative to see a project through from start to finish. Our technology is currently built on top of PHP, MySQL, and JavaScript, but we are looking for anyone who feels at the top of their game developing for web and mobile in any technology. If you fit the bill we will fly you here, interview you confidentially and deliver you a lifetime opportunity to work on things that millions of people a day will use.

Systems Specialist – A successful technology company doesn't exist without a robust and scalable foundation. Do you have what it takes to build infrastructure to handle millions of visitors a day? Then we are looking for you. Candidates should have multiple years' experience managing Linux-based systems and popular open-source databases, as well as a sound understanding of networking and the services that operate over it. Being well versed in systems automation, virtualization, and mass hosting is an asset as well. Big things lie ahead for the fortunate candidate who chooses the red pill.

Front-end Developers – Someone with a keen sense of aesthetics and human behavior, who can turn sketches and ideas into web reality. The right individual needs to understand the consequences of their choices in code and execution: not just nice-looking pages, but the ability to turn designs into functioning websites. HTML, CSS and JavaScript will be your primary tools. Our facility in Cayman is world-class and right across from the beach. Swim to work and shower here. Work with people like you and live tax-free.

Marketing People – Help us find the right programmers/developers in your organization and join them here in Cayman as we grow our existing registry operations business. We will need to promote the new namespaces we're charged with operating. We are going to have all kinds of fun doing that, but first we need to finish the critical infrastructure we've started. We need a great team for that and we want you to be a part of it.

Send your resumé today: careers@uniregistry.com
Wrap-up: ICANN 46 in Beijing

Earlier this April, the largest ICANN meeting ever — more than 2,500 attendees — kicked off in Beijing. Given the imminent addition of hundreds of "dot Brands" to the Internet, the topic of new gTLDs was at the top of the discussion list for all attendees. So far, well over 100 new gTLD applications have passed the Initial Evaluation stage, meaning they're on their way to becoming live domains. At the meeting, ICANN's Governmental Advisory Committee (GAC) released its formal advice on new gTLDs. The GAC made a number of points to the ICANN Board, including:
GAC advice is becoming the single biggest area of uncertainty for new TLD applicants. It not only appears to adjust requirements approved by the community in the Applicant Guidebook, it is also evolving with each new communiqué. One reporter noted, "It looks like at least 517 new gTLD applications [may] be affected by the GAC's advice." I'm sure there will be many more discussions about this topic.

Registrar Accreditation Agreement (RAA) and Registry Agreement

ICANN CEO Fadi Chehade announced newly revised versions of both the 2013 Registrar Accreditation Agreement and the Registry Agreement, which are now posted for public comment. ICANN is looking at ways to keep the debate over these contracts from delaying the overall application process.

Trademark Clearinghouse

Earlier, in March, the Trademark Clearinghouse (TMCH) opened. The TMCH allows brand owners to submit their trademark data into one centralized database, prior to and during the launch of new gTLDs. Since opening, the pace of sign-ups by both individual mark owners and agents has been rapid, boding well for the long-term success of the TMCH project.

With ICANN 47 in Durban, South Africa coming up in mid-July, many of these subjects will continue to be discussed and, hopefully, resolved in the weeks ahead.

Written by Roland LaPlante, Senior Vice President and CMO at Afilias
Will LTE Steal the Broadband Revolution?

There is no doubt that LTE is going to take a prime position in broadband developments. With competitively priced services, innovative smartphones and an increasing range of very innovative apps, this market is set to continue to boom. So how will all this affect the overall broadband market?

First of all, this is not an 'us or them' issue between fixed and mobile broadband. As a matter of fact, the companies that are rolling out LTE are increasingly dependent on deep fibre rollouts, as they need to handle massive amounts of data to which mobile infrastructure technology is not well-suited. So the quicker they can offload their mobile traffic onto a fixed network the better. As I've said before, one of the key drivers of fibre deployment will be the growth in mobile broadband.

A similar situation will occur in the home. More and more, people are using their mobile devices rather than PCs and laptops; and more people within the home are using more and different mobile devices, so this will significantly increase the need for capacity within the home. The reality of mobile broadband is that 60%-80% of smartphone and tablet usage takes place in the home, where these devices are all connected to the fixed network through the WiFi modem. People are becoming accustomed to the quality of the LTE network, so they will want a similar quality of service over the fixed network; and over the next 3-5 years the current network will start to run out of steam. With at least one-third of all fixed broadband connections being of such inferior quality, these households are already facing quality problems now.

So, while access to the internet and broadband is moving quickly towards smartphones and tablets as the preferred access devices, the majority of the broadband capacity required by these devices will still need to be provided by the fixed network. While the capacity of the mobile network is greatly improved by LTE — as well as by the upcoming extra capacity through new spectrum allocation — the physics of mobile technology is such that it will be impossible to handle all the traffic of these mobile devices over the mobile network.

Obviously the mobile operators are not sitting still. They are improving their network infrastructure in order to capture as much of the traffic as possible, and increasingly they are looking at WiFi technologies as another alternative to offload traffic and/or add extra access points for users in high-traffic areas such as shopping centres, entertainment venues, transport stations, etc. But again these WiFi access points need to be connected to the fixed network, and in the case of WiFi access points you virtually need fibre-to-the-premises to be of any use. So, while LTE will greatly increase the use of broadband and broadband applications, it will at the same time put increased pressure on the fixed network.

On the end-user side of the fixed broadband market we don't have the same dynamics as in the mobile market. Few, if any, fixed network devices capture users' attention in the way the new smartphones do. Also, there is a clear lack of exciting fixed broadband applications. Entertainment is largely captured by content providers who want to protect their existing business models, and applications in healthcare, education, energy, etc are going to take a long time to reach maturity and mass-market penetration levels.
So all attention is clearly on mobile, and this is creating a skewed perspective on what is needed overall to ensure that these mobile developments can be used to their full potential. The developments in mobile and LTE will generally stimulate the need for better fixed networks, but at the same time there will be a significant group of users who — at this point in time — do not have high capacity requirements, and for whom a $30 or $40 monthly mobile connection will cater for all their comms needs. This group will actually lead to stagnation, and even a decline, in fixed broadband connections. We already see this happening in the Hong Kong market. The situation will only be exacerbated if LTE becomes available in areas that have very poor fixed broadband coverage. BuddeComm estimates that up to 25% of users could simply abandon their unsatisfactory fixed broadband connection in favour of LTE. Most will eventually re-connect in 3-5 years' time, but only when important applications become available over the fixed network.

These short-term developments could be interpreted by some who don't have a good understanding of the total picture as an indication that fixed broadband is not needed, and this could potentially undermine the build-out of the fixed broadband networks that are so desperately needed for the longer-term social and economic development of the country.

If we look at the very latest smartphone devices (e.g. the Galaxy S4) we see an increase in what are called machine-to-machine (M2M) or Internet of Things (IoT) applications, often linked to location-based services (LBS). Behind the scenes, these applications gather data from a variety of sources and process that information in real time, giving users interesting services relating to healthcare, sporting achievement, calorie intake, weather, transport and traffic information and so on. It is these M2M and IoT applications that are finally going to stimulate the sort of killer apps needed to drag some of the lagging sectors into the digital age — such as healthcare, education, utilities, government and business, which are at present trying to limit the impact of the digital economy rather than embracing it. This, in turn, will start stimulating the sort of applications that require the capacity, robustness and security that can only be delivered by fibre optic networks.

All of this will come together in 5 to 10 years' time, when the requirements from the mobile-based developments, the rapid growth of M2M applications, and the somewhat slower growth in requirements following the industry and sector transformations combine to make fibre-based infrastructure essential for the economic development and social wellbeing of any developed economy. What is required from business leaders and politicians is that they recognise this need and start planning for it at the earliest possible opportunity. Doing this on the run is not the ideal way to make infrastructure investments that will have to last for 25-50 years.

Written by Paul Budde, Managing Director of Paul Budde Communication
Different Focus on Spam Needed

It is surprisingly difficult to get accurate figures for the amount of spam that is sent globally, yet everyone agrees that the global volume of spam has come down a lot since its peak in late 2008. At the same time, despite some recent small decreases, the catch rates of spam filters remain generally high. Spam still accounts for a significant majority of all the emails that are sent, and a world in which email can be used without spam filters is a distant utopia. Yet the decline of spam volumes and the continuing success (recent glitches aside) of filters have two important consequences.

The first is that we don't have to fix email. There is a commonly held belief that the existence of spam demonstrates that email (which was initially designed for a much smaller Internet) is somehow 'broken' and that it needs to be replaced by something more robust against spam. Setting aside the Sisyphean task of replacing a tool that is used by billions, proposals for a new form of email tend either to put the bar for sending messages so high as to prevent many legitimate senders from sending them, or to break significant properties of email (usually the ability to send messages to someone one hasn't had prior contact with). Still, if spam volumes had continued to grow, we would have had little choice but to introduce a sub-optimal replacement. The decline in spam volumes means we don't have to settle for such a compromise.

Secondly, current levels of spam mean there is little threat of a constant flow of spam causing mail servers to fall over. At the same time, one would be hard-pressed to find a user whose email is not filtered somewhere — whether by their employer, their provider, or their mail client. Thus, looking at the spam that is sent isn't particularly interesting, as it provides us with little insight into the actual problem. What matters is the small minority of emails that do make it to the user — whether because their spam filter missed them, or because the user found them in quarantine and assumed they had been blocked by mistake. Equally important is the question of which legitimate emails are blocked, and why — and what can be done to prevent this from happening again in the future.

It is tempting to look at all the spam received by a spam trap, or by a mail server, and draw conclusions from that. Such numbers certainly help paint a picture, but in the end they say as much about what users see as the number of shots on target in a football match says about the final result. Despite the doom predicted by some a decade ago, email is still with us — and we have won a number of important battles against spam. But if we want to win the war, we need to shift our focus.

Written by Martijn Grooten, email and web security tester
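The argument that raw spam volume is the wrong measure can be put in numbers. This toy calculation (made-up figures, illustrative only) contrasts a filter's catch rate with the two quantities the post says actually matter: spam that reaches the inbox and legitimate mail that gets blocked:

```python
# Toy numbers, illustrative only: why a high catch rate says little about
# what users actually experience. What matters is the spam that gets through
# (false negatives) and the legitimate mail wrongly blocked (false positives).

spam_sent      = 1_000_000   # spam messages aimed at a mailbox provider
ham_sent       = 200_000     # legitimate messages
catch_rate     = 0.997       # fraction of spam the filter blocks
false_pos_rate = 0.001       # fraction of legitimate mail wrongly blocked

spam_in_inbox = spam_sent * (1 - catch_rate)   # what users see
ham_blocked   = ham_sent * false_pos_rate      # what users lose

print(f"catch rate: {catch_rate:.1%}")
print(f"spam reaching users: {spam_in_inbox:,.0f}")
print(f"legitimate mail blocked: {ham_blocked:,.0f}")
# Halving spam_sent halves inbox spam without the filter improving at all,
# which is why falling global volumes don't settle the filtering question.
```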
Breaking Down Silos Doesn't Come Easy

"We need to break down silos" is a phrase often heard in national and international meetings around cyber security and cyber crime enforcement. So it is no coincidence that the upcoming NLIGF (Netherlands Internet Governance Forum), the IGF, and even an EU-driven event like ICT 2013 have "Breaking down silos" and "Building bridges" on the agenda. But what does it mean? And how do we do it?

The internet and borders

People often refer to the internet as borderless, and say that there is a need for cross-border cooperation between police agencies and other agencies regulating or enforcing the internet. This falls under the category of "this needs a global solution" or "this is cross-border, we can not do anything!" comments. Breaking down silos goes way beyond this. It is a national and organisational problem as well as an international one. Specific organisations work within their own remit and have difficulty, in some cases extreme difficulty, reaching out to other organisations. Others are not aware of each other's capabilities. This discussion is about mental borders as well as legal, organisational and state ones.

The worst example

Usually the police are pointed to as a hard partner to work with. "We never hear anything back" or "We never receive information from them" are often-heard comments. It is my impression that police organisations (and prosecutors) could have more understanding of the capabilities of other enforcement agencies, in order to coordinate actions better. (What happens when two or three different organisations investigate the same botnet at the same time?!) Law enforcement is more than enforcing the law from a penal-code perspective. Other agencies may be better equipped than the police to solve a specific cyber crime on the basis of enforcing their "own" law; a "serious" crime could also be dealt with through, for example, a consumer protection act, or, working together, there is a higher chance of success. These are important lessons. Break down your silos!

Cyber security

Cyber security organisations like Computer Emergency Response Teams (CERTs) and Computer Security Incident Response Teams (CSIRTs) secure and monitor governmental and industry ICT systems, and alert and respond to breaches such as DDoS attacks or hacks. They have a lot of information and evidence that could assist enforcement agencies in doing their work. At the same time they can act on certain breaches in ways that law enforcement never could. Cooperation between the two is not something that comes easily, for dozens of reasons. Hence the need to break down silos and create understanding.

Industry

And what about industry? What information does it have on cyber crimes? If industry does not see the incentive to report all, let's say relevant, breaches to the proper authority, enforcement and security will never get the priority they deserve. Hence another reason to break down silos.

Who needs to act?

In the report by De Natris Consult, "National cyber crime and online threats reporting centres. A study into national and international cooperation", it is clearly shown that it is nearly impossible for an individual organisation to break down a silo on its own, simply because it is too difficult and not part of the organisation's primary task. So despite the fact that it is in the direct interest of a single organisation to be able to cooperate, it is nearly impossible to break through on your own when no one hears you knocking.
It is important, however, to report your impossibilities to those who can make a difference. How else will the people who can actually make a difference ever know? Start breaking down your own silo in the right places. So who needs to act? There are a few options. (My apologies to non-EU readers: I'm a bit EU-centric here, but please allow your imagination to run to your corner of the world and the options it provides.)
1. National government
Some questions could be asked that could make a difference over time. How does the centre change knowledge and perceptions over time? Does it make a solid inventory of the skills, complementary powers and different possibilities that different laws provide for fighting cyber crimes? Does it take a closer look at whether present laws provide the powers needed to fight the different forms of cyber crime?
2. International bodies
EC3 could open itself to more enforcement entities, e.g. by providing common training, coordinating cyber actions, etc. It does not do so at present, but it would be a good thing if EC3 looked into this option in the very near future. Who invites them to break down their silo? Fill in your option here .....
3. International projects
Conclusion

Why are all these questions so relevant? Because my bet is that all these agencies, from the military to secret services and from police to consumer fraud, spam and privacy agencies, are all looking for the same people who make the internet not a very safe place for business and pleasure today. There is (well, there should be) a strong need to cooperate and coordinate. Breaking down silos will not come easily, for many a reason. Still, if the people responsible for this task are to take it seriously, it is important to start asking the right questions. Let's do so at NLIGF this June, in Bali in October (where I will do so as moderator), in Vilnius in November, and in all places where you think it is possible and necessary to do so. I'm always happy to discuss further or to help create strategies or programs. The time seems right.

Written by Wout de Natris, consultant on international cyber crime cooperation and spam enforcement trainer
Spanish Joint-Network Investment in FttH Seeing Returns
Spain's economic anguish has had a number of repercussions for the country's telcos, with stable or declining revenue causing much nervousness as operators struggle to fund essential investment in spectrum and in both fixed-line and mobile networks. Earlier this year Vodafone felt the pinch, announcing plans to cut its Spanish workforce by up to 1,000. Though general economic conditions have not helped, the move partly resulted from the company's own decisions: having seen revenue drop for several quarters, it decided to save money by cutting handset subsidies. The ploy backfired: by the end of 2012 the company had lost 2.29 million mobile subscribers over the year, and as a result revenue dropped from £5 billion to £4.2 billion. Yet Vodafone is one of the key players in Spain's surging fibre market, where investment in networks is a precondition of customer growth and financial reward. In common with developments elsewhere (not least in the mobile sector), Vodafone is not going it alone but is sharing the cost with other parties; in Spain, it has partnered with Orange. Unlike in many other European markets, where operators have tended to concentrate on high-density cities (Paris, Milan, Amsterdam), in Spain FttH is more widely available in smaller towns and rural areas, often guided by the policies of regional governments. In this market there is plenty of room for smaller players to co-exist with the incumbent. Orange launched an FttH pilot in Madrid as early as 2010, and earlier this year it teamed up with Vodafone to invest up to €1 billion in a joint fibre network covering 50 of the largest cities. The two footprints are complementary: each company owns its fibre independently, but they share technical specifications to ensure compatibility as a single network, and each operator opens its own footprint to the other, making the entire network available to both. Orange recently switched on its fibre for commercial services, initially in Madrid, and planned to have some 800,000 premises connected to the network by March 2014, rising to three million by September 2015 and six million by 2017. In Madrid alone, up to 40,000 homes could be connected to the network. The Orange/Vodafone joint network is also open to third parties willing to co-invest, which could dramatically extend the availability of fibre to Catalonia and Asturias, where there are already extensive deployments through existing projects. These developments are encouraging, and show that even telcos operating through long-term economic doldrums can be reassured that sensible investment strategies will pay dividends down the track.
Written by Henry Lancaster, Senior Analyst at Paul Budde Communication
Follow CircleID on Twitter More under: Access Providers, Broadband, Mobile, Telecom Categories: Net coverage
gTLD Contention Auction in May: Request for Comments
Many gTLD applicants with strings in contention have already heard about the Applicant Auction, a voluntary private auction for resolving string contention that my colleagues and I are organizing. In this post we'd like to share some updates on our progress. Most importantly, we realized that more than just an escrow agent is needed for a private auction of this scale to succeed, and we have partnered with Morrison & Foerster LLP, a global law firm, which will act as the neutral party for our auctions. We had the opportunity to talk to many applicants in Beijing last week, and we received some great feedback and suggestions. We have distilled these conversations into a more detailed proposal covering the schedule, the policies on which information is published and which is kept confidential, the procedure for handling withdrawals, the handling of bid deposits, and more. Although many applicants have been asking us to hold an auction as soon as possible, and several have already committed to participate in the first auction, we would like to give all applicants a chance to review the proposal and submit final comments until Thursday this week (11pm UTC). Based on the applicants' input, the final schedule and rules for the first auction will then be published by Tuesday, April 30, and applicants interested in participating can then sign up their TLDs in an online enrollment system. We have summarized some of the suggested changes below, and we encourage participants to take a look at the full RFC and send us comments:
Schedule: We propose beginning Thursday, May 2, with publication of the auction rules and other legal documents, and we plan to hold the auction on Thursday, May 23. Interested parties will need to commit online by May 8. Dates are subject to change with input from participating applicants.
Information policy: As presented in the workshops, all bidders participating in a given auction can see the number of bidders still bidding for a domain in each round, for all domains being auctioned. The winning price, however, is not disclosed to all bidders; only the bidders for a particular domain can see the price at which that domain was sold. The amounts of bids and deposits will be kept strictly confidential.
Withdrawal procedure: Several applicants asked: what if I don't win the auction and, as required, I withdraw my application, but some of my fellow non-winning competitors don't? We took this concern very seriously and propose the following solution. Before the auction, bidders irrevocably authorize the neutral party to request a withdrawal from ICANN on their behalf. In addition, bidders that do not win are required to withdraw their applications via ICANN's online system and send a screenshot to the neutral party, along with a withdrawal statement signed by the bidder and two witnesses confirming that the withdrawal was performed. A bidder who does not submit proof of withdrawal will forfeit their deposit, and Morrison & Foerster LLP will take legal steps, if necessary, to execute the withdrawal. For bidders who do submit proof, the deposit is held until the neutral party has verified that the withdrawal took place. ICANN has assured us that withdrawals will be made public within 48 hours, and the neutral party will not release any payments or deposits until withdrawals have been confirmed by ICANN.
Deposit: Each applicant must make a deposit of at least 20% of the maximum amount the applicant would like to be able to bid, as noted previously, and the deposit must be at least $80,000. The purpose of the minimum deposit is to help ensure that bidders who do not win the auction withdraw their applications. To level the playing field for single-domain applicants who had requested this, we also made an important change from the previously proposed policy: the effective deposit does not increase if a participant becomes a seller for a TLD, and payments received for one TLD cannot be used to pay for another TLD within the same auction. Applicants participating in the auction with more than one TLD must make the minimum deposit for each TLD. (An illustrative sketch of this deposit arithmetic follows this post.) We hope that the procedure we have proposed adequately captures the feedback we received from applicants. Overall, there were surprisingly few topics on which we had to come up with a compromise; in most cases, applicants' preferences were in agreement. Where we did have to find a balance between different perspectives, we hope we have found solutions that will satisfy all applicants' concerns. We look forward to receiving comments on the Request for Comments posted on the Applicant Auction website.
Written by Sheel Mohnot, Consultant
Follow CircleID on Twitter More under: ICANN, Top-Level Domains Categories: Net coverage
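To make the deposit rule described in the post above concrete, here is a minimal sketch in Python, assuming the rule works exactly as stated: the deposit is at least 20% of the maximum intended bid, with an $80,000 floor, computed independently for each TLD and with no pooling of deposits or payments across strings in the same auction. The function names and example figures are ours for illustration, not the Applicant Auction's actual implementation.

```python
# Illustrative sketch of the published deposit rule; names and figures
# are hypothetical, not the Applicant Auction's actual implementation.

MIN_DEPOSIT = 80_000   # stated minimum deposit per TLD, in USD
DEPOSIT_RATE = 0.20    # deposit must be at least 20% of the maximum bid

def required_deposit(max_bid: float) -> float:
    """Deposit needed to be able to bid up to `max_bid` on one TLD."""
    return max(DEPOSIT_RATE * max_bid, MIN_DEPOSIT)

def max_allowed_bid(deposit: float) -> float:
    """Highest bid a given deposit supports on one TLD."""
    if deposit < MIN_DEPOSIT:
        return 0.0  # below the floor, the applicant cannot bid at all
    return deposit / DEPOSIT_RATE

def total_deposit(max_bids_per_tld: dict[str, float]) -> float:
    """Deposits are per TLD and cannot be pooled across strings in the
    same auction, so each string's deposit is computed independently."""
    return sum(required_deposit(bid) for bid in max_bids_per_tld.values())

if __name__ == "__main__":
    print(required_deposit(300_000))    # 80,000 (20% = 60,000, floor applies)
    print(required_deposit(1_000_000))  # 200,000
    print(total_deposit({".example": 500_000, ".sample": 250_000}))
    # 100,000 + 80,000 = 180,000 — one deposit per TLD, no offsetting
```

One consequence of the floor, under these assumptions, is that any applicant planning to bid up to $400,000 or less on a string posts the same $80,000, since 20% of $400,000 is exactly the minimum.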