CircleID posts

Latest posts on CircleID

NCUC Workshop: One World, One Internet? New gTLDs & Competition in a Changing Global Environment

Tue, 2013-04-02 23:12

The Noncommercial Users Constituency (NCUC) has organized and is holding a policy workshop, One World, One Internet? New gTLDs & Competition in a Changing Global Environment, next week in Beijing at ICANN-46. The program, which brings together top Western and Chinese experts, will explore pressures for integration versus fragmentation of the Internet and implications for ICANN, as well as different competition and regulation perspectives as they relate to new gTLDs.

Panelists include:

Tarek Kamel, Senior Advisor to President for Governmental Engagement, ICANN
Markus Kummer, Vice President of Public Policy, The Internet Society
William J. Drake, NCUC, and International Fellow and Lecturer, University of Zurich
Yongge Sun, Director, The Internet Society of China
He Baohong, MIIT China Academy of Telecommunication Research
Leonid Todorov, Deputy Director for Government and International Relations, Russian Registry for TLDs

When & Where:

The workshop will be held on Wednesday April 10, 2013 from 13:00 to 15:00 (Beijing time) in Function Room 8AB of the Beijing International Hotel. The workshop is open to everyone and is free to attend. However, for planning purposes, it is requested that you please register. Interpretation and remote participation facilities will be provided by ICANN, more details on this on ICANN's website.

More details on the workshop are provided by NCUC.

More under: ICANN, Internet Governance, Policy & Regulation, Top-Level Domains

What Is the Potential Business Impact of New gTLDs On Existing TLDs?

Tue, 2013-04-02 16:46

How will the business of existing top-level domains (TLDs) be impacted by the new gTLDs? Someone asked me this simple question and I was very surprised to see that my online searches couldn't easily find many detailed articles or research related to that point. I found a great number of articles about the potential impact of new gTLDs on regular businesses/brands and any number of articles about how great the new gTLDs will be for companies in the domain name industry, but found surprisingly little research or analysis into how the new gTLDs would impact the business of existing TLDs. I found a few examples of analysis at a ccTLD level (such as a report from NIC.AT), but not much looking at the domain name industry overall. Maybe I was just using the wrong search terms, but my searches yielded little with any detailed view.

So I ask you all here… what research or analysis is out there on this topic? Any suggestions and links left in the comments would be greatly appreciated. Thanks.

Written by Dan York, Author and Speaker on Internet technologies

More under: Domain Names, Top-Level Domains

Observations in and Around the UN Broadband Commission

Tue, 2013-04-02 06:57

Towards gender equality

The 7th meeting of the UN Broadband Commission in Mexico City was again a good combination of announcements about new plans, results of previously undertaken activities, and views on the future of broadband. Very noticeable was the enthusiasm and acknowledgement of the impact of ICT, and of broadband in particular.

In September 2012 the Commission launched its working group on gender equality. Research undertaken by the various members of the workgroup provided somewhat similar results:

  • Globally there is a 21% gender gap in relation to access to mobile phones, although in South-East Asia this gap is 37%.
  • 40% of women in developing economies find a job due to ownership of a mobile phone.
  • The global gap for internet access is 25%, while in the sub-Saharan countries this is 45%.
  • There are most likely thousands of gender equality pilots. Of these pilots, those that are now delivering results need to move on to the implementation stage.
  • Only 29% of the 119 national broadband plans around the world include policies for gender equality.
  • Empowering young people to adopt ICT will give them the ability to teach their parents, and the reverse of this will also apply.

A full half-day of the two-day meeting of the Commission was dedicated to gender equality in broadband. The following day the full Commission endorsed the goal set by the working group calling for global equality in broadband access by 2020. Women are key in household and community development, and gender equality will add between US$13 billion and US$18 billion to GDP (Intel, 2013).

7th Broadband Commission for Digital Development Meeting – Mexico City, Mexico, 16-17 March 2013. Photo: ITU

The Commission also specifically mentioned that gender equality should not be, or become, a separate single issue. It is not another 'ism'. It should automatically be included in all aspects of ICT, broadband and policies in general. At the moment, technology is not gender-neutral.

An unexpected good news story came from Iraq. In 2011 only 20% of women in that country had access to a mobile phone. Thanks to a new mobile package specifically designed for women by mobile operator Asiacell (part of the Qtel Group), 40% of Asiacell's subscriber base is now female, and an additional 1.8 million women will have access to a mobile phone by the end of 2014. The package specifically addresses the cultural aspects of womanhood in an Arab country — for example, female sales assistants, access to an all-female call centre, blocking of calls and SMS from certain people — and the way women use mobile phones — e.g., reduced tariffs for longer calls. It is to be hoped that the ideas and success of this initiative will spread.

The issue of violence against women was also highlighted. Worldwide there are most likely hundreds of millions of women who suffer abuse, and this was illustrated with shocking examples from the Syrian refugee camps in Jordan, where girls as young as 12 are forced to sell themselves in order to survive. Radio and TV programs are used by the Jordanian government to try to empower these girls, but ICT, and mobile phones in particular, can be used to break through this cycle of abuse.

One million ICT-empowered community workers

In January 2013 the One Million Community Workers program, aimed at providing one million smartphones to community workers — predominantly in sub-Saharan Africa, which has the largest group of least developed countries in the world — was officially launched and adopted by the African Union. Nine countries have already signed up to the program, with another six in the pipeline and more to follow. Both the smartphone vendor community and the mobile operators — MTN in particular — have given their support to this program. This is critical, as rural mobile coverage will have to be extended in these countries and low-cost smartphones need to be made available (Huawei announced that by the end of the year there will be a US$50 smartphone).

In relation to healthcare, the UN Foundation (UNF) mentioned that there is a huge shift towards bringing healthcare to people rather than bringing people to healthcare. Through m-health, healthcare will increasingly be delivered to the people. The UNF also recently launched a report on standards and interoperability in e-health.

New projects of the Commission

New projects that received support from the Commission included:

A commitment, similar to the gender equality goal, to promote digital accessibility for the one billion people with disabilities worldwide by stimulating the development of policies that will lead to equality in relation to ICT access. Between 30% and 50% of people with disabilities do not have access to the internet. In all developing economies, people with disabilities, together with older people, form by far the largest unconnected segment.

Youssou N'Dour – New Africa

Commissioner Youssou N'Dour, the famous African musician and Minister of Tourism of Senegal, received support for his project 'New Africa 2014'. I would like to recommend this very moving video clip to you. His aim is to encourage the use of ICT and broadband by the youth of Africa, through his music. Several Commissioners will attend and speak at his concert in Dakar, Senegal.

The Commission also launched a new Task Force on the post-2015 development agenda and the future Sustainable Development Goals (SDGs) — or, as some prefer to call them, Continuous Development Goals. The initiative aims to leverage the huge installed base of mobile handsets to bring new services to communities globally, particularly in the world's poorest countries. ITU's m-Powering Initiative seeks to act as a catalyst to achieve sustainability, harnessing the power of state-of-the-art ICTs and smart solutions to meet the new Sustainable Development Goals.

The Commission's working group on Youth will lead a Global Youth Summit on technology issues, to be held in Costa Rica in November at the invitation of President Laura Chinchilla. Interesting research presented at the meeting by Alcatel-Lucent indicated that in countries with high youth unemployment (Spain, Bangladesh, India, Ghana) 30% of young people indicated a willingness to become an entrepreneur by using their mobile phone and ICT skills.

As young people are quickly becoming tech-savvy it is critical to launch 'train-the-trainer' projects — train community workers, etc. The recently announced educational reforms in Mexico are a good example of a positive direction, as they include a much larger role for ICT in education.

The future of broadband

Last but not least, the future…

While promoting the development of national broadband access and affordability policies continues to be the key goal for the Commission, the focus is starting to shift towards 'broadband as a catalyst for social and economic transformation'. According to Ericsson, 6.5 billion people will be connected to the internet by 2018, and by that time 95% of the global population will have access to mobile technology, with the majority having access to a smartphone.

Several Commissioners were very pleased that access is well and truly underway in many developing countries, and noted that policy development now needs to encompass the demand side (services and applications). While progress has been made in bridging the digital divide, there is now a growing policy gap. This exists particularly in relation to government policies towards the development of e-health, e-education, e-government and e-commerce. There is increased awareness among governments and politicians that their citizens have a right to information, but the problem is that most of that information is not yet available. There is an urgent need to ensure that the supply side in relation to the broadband revolution is addressed as well.

This was demonstrated by an example from India, where the government is presented with one million questions per day. A reply often takes 90 days or more and, depending upon who answers it, the same question can receive different answers. Imagine the costs that could be taken out of the economy if e-government were widely available.

To illustrate the transformative impact of broadband, Ericsson reports that villages in the Amazon that have a mobile base station saw their GDP increase by 300%. This is done through a completely private project known as Amazon Connect.

On the other hand, the American government has calculated that not being connected to the internet creates an extra cost to the economy of $70,000 per year per family. Internet access allows families and the government to remove costs from their social and economic expenditure.

Another interesting observation is that there has been much faster growth in technology than there has been in the generation of government policies. Governments need to be made aware of the rapidly increasing gap between technology and policy. While this is an international problem — western governments are also struggling with such policies — the gap is growing most quickly in the least developed economies, and the Commission is committed to placing its full network of Commissioners behind the notion of assisting these countries in policy development. The key here is to lower the costs and give these countries complete solutions.

Written by Paul Budde, Managing Director of Paul Budde Communication

More under: Access Providers, Broadband, Mobile, Telecom

INET Denver: IPv4 Exhaustion and the Path to IPv6

Mon, 2013-04-01 21:45

INET Denver is April 17, 2013 — register today to reserve your spot!

You won't want to miss this unique opportunity to join IPv6 networking professionals from across North America, who will attend to learn the latest on IPv4 exhaustion and how to transition to IPv6. The INET Denver agenda will bring together top experts in the networking field to discuss the latest on IPv4 exhaustion in our market, and the TCO of IPv6.

The line up of speakers includes industry experts like:

John Curran, President & CEO, ARIN
Owen DeLong, IPv6 Evangelist, Hurricane Electric
Lee Howard, Director of Network Technology, Time Warner Cable
Dr. Patrick Ryan, Public Policy & Government Relations Counsel, Google

When:

April 17, 2013
Registration: 12:00 - 1:00 PM
INET Denver: 1:00 - 6:00 PM
Refreshments: 6:00 - 7:30 PM

Where:

Grand Hyatt Denver
1750 Welton Street
Denver, CO 80202

Additional Details:

http://www.internetsociety.org/events/inet-denver

Registration:

http://www.internetsociety.org/form/inet

INET Denver will be co-located with the 2013 North American IPv6 Summit. Take part in this unique opportunity to learn from top experts in the networking field discussing the latest on IPv4 exhaustion in our market and the TCO of IPv6.

Don't delay and register today!

More under: IP Addressing, IPv6

Second Round of Initial Evaluations for New gTLDs

Mon, 2013-04-01 20:31

Mary Iqbal writes to report that ICANN released the second round of Initial Evaluation results on March 29. ICANN is currently reviewing new gTLD applications at a rate of 30 applications per week and plans to increase that to 100 per week. ICANN is aiming to complete Initial Evaluation for all applicants by August 2013. To learn more, visit www.GetNewTLDs.com/news.

More under: ICANN, Top-Level Domains

ICANN Announces Blocking Usage Review Panel

Mon, 2013-04-01 17:52

Culminating a year-long policy development process, ICANN today launched its new Blocking Usage Review Panel (BURP). The BURP provides long-needed oversight over services that block Internet traffic.

"While everyone understands that national laws such as the U.S. CAN SPAM define what traffic is or is not elegible to block, legal processes can be slow and cumbersome," said a spokeswoman. "Since the Internet is global and traffic often traverses multiple countries, the array of different laws cause uncertainty."

The BURP is designed to be quick and easy. No signup process is needed, since everyone who sends traffic to or from the Internet is covered automatically. When a complaint is filed, an evaluation panel is selected with a member from each constituency:

  • IP based blocklists including Spamhaus, UCEPROTECT, SORBS, and Spamcop
  • Major brand advertisers including Kraft, the AARP, and Vistaprint
  • Public interest groups such as the Electronic Frontier Foundation, Free Software Foundation, and Stophaus

The BURP panel will meet and promptly produce its decision, typically in no more than six to ten weeks. During that time, to prevent inadvertent damage, any blocking will be suspended.

"While it is possible that a small amount of spam or malware might slip through during the decision period, we're confident that the increased transparency far outweighs any minor inconvenience," noted ICANN.

Spamhaus president Steve Linford, contacted at their temporary headquarters in space subleased from Google in Chapel Hill NC commented:

"Spamhaus welcomes this increased level of detailed oversight. We expect the BURP to increase confidence among major stakeholders including marketers, the press, and developers of installable software."

ICANN disclosed that they have hired a well known specialist in e-mail marketing, who recently completed a multi-year assignment.

"We are fortunate to have been able to retain Mr. Alan Ralsky to oversee the new BURP. His broad industry experience uniquely qualifies him for the role," said ICANN, "and the timing couldn't be better."

Written by John Levine, Author, Consultant & Speaker

More under: ICANN

U.S. CERT Issues Alert on DNS Amplification Attacks

Sun, 2013-03-31 19:22

Neil Schwartzman writes to report that US-CERT issued Alert TA13-088A on Friday, March 29, 2013. "It is a solid how-to guide to test for, and remediate, DNS configurations that can be used for Distributed Denial of Service attacks."

From the Alert: "While the attacks are difficult to prevent, network operators can implement several possible mitigation strategies. The primary element in the attack that is the focus of an effective long-term solution is the detection and elimination of open recursive DNS resolvers. These systems are typically legitimate DNS servers that have been improperly configured to respond to recursive queries on behalf of any system, rather than restricting recursive responses only to requests from local or authorized clients. By identifying these systems, an organization or network operator can reduce the number of potential resources that the attacker can employ in an attack."

More under: Cyberattack, DDoS, DNS, DNS Security, Security

A Closer Look at Recent Submarine Cable Failures

Sat, 2013-03-30 05:29

In light of the recent submarine cable failures, Doug Madory from Renesys has a detailed report on what has happened to some of the providers in four countries along the route of the cable: Egypt, Saudi Arabia, Pakistan and India.

Madory writes: "It has been a rough few weeks for the global Internet, given numerous submarine cable failures and the largest DDoS attack ever reported. While we're hard-pressed to find evidence of the purported global Internet slowdown due to the DDoS attack, the dramatic impacts of yesterday's SMW4 submarine cable cut were profound. Recent reports that the cable break was the result of sabotage make the incident even more intriguing."

Read the full report here.

More under: Access Providers, Broadband

Verisign Doesn't Think the Net Is Ready for a Thousand New TLDs

Sat, 2013-03-30 05:12

Yesterday Verisign sent ICANN a most interesting white paper called New gTLD Security and Stability Considerations. They also filed a copy with the SEC as an 8-K, a document that their stockholders should know about.

It's worth reading the whole thing, but in short, their well-supported opinion is that the net isn't ready for all the new TLDs, and even if they were, ICANN's processes or lack thereof will cause other huge problems.

The simplest issues are administrative ones for ICANN. In the olden days, updates to the root zone were all handled manually: signed email from ICANN to Verisign, who manages the root zone, with a check at NTIA, who oversees it under longstanding contracts. As the number of changes increased (more due to added IPv6 and DNSSEC records than to increased numbers of TLDs), the amount of email got unwieldy, so they came up with a new system where the change data is handled automatically, with people looking at secure web sites rather than copying and pasting from their mailboxes. This system is still in testing and isn't in production yet; Verisign would really prefer that it were before ICANN starts adding large numbers of new TLDs.

The new domains all have to use the Trademark Clearinghouse (TMCH), a blacklist of names that people aren't allowed to register. Due to lengthy dithering at ICANN, the TMCH operator was just recently selected, and they haven't even started working out the technical details of how registry operators will query it in real time as registrations arrive.

There are other ICANN issues as well: the process for transferring a failed registry's data to a backup provider isn't ready, nor is zone file access for getting copies of zone data, nor are the pre-delegation testing requirements done, and the GAC (the representatives from various governments) could still retroactively veto new domains even after they'd been placed in service.

All of these issues are well known, and the technical requirements have been listed in the applicant guidebook for several years, so it does reflect poorly on ICANN that they're so far from being ready to implement the new domains.

Most importantly, Verisign notes that the root servers, which are run by a variety of fiercely independent operators, have no coordinated logging or problem reporting system. If something does go wrong at one root server, there's no way to tell whether it's just them or everyone, other than making phone calls. Verisign gives some examples of odd and unexpected things that happened as DNSSEC was rolled out, and again their concerns are quite reasonable.

An obvious question is what is Verisign's motivation in publishing this now. Since they are the registry for .COM and .NET and a few smaller domains, one possibility is FUD, trying to delay all the new domains to keep competitors out of the root. I don't think that's it. Over 200 of the applications say that they'll use Verisign to run their registries, so Verisign stands to make a fair amount of money from them. And everyone expects that to the extent the new TLDs are successful at all, it'll be additional, often defensive registrations, not people abandoning .COM and .NET.

So my take on this is that Verisign means what they say: the root isn't ready for all these domains, nor are ICANN's processes, and Verisign, as the root zone manager, is justifiably worried that if they go ahead anyway, the root could break.

Update: Thu, April 4, 2013
A follow-up to Verisign's white paper discussed above, New gTLD Security and Stability Considerations, in which they listed a bunch of reasons that ICANN isn't ready to roll out lots of new TLDs. Among the reasons was that several of the services the new gTLDs are required to use aren't available yet, including the Emergency Back-End Registry Operators (EBEROs), who would take over the registry functions for a TLD whose operator failed. They were supposed to have been chosen in mid-2012. By complete coincidence, ICANN has now announced that it has chosen the three EBEROs. I can't wait to see what happens next week.

Written by John Levine, Author, Consultant & Speaker

More under: DNS, DNS Security, ICANN, Security, Top-Level Domains

The Spamhaus Distributed Denial of Service - How Big a Deal Was It?

Sat, 2013-03-30 02:49

If you haven't been reading the news of late, venerable anti-spam service Spamhaus has been the target of a sustained, record-setting Distributed Denial-of-Service (DDoS) attack over the past couple of weeks.

Al Iverson over at Spamresource has a great round-up of the news. If you haven't managed to catch it, go check it out, then come on back; we'll wait ...

Of course, bad guys are always mad at Spamhaus, and so they had a pretty robust set-up to begin with. But whoever was behind this attack was able to muster huge resources, of an intensity heretofore never seen, and it had some impact: on the Spamhaus website, and to a limited degree on the behind-the-scenes services that Spamhaus uses to distribute their data to their customers.

Some reasonable criticism was aimed at the New York Times and Cloudflare for being a little hyperbolic in their headlines and so on. And sure, it was a bit 'Chicken Little'-like: the sky wasn't falling and the Internet didn't collapse.

But don't let the critics fool you: this was a bullet we all dodged.

For one, were Spamhaus to be taken offline, their effectiveness in filtering spam and malware would rapidly decay, due to the rate at which their blocklists need to be updated. The CBL anti-botnet feed and the DROP list both have many additions and deletions every day. These services are used to protect mail servers and networks against the most malicious criminal traffic. If they go down, a lot of major sites would have trouble staying up, or become massively infected with malware.

There are also a ton of small email systems that use the Spamhaus lists as a key part of their mail filtering (for free as it turns out). Were those lookups prevented, or tampered with, those systems would buckle under the load of spam that they dispense with easily thanks to Spamhaus.
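For readers unfamiliar with how those lookups work: the connecting client's IP address is reversed and queried as a hostname under the blocklist's zone, and any answer in 127.0.0.0/8 means "listed". The sketch below, using only Python's standard library, shows the idea; the zone name and the conventional 127.0.0.2 test address are illustrative, and production mail servers should of course follow Spamhaus's usage terms rather than this toy check.

```python
# Rough sketch of a DNSBL lookup using only the Python standard library.
# The zone and test address are illustrative only.
import socket

def dnsbl_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """Return True if the reversed IP resolves under the DNSBL zone (i.e. it is listed)."""
    reversed_octets = ".".join(reversed(ip.split(".")))
    query_name = f"{reversed_octets}.{zone}"
    try:
        answer = socket.gethostbyname(query_name)  # e.g. "127.0.0.2" when listed
    except socket.gaierror:
        return False  # NXDOMAIN means "not listed"
    return answer.startswith("127.")

if __name__ == "__main__":
    # 127.0.0.2 is the conventional "always listed" test entry used by most DNSBLs.
    print(dnsbl_listed("127.0.0.2"))
```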

To put it into perspective, somewhere between 80% & 90% of all email is spam, and that's the stuff Spamhaus helps filter. So it doesn't take a Rocket Scientist to figure out that if filters go out, so do the email systems, in short order. AOL's Postmaster famously said, at an FTC Spam Summit a decade ago, before the inception of massive botnets, that were their filtering to be taken offline, it'd be 10 minutes before their email systems crashed.

Due to some poorly researched media reports (hello, Wolf Blitzer!), there is a perception that this is a fight between two legitimate entities, Spamhaus and Stophaus; some press outlets and bloggers have given equal time to the criminals (we use that word advisedly, there is an ongoing investigation by law enforcement in at least five countries to bring these people to justice). Nothing could be further from the truth. The attackers are a group of organized criminals, end of story. There is nothing to be celebrated in Spamhaus taking it on the chin, unless you want email systems and networks on the Internet to stop working.

So yeah, it was a big deal.

Written by Neil Schwartzman, Executive Director, CAUCE North America

More under: Cyberattack, Cybercrime, Data Center, DDoS, DNS, DNS Security, Email, Malware, Security, Spam

DNS Reflection/Amplification Attack: Proved

Fri, 2013-03-29 18:49

Last year there was a "threat" by anonymous group to black out Internet by using DNS Reflection/Amplification attack against the Internet DNS Root servers. I even wrote a little article about it: "End of the world/Internet

In the article I was questioning if this was even possible and what was needed as general interest and curiosity.

Well, looking at the "stophaus" attack last week, we are getting some answers.

I would say it is a real threat now and is a valid attack vector. Seems you only need a couple of ingredients:

Open recursive DNS servers

Many of these are already available, and their numbers keep increasing. This includes not only dedicated DNS server systems but, it seems, any equipment attached to the internet that is capable of handling DNS requests (cable modems, routers, etc.). So the risk that this will be exploited again grows greater every day.

A party that is capable of, and willing to, set it off

It seems there are more and more parties on the Internet that are willing to "attack" certain entities in order to defend their beliefs, in the above case even stressing the Internet itself and affecting the usage of everyone on it.

Infrastructure

Let's call it the "Internet", "logistics" and "bandwidth". Looking at the numbers, it is apparent that you need relatively little (in context) and that it is possible to pull this off if you want to. In terms of technology, services or otherwise it is not really challenging. And it does not even have to be done from a shady area or country.
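To make the "amplification" part concrete: the attacker sends a small query with the victim's address spoofed as the source, and the resolver sends a much larger response to the victim. The sketch below (again assuming the third-party dnspython package, with a placeholder resolver address that should be a server you operate yourself) simply compares the size of one query with the size of its response, which is roughly the amplification factor such an attack exploits.

```python
# Compare the wire size of a DNS query with the size of its response to get a
# rough amplification ratio. Assumes the third-party "dnspython" package;
# RESOLVER_IP is a placeholder (TEST-NET) and should be a resolver you operate.
import dns.message
import dns.query
import dns.rdatatype

RESOLVER_IP = "192.0.2.53"

def amplification_ratio(name: str = "example.com.") -> float:
    """Return len(response) / len(query) for a single UDP lookup."""
    query = dns.message.make_query(name, dns.rdatatype.ANY, use_edns=0, payload=4096)
    response = dns.query.udp(query, RESOLVER_IP, timeout=3)
    return len(response.to_wire()) / len(query.to_wire())

if __name__ == "__main__":
    print(f"amplification factor: {amplification_ratio():.1f}x")
```

A query of a few dozen bytes that yields a response of several kilobytes is why open resolvers are such attractive reflectors.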

I suspect we will see more of this happening now that the "proof of concept" is done. It still worries me what will happen when the real guns are pulled out and the focus shifts from particular entities to the root infrastructure of the Internet.

I have had a couple of talks with expert peers about how to mitigate this, and it is very difficult, as the attack is sheer load coming from every corner of the Internet. We really did not come up with a single solution. Mitigation would probably mean "breaking" some parts of the Internet as collateral damage, which in itself would probably be disruptive enough as well.

The main concern, again, is the "open resolvers" out there, which we cannot control without education and regulation on how DNS is deployed (you know, the things we tend to be allergic or apathetic about on the Internet).

The more thought I give this, the more I think the solution is not only technical but mostly an organisational, educational and regulatory one… Until that is in place, we will probably experience some outages…

Written by Chris Buijs, Head of Delivery

More under: Cyberattack, DDoS, DNS, DNS Security

Largest DDoS Attack To Date Aimed at Spamhaus Affects Global Internet Traffic

Wed, 2013-03-27 18:31

The internet around the world has been slowed down in what security experts are describing as the biggest cyber-attack of its kind in history. A row between a spam-fighting group and a hosting firm has sparked retaliation attacks affecting the wider internet. It is having an impact on popular services like Netflix — and experts worry it could escalate to affect banking and email systems.

Read full story: BBC

More under: Cyberattack, DDoS, Spam

Live Webcast Thursday March 28 of ION Singapore IPv6 and DNSSEC Sessions

Wed, 2013-03-27 18:00

For those of you interested in IPv6 and/or DNSSEC, we'll have a live webcast out of the Internet Society's ION Singapore conference happening tomorrow, March 28, 2013, starting at 2:00pm Singapore time.

Sessions on the agenda include:

  • The Business Case for IPv6 & DNSSEC
  • Deploying DNSSEC: From End-customer to Content
  • Industry Collaboration: Working Together to Deploy IPv6

Joining the sessions are a variety of speakers from across the industry and within the Asia Pacific region. Information about the webcast can be found at:

http://www.internetsociety.org/deploy360/ion/singapore2013/webcast/

We'll also be recording the sessions so you can view them later. For example, given that Singapore time is 12 hours ahead of U.S. Eastern time, I don't expect many of the folks I know there to be up at 2am to watch these sessions!

The ION Singapore conference is produced by the Internet Society Deploy360 Programme and is part of the ICT Business Summit taking place this week in Singapore. I just got to meet some of the panelists at a dinner tonight and I think the sessions tomorrow should be quite educational and also quite engaging and fun. Please do feel free to tune in if you are interested and have the chance to do so.

P.S. In full disclosure I am employed by the Internet Society to work on the Deploy360 Programme and for once a post of mine at CircleID IS related to my employer.

Written by Dan York, Author and Speaker on Internet technologies

More under: DNS, DNS Security, IPv6, Security

ICANN Launches the Trademark Clearinghouse Amid gTLD Expansion

Tue, 2013-03-26 17:43

ICANN today launched a database to enable trademark holders to register their brands for protection in the upcoming new gTLDs. The Trademark Clearinghouse, according to ICANN, is the only officially authorised solution offering brands a one-stop foundation for the safeguarding of their trademarks in domain names across the multiple new gTLDs that will go live from summer 2013. The cost of registering a trademark ranges between $95 and $150 a year.

More under: ICANN, Top-Level Domains

SQL Injection in the Wild

Mon, 2013-03-25 23:13

As attack vectors go, very few are as significant as obtaining the ability to insert bespoke code in to an application and have it automatically execute upon "inaccessible" backend systems. In the Web application arena, SQL Injection vulnerabilities are often the scariest threat that developers and system administrators come face to face with (albeit way too regularly). In fact the OWASP Top-10 list of Web threats lists SQL Injection in first place.

This "in the wild" SQL Injection attempt was based upon the premise that video cameras are actively monitoring traffic on a road, reading license plates, and issuing driver warnings, tickets or fines as deemed appropriate by local law enforcement.
(Click to Enlarge)More often than not, when security professionals discuss SQL Injection threats and attack vectors, they focus upon the Web application context. So it was with a bit of fun last week when I came across a photo of a slightly unorthodox SQL Injection attempt — that of someone attempting to subvert a traffic monitoring system by crafting a rather novel vehicle license plate.

My original tweet got retweeted a couple of thousand times — which just goes to show how many security nerds there are out there in the twitterverse.

This "in the wild" SQL Injection attempt was based upon the premise that video cameras are actively monitoring traffic on a road, reading license plates, and issuing driver warnings, tickets or fines as deemed appropriate by local law enforcement.

At some point the video captures of the passing vehicle's license plate must be converted to text and stored — almost certainly in some kind of backend database. The hacker who devised this attack hoped that the process would be vulnerable to SQL Injection, and crafted a simple SQL statement that could potentially cause the backend database to drop (i.e. "delete") the table containing all of the license plate information.
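To show the difference at the code level, here is a hypothetical back end for such a system (the table and column names, and the sample plate string, are invented for the illustration; nothing here is taken from the actual system in the photo). The vulnerable version pastes the recognised plate text straight into the SQL statement, while the safe version passes it as a bound parameter.

```python
# Hypothetical plate-logging back end using sqlite3 from the standard library.
# Table, column and sample values are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plates (plate TEXT, seen_at TEXT)")

captured_text = "ZU 0666','2013-03-25'); DROP TABLE plates;--"  # hostile OCR result

def log_plate_unsafe(plate: str) -> None:
    # VULNERABLE: the OCR output is concatenated straight into the statement,
    # so a crafted plate can terminate the INSERT and append its own SQL.
    conn.executescript(
        f"INSERT INTO plates (plate, seen_at) VALUES ('{plate}', '2013-03-25');"
    )

def log_plate_safe(plate: str) -> None:
    # SAFE: the value is passed as a bound parameter and can never become SQL.
    conn.execute(
        "INSERT INTO plates (plate, seen_at) VALUES (?, ?)", (plate, "2013-03-25")
    )

log_plate_safe(captured_text)      # stores the odd string harmlessly as data
# log_plate_unsafe(captured_text)  # would execute the embedded DROP TABLE
```

Parameterized queries are what defeat the "drop the table" trick, regardless of how strange the captured plate text looks.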

Whether or not this particular attempt worked I have no idea (probably not, if I had to guess); but it does help nicely to draw attention to this category of vulnerability.

As surveillance systems become more capable — digitally storing information, distilling meta-data from image captures, and sharing observation data between systems — it opens many new doors for mischievous and malicious attack.

The physical nature of these systems, coupled with the complexities of integration with legacy monitoring and reporting systems, often makes them open to attacks that would be classed as fairly simple in the world of Web application security.

A common failure of system developers is to assume that the physical constraints of the data acquisition process are less flexible than they are. For example, if you're developing a traffic monitoring system it's easy to assume that license plates are a fixed size and shape, and can only contain 10 alphanumeric characters. Meanwhile, the developers of the third-party image processing code had no such assumptions and will digitize any image. It reminds me a little of the story in which reuse of some object-oriented code a decade ago resulted in Kangaroos firing Stinger missiles during a military training simulation.

While the image above is amusing, I've encountered similar problems before when physical tracking systems integrate with digital backend processes — opening the door to embarrassing and fraudulent events. For example, in the past I've encountered similar SQL Injection vulnerabilities within systems such as:

  • Toll booths reading RFID tags mounted on vehicle windshields — where the tag readers would accept up to 2k of data from each tag (even though the system was only expecting a 16 digit number).
  • Credit card readers that would accept pre-paid cards with negative balances — which resulted in the backend database crediting the wrong accounts.
  • RFID inventory tracking systems — where a specially crafted RFID token could automatically remove all record of the previous hours' worth of inventory logging information from the database allowing criminals to "disappear" with entire truckloads of goods.
  • Luggage barcode scanners within an airport — where specially crafted barcodes placed upon the baggage would be automatically conferred the status of "manually checked by security personnel" within the backend tracking database.
  • Shipping container RFID inventory trackers — where SQL statements could be embedded to adjust fields within the backend database to alter Custom and Excise tracking information.

Unlike the process of hunting for SQL Injection vulnerabilities within Internet accessible Web applications, you can't just point an automated vulnerability scanner at the application and have at it. Assessing the security of complex physical monitoring systems is generally not a trivial task and requires some innovative approaches. Experience goes a long way.

Written by Gunter Ollmann, Chief Technology Officer at IOActive

More under: Security

So, How Big Is the Internet?

Mon, 2013-03-25 21:26

An excellent study, conducted (for reasons that will become clear) by an anonymous author, reaches this conclusion:

So, how big is the Internet?
That depends on how you count. 420 Million pingable IPs + 36 Million more that had one or more ports open, making 450 Million that were definitely in use and reachable from the rest of the Internet. 141 Million IPs were firewalled, so they could count as "in use". Together this would be 591 Million used IPs. 729 Million more IPs just had reverse DNS records. If you added those, it would make for a total of 1.3 Billion used IP addresses. The other 2.3 Billion addresses showed no sign of usage.

Notice that, of the roughly 4 billion possible IPv4 addresses, less than half appear to be "owned" by somebody and only 591 million appear to be active.
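As a quick back-of-the-envelope check on those proportions, here is the arithmetic against the full IPv4 space, using the aggregate figures quoted above:

```python
# Back-of-the-envelope check of the quoted figures against the full IPv4 space.
TOTAL_IPV4 = 2 ** 32                  # ~4.29 billion possible addresses

definitely_in_use = 450_000_000       # pingable, or with one or more open ports
firewalled = 141_000_000
reverse_dns_only = 729_000_000

used = definitely_in_use + firewalled           # the 591 million "used" figure
any_sign_of_use = used + reverse_dns_only       # the ~1.3 billion figure

print(f"total IPv4 addresses:    {TOTAL_IPV4:,}")
print(f"actively used:           {used / TOTAL_IPV4:.1%}")             # ~13.8%
print(f"showing any sign of use: {any_sign_of_use / TOTAL_IPV4:.1%}")  # ~30.7%
```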

The problem is that, to conduct the study, the author created a botnet — that is, he wrote a small program that took advantage of insecure devices to enlist additional machines to help in the study. What is amazing (if you are not a security researcher) is the extent to which he was able to co-opt insecure devices by testing only four name/password combinations, e.g. root:root, admin:admin and both without passwords.

This is very valuable research, and it was apparently done without causing anyone any harm. Nonetheless, the US government has treated this kind of research as a crime in the past, even before all the cyber security laws of the past decade. So I hope this researcher's anonymity holds.

Written by Brough Turner, Founder & CTO at netBlazr

More under: Web

ICANN Releases Initial Evaluation Results for First Set of New gTLD Applications

Mon, 2013-03-25 18:58

The first round of Initial Evaluation results has been released exactly on schedule. On March 23, ICANN announced that 27 out of 30 new gTLD applications reviewed this round passed Initial Evaluation. The remaining three applicants are still marked as being in Initial Evaluation. For more details, see '27 Applicants Passed Initial Evaluation in the First Round' via www.GetNewTLDs.com.

More under: ICANN, Top-Level Domains

To Tax or Not to Tax

Mon, 2013-03-25 18:36

The Writing's On The Wall

It is not news that the US has always maintained that the Internet should be a tax-free zone, as per the US Congress's Internet Tax Freedom Act of 1998 (authored by Representative Christopher Cox and Senator Ron Wyden and signed into law on October 21, 1998 by then President Clinton). Following its expiry the Act continued to be reauthorized; its most recent re-authorization (legal speak for extension) was in October 2007, extending it until 2014. It is unclear whether there will be another extension post-2014. The Act places a moratorium on new taxes on e-commerce and on the taxing of Internet access. Whilst it bars federal, state and local governments from taxing Internet access and from imposing discriminatory Internet-only taxes such as bit taxes, bandwidth taxes and email taxes, and also bars multiple taxes on electronic commerce, it does not exempt sales made on the Internet from taxation, as these may be taxed at the same state and local sales tax rate as non-Internet sales.

New Bill in the House

The introduction of the US Marketplace Fairness Act of 2013 in both the Senate and the House of Representatives will make for some interesting discussions and lobbying on the Hill. Whilst the Bill in its current form acknowledges the exemptions that are currently in place, the manner in which discussions among both Senators and Representatives are playing out reflects a change in atmospheric pressure, which in my mind is significant.

In 1998 the US Senate voted 96-2 to approve the Internet Tax Freedom Act, and the mere fact that the new Bill has 28 co-sponsors in the Senate and 47 in the House of Representatives is indicative either of a paradigm shift or of state coffers screaming to be filled.

S.336, the Marketplace Fairness Act of 2013, was introduced on February 14, 2013 and is sponsored by US Senator Michael Enzi [R-WY]; it has 28 co-sponsors (21D, 6R, 1I). The prognosis is that the Bill might not get past committee, with a 0% chance of being enacted.

H.R.684, the Marketplace Fairness Act of 2013, was introduced on February 14, 2013 and is sponsored by US Rep. Steve Womack [R-AR3]; it has 47 co-sponsors (25D, 22R). The prognosis gives it a 28% chance of getting past committee and an 11% chance of being enacted.

To Tax or Not to Tax

The term 'electronic commerce' (e-commerce) means any transaction conducted over the Internet or through Internet access, comprising the sale, lease, license, offer, or delivery of property, goods, services, or information, whether or not for consideration, and includes the provision of Internet access.

As early as 2000, the problems of tax-free e-commerce were discussed during the first E-Commerce Roundtable meeting in Washington D.C. If e-commerce proceeds untaxed, state treasuries face an eroding tax base. States within the United States of America rely on sales tax for approximately 25-40% of their revenue. As such there is a trade-off or opportunity cost, as other taxes may have to increase to make up for the deficit caused by tax-free e-commerce.

The deficit caused by tax-free e-commerce means that other taxes may have to increase, and that potential funding may be siphoned away from other priority areas. Traditional firms or businesses that do not trade electronically are at a disadvantage, as they are forced to collect sales tax at the register. This is why it is sometimes cheaper to purchase a pair of boots online than it is to walk into a traditional store.

One of the issues discussed at the E-Commerce Roundtable meeting was the widening of the digital divide, with people without credit cards or Internet access forced to shoulder the burden of sales tax.

E-Commerce Is Blossoming

Global business-to-consumer e-commerce sales will pass the 1 trillion euro ($1.25 trillion) mark by 2013, and the total number of Internet users will increase to approximately 3.5 billion from around 2.2 billion at the end of 2011, according to a new report by the Interactive Media in Retail Group (IMRG), a U.K. online retail trade organization, as reported by InternetRetailer.com. The study estimates that business-to-consumer e-commerce sales in 2011 increased to 690 billion euros ($961 billion), an increase of close to 20% from a year earlier.

According to that study, the US remains the world's largest single market as far as e-commerce goes. The same study highlighted that, given China's phenomenal growth rates, it is expected to surpass the United States in this regard shortly.

The US Department of Commerce reported that total retail sales for the fourth quarter of 2012 were estimated at $1,105.8 billion, an increase of 4% from the third quarter of the same year.

Only Time Will Tell

Whether the US Marketplace Fairness Act will eventually be passed and enacted is something that only time will tell, but the timing is certainly interesting.

Written by Salanieta Tamanikaiwaimaro, Director of Pasifika Nexus

More under: Internet Governance, Law, Policy & Regulation

Fiber to the Home: 'Awesome' - But What Is Its Purpose?

Fri, 2013-03-22 19:56

Two approaches can be taken towards the development of Fiber to the Home (FttH). One is all about its commercial potential — the sale of the most awesome commercial applications in relation to video entertainment, gaming and TV. The other is a perhaps more sophisticated approach — from the perspective of social and economic development.

Of course the two are not mutually exclusive. Those who successfully follow the commercial route create an infrastructure over which those other social and economic applications will eventually be carried as well. This is quite a legitimate route, but the reality is that most people in this situation will say 'the FttH entertainment applications are absolutely awesome, but totally useless'. In other words, nice to have but it is highly unlikely that people will pay for them.

We basically see this with such commercial FttH deployments around the world. Commercial FttH subscriptions cost consumers well over $100 per month, and at such a price penetration in developed countries will reach no more than approximately 20%. That will not be sufficient mass to launch other social and economic applications over such a network.

If we are serious about those national benefits we will have to treat FttH differently — not just as another telecoms network, but as national infrastructure. However the all-powerful telcos will fight such an approach tooth and nail, since that would make their network a utility. They are used to extracting premium prices based on their vertically-integrated monopolies and they are in no mood to relinquish this. Simply looking at the amount of money telcos spend on lobbying reveals that they do not want to see government making any changes to their lucrative money-making schemes.

It will be interesting to see what Google Fibre in Kansas City will do. Its price is more affordable (around $75) but it is still operating on that 'awesome entertainment' level. Will it be able to attract sufficient customers to eventually create that broader infrastructure that will be used by a far greater range of applications? We estimate that it would be able to achieve around 40% penetration, and if it could move past 'awesome but useless' that could grow to 60%. By that time sufficient mass would have been created to move to the next stage. So, all very doable over, let us say, a five-year period.

The good thing is that if any company can create such a breakthrough development it is Google. It is not a telco. It simply wants to prove the business case — that FttH makes business sense. If it can prove the commercial success of FttH, it is more likely that other telcos will follow. There is no way Google on its own can fibre the USA, let alone the world. So its role in relation to Google Fibre is to extend the global FttH footprint by example, as that would allow it to increase the number of next-gen applications and services. With its dominant position in this market, the spill-over from that is many times larger than the financial gains the company can make running a FttH network.

Written by Paul Budde, Managing Director of Paul Budde Communication

More under: Access Providers, Broadband

Technology Fights Against Extreme Poverty

Thu, 2013-03-21 19:05

One of the good things about participating in the meetings of the UN Broadband Commission for Digital Development is seeing the amazing impact our industry has on the daily lives of literally billions of people. While everybody — including us — is talking about healthcare, education and the great applications that are becoming available in these sectors, the real revolution is taking place at a much lower level.

If one looks in particular at those who live below the extreme poverty line of $1.25 per day then e-health and e-education are certainly not the first applications that reach these people. The most fundamental change happens when people get access to communications — thus extending their network beyond neighbours, who are probably living below the poverty line as well, and so are unable to do much to lift the community out of its misery. In the 1990s Broadband Commissioner Muhammad Yunus through his Grameen Bank initiative showed that a simple mobile phone (2G) in a Bangladesh village, and, by extension, in any other village operating below the poverty line, can lift the local economy by 20%. This technology gives access to data, and people can make calls to find out what is the best market to go to today to sell the fish they just caught, or find out what the market price is for their wheat (not just the price that their middleman is quoting).

Access to facts is liberating people, and with facts they can start improving their lives. Once people know something, it cannot be taken away from them and therefore will create a lasting change. People will use that knowledge, data and information to make social and economic improvements.

On a larger scale the same thing happens when access is obtained to facts that go beyond what the local politicians are providing, or hiding. The Arab Spring is a good example here. While its end result is not yet clear there is no way back once people have the facts; again, this is a very liberating experience and will ultimately lead to improving people's lives and lifestyles.

Another of the Broadband Commissioners, Dr Mohamed Ibrahim, the founder of Celtel in Africa, is a staunch supporter of the movement 'one.org'. This grassroots, non-political organisation is concentrating on eradicating extreme poverty and statistics are showing that this could be possible before 2030.

Extreme poverty has already declined and this trend is accelerating. In 1990 43% of the global population fell into the category of extreme poverty; by 2000 this had dropped to 33%; and by 2010 it had dropped further, to 21%. Interestingly, the fastest acceleration of this trend is taking place in most of the poorest countries in Africa.

Rock star and activist Bono stated in a recent TED presentation that the major obstacles to this process of acceleration are inertia, loss of momentum and corruption. The silver lining here, especially in relation to the latter, is that again technology is a driving force for change. With access to communications and facts it becomes much easier to expose corruption. Technology makes it easier to create a more transparent society and, while corruption will never be stamped out altogether, extreme corruption will be greatly reduced.

It is great to work with the Broadband Commission to develop projects and programs, using our technologies, to ensure that social and economic processes accelerate these positive developments, creating greater equality. The high ranking of those involved makes it possible to get these messages across at the highest levels of government and at the highest levels of governance of the international organisations addressing these issues.

Written by Paul Budde, Managing Director of Paul Budde Communication

More under: Access Providers, Broadband