Joho the Blog » broadband

August 4, 2010

Akamai report: U.S. broadband speeds continue to fall behind

Akamai is in a unique position to judge actual broadband speeds around the world. Its latest “state of the Internet” report says that the U.S. is continuing to fall behind.

BroadbandBreakfast’s takeaway is:

…Only 25% of the US has access to a connection above 5Mbps.

The fastest city in the world is Masan, South Korea, which has an average maximum connection speed of 40.56Mbps; the United States’ first showing is at number 57, with Monterey Park, CA, at 25.2Mbps.

When looking at average connection speeds, the United States again lags behind the rest of the world. Monterey Park, CA, which has the fastest maximum connection, also has the fastest average connection, at just 7Mbps.

Within the United States, Delaware boasts the fastest average measured connection speed, 7.6Mbps, with the District of Columbia next at 5.9Mbps. The slowest states in the nation are Alaska and New Jersey.

Doc Searls has a different take-away. He notes that Akamai only reports on download speeds, not uploads, because Akamai is among the set of institutions — which includes the U.S. access providers and, alas, too much of our government — that thinks the Net is primarily for the passive consumption of content. Doc has written about this here (and I recommend the discussion in the comments as well).


Meanwhile, as a single data point that proves nothing, but does let me vent: Our daughter is moving into an apartment in Brighton, Mass. The landlord has done a deal that gives Comcast exclusive rights to provide Internet access, freezing out Verizon and RCN, both of which are available next door. So, we’ve been looking at Comcast’s service plans. Herewith a rant:

My daughter only wants Net access, not TV or landline, but Comcast makes it as hard as possible to buy unbundled service. E.g., Comcast’s service-offerings page has four tabs across the top: Net, TV, Phone, Bundles. We are on the Net page. But, guess what? All of the offers on that page turn out to be for bundles. There is functionally no way to buy unbundled Internet service from Comcast over the Net. Or, if there is, they’ve successfully hidden it. Well done!

The Comcast Web site is a mess. On the same page (here, but I had to go through an address verification to get there) the same offer — “Performance” — is listed twice, at different prices. Further, the service description notes “This special price is for customers who currently subscribe to Comcast Cable or Comcast Digital Voice® service,” but the Terms and Conditions make no mention of that. Further, there is no information about what the price would be for non-subscribers and people who don’t want to buy a bundle.

A long phone call revealed that the price for “Performance” is about $60/month for 15Mbps down and 3Mbps up. (Of course, those are maximums; there is no guarantee of what actual speeds will be if, say, there are “broadband hogs” — i.e., people who use more of what they’ve paid for than Comcast wants). My daughter would prefer to pay less for a lower broadband rate, but the only lower offer is for a tenth of the capacity — they call it “Economy” but they ought to rename it “The Email Package” — which is too little for her needs.

The landlord’s exclusivity deal has locked out competition, but Comcast’s pricing, packages, and anti-user Web site are its responsibility.

1 Comment »

August 3, 2010

Tough love for Jules Genachowski

Harold Feld, who I consider to be one of the essential commenters on FCC issues, has written a “tough love” post, urging FCC Chair Jules Genachowski to take decisive action and lead the FCC. I agree. I think JG can do great things at the FCC. He should do them beginning now.

My hunch — and it’s nothing more than that — is that JG is trying to lead in the Obama-esque way: according each side its dignity and trying to find common ground. I support that when it has a possibility of working. I supported that even when it failed for Obama, because it was important to remind Americans that strong leadership doesn’t mean contemptuously disregarding those who disagree with you. But I also supported Obama when, after giving reconciliation a more than generous effort, he stood firm and acted.

It’s time for Genachowski to stand firm and act at the FCC. He has a vision for the Internet as a place where small voices speak and where new ideas get a fair chance. He understands the Internet as a potentially transformative force in culture, business, education, and democracy. He will not achieve his vision by compromising with those who view our Internet as their delivery channel for commercial content.

Jules Genachowski can have a transformative impact. It is far from too late for that. The Genachowski FCC can clear the way for the Internet — our Internet — to achieve its transformative possibilities for culture, business, education, democracy. I believe in Genachowski’s vision. I trust his intentions. I hope he will act.

7 Comments »

July 29, 2010

GE pushes ahead with software-defined radio … good news for civilians, too?

In a press release that is barely comprehensible (or, quite possibly, totally incomprehensible) to one such as I, GE has announced a new generation of components that can be used for, among other things, software-defined radios. It is unclear to me whether this technology is designed for anything except military use, but …

Software-defined radios (SDRs) are not the next generation of transistor radios or boomboxes (ask your parents, kids). They are radios in the more primordial sense of being devices that can receive radio-wave signals. The radios you and I are used to are hard-wired to do one thing: tune into specific frequencies and translate the radio signals into toe-tapping tunes or the blather of infuriating talk show hosts. SDRs can be programmed to do anything with any type of signal they can receive. For example, they might treat messages as, say, maps, or as signals to turn on the porch light … or as Internet packets.
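The “software-defined” idea — one receiver, many behaviors chosen in software rather than in hardware — can be sketched in a few lines. This is purely illustrative; the handler names and payload format are made up, not any real SDR API:

```python
# Illustrative sketch: the same received bytes can be handed to whatever
# handler the software chooses. A hard-wired radio has exactly one
# behavior; an SDR picks its behavior at runtime.

def as_audio(payload: bytes) -> str:
    return f"playing {len(payload)} bytes of audio"

def as_porch_light(payload: bytes) -> str:
    # Hypothetical convention: first byte 1 means "on"
    return "porch light ON" if payload and payload[0] == 1 else "porch light OFF"

def as_packet(payload: bytes) -> str:
    return f"forwarding {len(payload)}-byte Internet packet"

HANDLERS = {"audio": as_audio, "porch_light": as_porch_light, "packet": as_packet}

def receive(payload: bytes, mode: str) -> str:
    # The "mode" is just software configuration — reprogram it and the
    # identical radio hardware does something entirely different.
    return HANDLERS[mode](payload)
```

The point of the sketch is only that the mapping from signal to meaning lives in replaceable software, not in the circuitry.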

SDRs matter a lot if only because they promise an alternative to the current broadcast medium. The way it works now, the FCC divvies up spectrum (i.e., frequencies) for particular uses and sells much of it to particular broadcasters. So, your hard-wired radio responds to particular frequencies as carriers of acoustic information sent by known, assigned providers: 106.7 on your radio dial, or whatever. This is a highly inefficient use of spectrum, like dedicating particular lanes of a multi-lane highway to specific trucking companies. It’d be far more efficient if transmitters and receivers could intelligently negotiate, in real time, which frequencies they’re communicating on, switching to frequencies that are under-trafficked when a particular “lane” is jammed. If our radio receivers — not just our in-dash radios, but all devices that receive radio wave transmissions — were smart devices (SDRs), we could minimize the amount of spectrum we assign to a handful of highly-capitalized broadcasters. We would have more bandwidth than we could eat.
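The lane-switching idea above — pick whichever channel is currently under-trafficked — is the heart of dynamic frequency selection. A toy sketch, with hypothetical frequencies and occupancy numbers (real protocols, such as those in the TV white-spaces rules, are far more involved):

```python
# Toy dynamic frequency selection: given measured occupancy (0.0 = idle,
# 1.0 = saturated) for each candidate channel, move to the least-used one
# instead of staying on a permanently assigned frequency.

def pick_channel(occupancy: dict) -> float:
    """Return the frequency (MHz) with the lowest measured occupancy."""
    return min(occupancy, key=occupancy.get)

# Hypothetical sensing results for three channels:
sensed = {98.1: 0.90, 101.3: 0.15, 106.7: 0.75}
best = pick_channel(sensed)  # 101.3 — the under-trafficked "lane"
```

A hard-wired receiver is stuck with its one assigned frequency no matter how empty the neighboring lanes are; the whole gain from SDRs comes from being able to re-run this selection continuously.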

So, I think it’s good news that GE is pushing ahead with this and is commercializing it … unless I’m misunderstanding their announcement, the technology’s uses, and GE’s intentions to commercialize it.

1 Comment »

July 20, 2010

The competitive difference

Brough Turner has done some investigative work. Here’s the photo that summarizes it:

The following is an edited, paraphrased version of Brough’s comments on the mailing list I got this from (with Brough’s permission):

In the picture, the building on the right is 111 Huntington Avenue in Boston. It’s served by 7+ separate carriers, each of which owns its own fiber into the building. The price quoted on the slide is Cogent’s list price for a 3-year contract (lower prices and/or shorter terms are available to those who can wait for an end-of-quarter special).

The building on the left is 170 Huntington Avenue in Boston. There is Verizon fiber into this building, but apparently no other carrier has its own fiber into the building. The price quoted is what a friend’s IT department signed up for less than 45 days ago.

In both cases we are comparing “dedicated” services, i.e. a supposedly committed information rate service. Yes, Verizon’s price per Mbps would be better if the customer had ordered 155 Mbps, but the disparity would still be outrageous.

Gotta love competition. Brough’s case study is one more data point confirming Yochai Benkler’s massive study of broadband around the world [pdf] that found the countries that surpass the US in price and penetration are generally ones with competitive markets for broadband.

Be the first to comment »

July 15, 2010

Verizon wants to own the exchange of health information

According to a post by Carl Brooks at SearchCloudComputing, Verizon is making a major push to be the provider of health information exchange services:

The Verizon Health Information Exchange can be used by doctors and healthcare providers to store, manage and transfer patient information, including medical records, test results, medical images and more, all hosted on Verizon’s infrastructure.

The project is nothing if not ambitious. Verizon says it is ready to roll nationwide and can absorb as many electronic medical records (EMR) as are currently out there; there may eventually be one for every person in the United States. It may even offer personal health records (PHR) to its telco customers.

This sort of service seems valuable. In fact, it’s so valuable that it makes me nervous that it would be in the hands of a telecommunications provider. For example, MedVirginia says that “its entire base of patient records will be stored with Verizon and delivered via the cloud.” Are we sure that this vital service should be in the hands of a company that also sells access to the cloud? Will there be temptations for Verizon to use its ownership of the medical records infrastructure (“store, manage and transfer patient information”) to leverage its position as an access provider, or vice versa? Will medical images in Verizon’s vault arrive faster for Verizon’s ISP customers? Is that what we really want? Wouldn’t it be better for us all to have this service in the hands of someone who has zero interest in how we access that information? And I’m putting all of these as questions because I have vague suspicions but nothing more.

Maybe I’m just especially nervous because today is the last day to leave a comment for the FCC about Net neutrality.

4 Comments »

June 4, 2010

[pdf] Susan Crawford: Rethinking broadband

Susan Crawford says, “We are in the course of a titanic battle for the future of the Internet in the United States. The technology community is radically underrepresented in this battle.”

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Telephone providers and cable providers have each been merging, increasing monopoly holds on regions. The government has a key role in providing a level playing field for innovators. If you’re worried about personalization at the app level (as per Eli Pariser yesterday), you should be very worried about it at the network level.

“The Net would not exist absent government regulation.” E.g., the telcos were required to allow modems to attach to telephone lines. When cable modems arrived, government regulators were confused. Thinking that competition was right around the corner, the FCC completely deregulated highspeed Net access in 2002 (and again in 2005, 2006, and 2007). They took away the “regulated” level but reserved the right to reregulate it (via “ancillary jurisdiction”). The courts have found that labeling a service as deregulated but then regulating it (as in the Comcast case) makes no sense. So, the FCC is proposing to re-regulate, but free of the heavy-handed elements: no rate regulation, etc. But carriers would be required not to discriminate among bits [= Net neutrality]. This is the FCC’s “Third Way.” The carriers claim that this is the “nuclear option.”

The FCC needs to regulate to fulfill its mandate to enable Net access for all people. E.g., it needs to gather data. And it wants to make sure the Net stays open for innovation. Also, to keep packets private. It’s great that AT&T is part of this conversation at PDF. But AT&T has spent $6M this quarter on lobbying against any form of regulation. There have also been personal attacks, she says. Comcast spent $29M in the first quarter, she adds.

By 2012, the FCC says, most Americans will have only one choice of provider. [June 5: Susan's slide actually said that by 2012, 75 to 85 percent of Americans will have one choice of wired provider for 50 to 100Mbps speeds; sorry for the gross gloss. This comes from the National Broadband Plan.] Verizon has backed off on its plans for FIOS. So there will not be another competitor to cable. We should therefore be concerned about Comcast’s plans to merge with NBC, giving it an edge against other major video providers, but also against the growth of online video. Comcast could put content behind an authentication wall, so to see it you’d have to be a cable subscriber. The tech community should watch this merger carefully.

The content providers believe in “vertical integration,” so we’ll see many more mergers.

She says that 100 years ago, Americans hated Standard Oil, which was able to control regional production of oil. Small business people and farmers were enraged by it. Standard Oil required railroads to ship its oil at a discount, and when the railroads shipped competitors’ oil, Standard Oil got a cut. It also carried out espionage on competitors’ shipments. Like the electric grid, like the Net, the future of highspeed access depends upon government creation of a level playing field. The tech community should be working together to make sure we retain the ability to innovate.

[I interviewed Susan about the FCC's Third Way on a Radio Berkman podcast] [Note: On June 5, I made some very minor edits, cleaning up typos and unclear referents, etc., in addition to the insertion noted above.]

19 Comments »

May 22, 2010

Understanding spectrum

Christian Sandvig explains spectrum and spectrum policy in this Radio Berkman interview.

Christian — who is both brilliant and a wonderfully generous colleague — textifies the main points here.

1 Comment »

May 7, 2010

Harold Feld’s FCC explainer

Harold Feld explains the FCC “third way” reclassification decision. He goes into a moderate amount of detail, but this is perhaps the takeaway:

…I call this a “legal reset.” Basically, Genachowski is saying “Back in 2002, when we moved cable modem service (and later other forms of broadband access) into the Title I/information services/ancillary authority box, we thought we would still have authority to protect consumers and do other necessary policy things. The Comcast court told us we were wrong. So now we’re going to move broadband access service into the Title II/telecom service box. But nothing substantive/policy changes. We’re just doing what the DC Circuit told us to do by articulating a different theory of authority.”

2 Comments »

May 5, 2010

FCC to announce a “third way”

The FCC has said it’s going to announce on Thursday a “third way” to regulate the broadband access providers to make sure that they leave the Net open and neutral. The first two ways are (1) to give up on protecting the Internet, or (2) to reclassify the Net as a communications network that counts as a common carrier (i.e., it has to let all bits go through equally, regardless of the app, origin, content, etc.).

The Washington Post headline of the AP story unfortunately reads “FCC to impose some new regulations on broadband,” thus reversing the actual meaning, which is expressed in the lead sentence: “Federal regulators plan to impose additional rules on broadband providers.” Big, big difference.

Anyway, this is a happier day than two days ago. For how happy, we’ll have to wait until Thursday’s announcement…

8 Comments »

April 23, 2010

FiberFete and Plenums

I gave the closing talk at FiberFete on Thursday. FiberFete was a celebration of the complete fiber-ing of Lafayette, Louisiana — an impressive story of a city struggling to overcome entrenched interests with a vision of how low-cost bandwidth can bring about major benefits in education, medical care, and the economy. The Fete was organized by Geoff Daily and David Isenberg as a celebration, and as a way to stimulate interest and enthusiasm in what a fully connected city can do. The day was impressive and even moving as we heard from the CIOs of San Francisco and Seattle, technologists, visionaries, and an awesome group of Lafayette teachers and students.

David wanted me to talk about what we could do if we had ubiquitous, high speed, open, symmetric (i.e., roughly the same speed for uploading and downloading) connectivity. Since I don’t know what we could do, I tried to beg off, but David insisted. So, here’s a summary of what I said in my twenty minutes.

The important thing about ubiquity is not the percentage of people connected, but the ubiquity of the assumption of ubiquity. E.g., we assume everyone has access to a phone, even though “only” 95.7% of American households have one (including cell phones). Nevertheless, the assumption creates a market for innovation.

The core of that assumption is an assumption of abundance…an abundance of information, links, people, etc. Our brains have difficulty comprehending the abundance we now have. There are so many people on line that the work of 1% can create something that boggles the mind of the other 99%. As more people come on line, that rule of 1% will become a rule of 0.01% and then 0.001%. The curve of amazement is going straight up.

The abundance means we will fill up every space we can think of. We are creating plenums (plena?) of sociality, knowledge and ideas, and things (via online sensors). These plenums fill up our social, intellectual and creative spaces. The only thing I can compare them to in terms of what they allow is language itself.

What do they allow? Whatever we will invent. And the range of what we can invent within these plenums is enormous, at least so long as the Net isn’t for anything in particular. As soon as someone decides for us what the Net is “really” for, the range of what we can do with it becomes narrowed. That’s why we need the Net to stay open and undecided.

These abundances are not merely quantitative. They change the nature of what they provide. And they refuse to stay within their own bounds. For example, we go online to get information about a product, probably through a mobile device. There we find customer conversations. These voices are not confined to giving us product reviews. We are also ubiquitously connected to pragmatic advice, to new businesses and institutions that compete with or make use of the item we’re engaged with, to governmental and legal information. If people are unhappy with the product, they may use their online meeting spot as a way to organize an activist movement.

In other words, Clay Shirky is right: The Net makes it ridiculously easy to form groups. In fact, when your information medium, communication medium, and social medium are all precisely the same, its ubiquity will make it hard not to form groups. For example, if your child has a bad cough, of course you’ll go online. Of course you’ll find other parents talking about their kids. Your information search has become a communicative enterprise. Because you’re now talking with other people who share an interest, your communication is likely to spawn a social connection. These plenums just won’t stay apart.

Furthermore, many of these networked groups will be hyperlocal, especially within localities where connectivity is ubiquitous. As we get more of these locations, hyperlocal networks will connect with other hyperlocal networks, creating superlocal networks (although I have no idea what I mean by that term).

These plenums will affect all of our institutions because they remove obstacles to our being more fully human.

37 Comments »
