Joho the Blog » broadband

August 27, 2012

Big Data on broadband

Google commissioned the compilation of

an international dataset of retail broadband Internet connectivity prices. The result was an international dataset of 3,655 fixed and mobile broadband retail price observations, with fixed broadband pricing data for 93 countries and mobile broadband pricing data for 106 countries. The dataset can be used to make international comparisons and evaluate the efficacy of particular public policies—e.g., direct regulation and oversight of Internet peering and termination charges—on consumer prices.

The links are here. WARNING: a knowledgeable friend of mine says that he has already found numerous errors in the data, so use them with caution.


July 13, 2012

How Google may turn its Kansas City broadband project into a business

As you likely know, Google is in the midst of providing ‘ultra high speed fiber’ access to the residents of Kansas City (MO and KS). (‘Ultra high speed’ means at least 1Gbps, which is 50-100x faster than your 10-20Mbps connection.) This has been positioned as an experiment, and as a poke in the eye to the incumbents to “show ‘em how it’s done.” And it has apparently made the incumbents nervous enough to offer residents a bounty for tips about the deployment.

Now Bill St. Arnaud speculates about how Google is going to turn this into a business. I have zero idea if he’s right, simply because I don’t know enough to have an opinion, but it sure is some interesting speculation.

Bill’s post is very readable, so I suggest you not rely on my summary, but here goes. First, Bill wonders how Google could hope to make back its investment in the physical infrastructure, since providers need about 40% of the market to subscribe to drop the per-user cost sufficiently. But (Bill figures), the incumbents will never let Google take 40% of their market. So, Bill figures:

Google will offer a basic free high speed Internet to each and every home, perhaps bundled with Google TV using their new set top box. A variety of premium services will also be offered for additional fees. I would not be surprised that Google decided to offer a basic 1 Gbps service to every home. This would clearly differentiate Google from the cableco or telco and make it almost impossible for them to compete without undertaking a massive investment themselves.

But, Bill guesses that the premium services will still not make the venture profitable. So, he speculates that Google…

…could offer to peak manage the customer’s power usage, by briefly turning off air conditioners and hot water tanks. They could also install smart thermostats and other devices to further reduce energy consumption. The money in the energy savings would be used to pay for the fiber or premium services, rather than being returned to the customer as piffling amount of energy savings.

So, the deal to users would be: We’ll give you incredibly high speed connectivity (or we’ll give you some great premium services) if you’ll let your energy company install a smart thermostat and manage your peak energy consumption in ways you won’t much notice. The user’s energy bills don’t go down (or don’t go down in proportion to the decrease in their energy consumption), and the energy company shares the money with Google.

I’m not convinced that users would take the deal positioned that way. Maybe I’m positioning it wrong, but it seems like a pretty complex offer. I think I’d rather take a deal with my energy company to lower my usage and my costs, and then decide if I want to pay Google for fiber access or for premium fiber access. I already resent the cablecos for making their “triple play” (telephone, tv, Internet) pragmatically a requirement to get any one of the three. A double play of Internet and energy savings would be even weirder.

But Bill knows approximately 50x more about fiber than my own poor brain does. The key, I believe, is the energy company being able to claim that the decrease in energy consumption will be minor, the noticeable impact on the user will be negligible, and the monetary savings would be “piffling.” If he’s right, it’ll be fascinating to watch.

 


This isn’t right, is it?

[Image: Seemingly wrong WolframAlpha result]


July 7, 2012

[2b2k] Big Data needs Big Pipes

A post by Stacey Higginbotham at GigaOm talks about the problems of moving Big Data across the Net so that it can be processed. She draws on an article by Mari Silbey at SmartPlanet. Mari’s example is a telescope being built on Cerro Pachón, a mountain in Chile, that will ship many high-resolution sky photos every day to processing centers in the US.

Stacey discusses several high-speed networks, and the possibility of compressing the data in clever ways. But a person on a mailing list I’m on (who wishes to remain anonymous) pointed to GLIF, the Global Lambda Integrated Facility, which rather surprisingly is not a cover name for a nefarious organization out to slice James Bond in two with a high-energy laser pointer.

The title of its “informational brochure” [pdf] is “Connecting research worldwide with lightpaths,” which helps some. It explains:

GLIF makes use of the cost and capacity advantages offered by optical multiplexing, in order to build an infrastructure that can take advantage of various processing, storage and instrumentation facilities around the world. The aim is to encourage the shared use of resources by eliminating the traditional performance bottlenecks caused by a lack of network capacity.

Optical multiplexing here means carrying multiple signals, each at a different wavelength, over a single optical fiber. And these wavelengths are known as … wait for it … lambdas. Boom!

My mailing list buddy says that GLIF provides “100 gigabit optical waves”, which compares favorably to your pathetic earthling (um, American) 3-20Mbps broadband connection (maybe 50Mbps if you have FiOS), and he notes that GLIF is available in Chile.
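
To get a feel for the gap, here is a back-of-the-envelope sketch in Python. The 15 TB-per-night figure is a made-up placeholder, not a number from either article; the link speeds are the ones mentioned above.

```python
# Rough transfer-time arithmetic. The 15 TB/night payload is a hypothetical
# placeholder, not a figure from the GigaOm or SmartPlanet articles.
NIGHTLY_BITS = 15e12 * 8  # 15 terabytes of telescope images, expressed in bits

LINKS_BPS = {
    "20 Mbps consumer broadband": 20e6,
    "50 Mbps FiOS-class connection": 50e6,
    "100 Gbps GLIF lightpath": 100e9,
}

for name, bits_per_second in LINKS_BPS.items():
    hours = NIGHTLY_BITS / bits_per_second / 3600
    print(f"{name}: {hours:,.1f} hours to move one night of data")
```

On the consumer connection, that one hypothetical night of images takes roughly ten weeks to move; over a 100 Gbps lightpath it takes about twenty minutes, which is the difference between a pipeline and a clog.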

To sum up: 1. Moving Big Data is an issue. 2. We are not at the end of innovating. 3. The bandwidth we think of as “high” in the US is a miserable joke.


By the way, you can hear an uncut interview about Big Data I did a few days ago for Breitband, a German radio program that edited, translated, and broadcast it.


July 28, 2011

Gig U

The plan to provide ultra high speed Internet connectivity to universities (mainly in the heartland) is exciting. And it’s got some serious people behind it, including Lev Gonick and Blair Levin.

The NY Times article, seeking to find something negative to say about it, finds someone who doubts that providing significantly higher speeds will lead to innovative uses of those greased-lightning pipes. Does history count for nothing?


February 18, 2011

National Broadband Map: What the incumbents hath failed to wrought (wring?)

The National Broadband Map is now available. We had wanted to bask in the sunlight provided by the incumbent access providers, but instead we just got freckles. Want to laugh like a broken-hearted clown? View only the places that have fiber to the home.

The map was controversial from its inception, since, at least initially, it relied on data from parties with an interest in exaggerating the extent of coverage. (I interviewed Steve Rosenberg at the FCC about his agency’s contribution to it in November 2009.) It does not link to its sources. It did not list my access provider (RCN) as available where I live. Also, the map is very clunky to manipulate. (Hint: Turn off all overlays until you zoom in to the area you want to see.) (Harold Feld provides a balanced perspective.)

Now want to cry like a generation watching its future slip away? The Republicans seem set on throwing The Master Switch to turn the Internet into a corporate content delivery system. Let your Senators know that you want the Internet to remain the engine of innovation and a public square for free speech.


August 11, 2010

Why we aren’t online

Pew Internet & American Life has a fascinating report on why Americans are not adopting broadband. Here are some of the highlights Pew is circulating:

  • Broadband adoption has slowed dramatically in the overall population, but growth among African-Americans was especially high last year.

  • By a 53%-41% margin, Americans say they do not believe that the spread of affordable broadband should be a major government priority. Contrary to what some might suspect, non-internet users are less likely than current users to say the government should place a high priority on the spread of high-speed connections.

  • In addition to their skepticism towards government efforts to promote widespread broadband adoption, the 21% of American adults who do not use the internet are not tied in any obvious way to online life and express little interest in going online.

  • They do not find online content relevant to their lives. Half (48%) of non-users cite issues relating to the relevance of online content as the main reason they do not go online.

  • They are largely not interested in going online. Just one in ten non-users say they would like to start using the internet in the future.

  • They are not comfortable using computers or the internet on their own. Six in ten non-users would need assistance getting online. Just one in five know enough about computers and technology to start using the internet on their own.


August 4, 2010

Akamai report: U.S. broadband speeds continue to fall behind

Akamai is in a unique position to judge actual broadband speeds around the world. Its latest “state of the Internet” report says that the U.S. is continuing to fall behind.

BroadbandBreakfast‘s takeaway is:

…Only 25% of the US has access to a connection above 5Mbps.

The fastest city in the world is Masan, South Korea which has an average Maximum Connection speed of 40.56Mbps; the first showing of the United States is at number 57 with Monterey Park, CA with a speed of 25.2Mbps.

When looking at the average connection speeds the United States again lags behind the rest of the world. Monterey Park, CA having the fastest connection again possesses the fastest average connection of just 7Mbps.

Within the United States, Delaware boasts the fastest average measured connection speed of 7.6Mbps, with the District of Columbia being the next fastest with a speed of 5.9Mbps. The slowest states in the nation are Alaska and New Jersey.

Doc Searls has a different take-away. He notes that Akamai only reports on download speeds, not uploads, because Akamai is among the set of institutions — which includes the U.S. access providers and, alas, too much of our government — that thinks the Net is primarily for the passive consumption of content. Doc has written about this here (and I recommend the discussion in the comments as well).


Meanwhile, as a single data point that proves nothing, but does let me vent: Our daughter is moving into an apartment in Brighton, Mass. The landlord has done a deal that gives Comcast exclusive rights to provide Internet access, freezing out Verizon and RCN, both of which are available next door. So, we’ve been looking at Comcast’s service plans. Herewith a rant:

My daughter only wants Net access, not TV or landline, but Comcast makes it as hard as possible to buy unbundled service. E.g., across the page of Comcast service offerings are four tabs: Net, TV, Phone, Bundles. We are on the Net page. But, guess what? All of the offers on that page turn out to be for bundles. There is functionally no way to buy unbundled Internet service from Comcast over the Net. Or, if there is, they’ve successfully hidden it. Well done!

The Comcast Web site is a mess. On the same page (here, but I had to go through an address verification to get there) the same offer — “Performance” — is listed twice, at different prices. Further, the service description notes “This special price is for customers who currently subscribe to Comcast Cable or Comcast Digital Voice® service,” but the Terms and Conditions make no mention of that. Further, there is no information about what the price would be for non-subscribers and people who don’t want to buy a bundle.

A long phone call revealed that the price for “Performance” is about $60/month for 15Mbps down and 3Mbps up. (Of course, those are maximums; there is no guarantee of what actual speeds will be if, say, there are “broadband hogs” — i.e., people who use more of what they’ve paid for than Comcast wants). My daughter would prefer to pay less for a lower broadband rate, but the only lower offer is for a tenth of the capacity — they call it “Economy” but they ought to rename it “The Email Package” — which is too little for her needs.

The landlord’s exclusivity deal has locked out competition, but Comcast’s pricing, packages, and anti-user Web site are its responsibility.


August 3, 2010

Tough love for Julius Genachowski

Harold Feld, whom I consider to be one of the essential commenters on FCC issues, has written a “tough love” post urging FCC Chair Julius Genachowski to take decisive action and lead the FCC. I agree. I think JG can do great things at the FCC. He should do them beginning now.

My hunch — and it’s nothing more than that — is that JG is trying to lead in the Obama-esque way: according each side its dignity and trying to find common ground. I support that when it has a possibility of working. I supported that even when it failed for Obama, because it was important to remind Americans that strong leadership doesn’t mean contemptuously disregarding those who disagree with you. But I also supported Obama when, after giving reconciliation a more than generous effort, he stood firm and acted.

It’s time for Genachowski to stand firm and act at the FCC. He has a vision for the Internet as a place where small voices speak and where new ideas get a fair chance. He understands the Internet as a potentially transformative force in culture, business, education, and democracy. He will not achieve his vision by compromising with those who view our Internet as their delivery channel for commercial content.

Jules Genachowski can have a transformative impact. It is far from too late for that. The Genachowski FCC can clear the way for the Internet — our Internet — to achieve its transformative possibilities for culture, business, education, democracy. I believe in Genachowski’s vision. I trust his intentions. I hope he will act.


July 29, 2010

GE pushes ahead with software-defined radio … good news for civilians, too?

In a press release that is barely comprehensible (or, quite possibly, totally incomprehensible) to one such as I, GE has announced a new generation of components that can be used for, among other things, software-defined radios. It is unclear to me whether this technology is designed for anything except military use, but …

Software-defined radios (SDRs) are not the next generation of transistor radios or boomboxes (ask your parents, kids). They are radios in the more primordial sense of being devices that can receive radio-wave signals. The radios you and I are used to are hard-wired to do one thing: Tune into specific frequencies and translate the radio signals into toe-tapping tunes or the blather of infuriating talk show hosts. SDRs can be programmed to do anything they want with any type of signal they can receive. For example, they might treat messages as, say, maps, or signals to turn on the porch light … or as Internet packets.

SDRs matter a lot if only because they promise an alternative to the current broadcast medium. The way it works now, the FCC divvies up spectrum (i.e., frequencies) for particular uses and sells much of it to particular broadcasters. So, your hard-wired radio responds to particular frequencies as carriers of acoustic information sent by known, assigned providers: 106.7 on your radio dial, or whatever. This is a highly inefficient use of spectrum, like dedicating particular lanes of a multi-lane highway to specific trucking companies. It’d be far more efficient if transmitters and receivers could intelligently negotiate, in real time, which frequencies they’re communicating on, switching to frequencies that are under-trafficked when a particular “lane” is jammed. If our radio receivers — not just our in-dash radios, but all devices that receive radio wave transmissions — were smart devices (SDRs), we could minimize the amount of spectrum we assign to a handful of highly-capitalized broadcasters. We would have more bandwidth than we could eat.
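
Just to make the lane-switching idea concrete, here is a toy sketch in Python. It has nothing to do with GE’s components or with any real spectrum-sharing protocol; the channel labels and the occupancy readings are invented for illustration.

```python
import random

# Toy dynamic frequency selection: a pretend radio measures how busy each
# candidate channel is and hops to the quietest one. The channel labels and
# occupancy readings are invented for this sketch.

CHANNELS = [1, 6, 11, 36, 40, 44]  # arbitrary labels, not real allocations

def sense_occupancy(channel: int) -> float:
    """Pretend to measure traffic on a channel (0.0 = idle, 1.0 = jammed)."""
    return random.random()  # a real SDR would measure received signal energy here

def pick_channel() -> int:
    """Pick the least-occupied channel, i.e. the emptiest 'lane'."""
    readings = {ch: round(sense_occupancy(ch), 2) for ch in CHANNELS}
    best = min(readings, key=readings.get)
    print(f"occupancy {readings} -> transmitting on channel {best}")
    return best

if __name__ == "__main__":
    # A real transmitter/receiver pair would renegotiate whenever the current
    # channel got crowded; here we just re-evaluate a few times.
    for _ in range(3):
        pick_channel()
```

The hard engineering is in the negotiation itself (both ends have to agree on the hop, and licensed incumbents have to be protected), but the basic move is just this: sense, compare, switch.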

So, I think it’s good news that GE is pushing ahead with this and is commercializing it … unless I’m misunderstanding their announcement, the technology’s uses, and GE’s intentions to commercialize it.


July 20, 2010

The competitive difference

Brough Turner has done some investigative work. Here’s the photo that summarizes it:

The following is an edited, paraphrased version of Brough’s comments on the mailing list I got this from (with Brough’s permission):

In the picture, the building on the right is 111 Huntington Avenue in Boston. It’s served by 7+ separate carriers, each of which owns its own fiber into the building. The price quoted on the slide is Cogent’s list price for a 3-year contract (lower prices and/or shorter terms are available to those who can wait for an end-of-quarter special).

The building on the left is 170 Huntington Avenue in Boston. There is Verizon fiber into this building, but apparently no other carrier has its own fiber there. The price quoted is what a friend’s IT department signed up for less than 45 days ago.

In both cases we are comparing “dedicated” services, i.e., services with a supposedly committed information rate. Yes, Verizon’s price per Mbps would be better if the customer had ordered 155 Mbps, but the disparity would still be outrageous.

Gotta love competition. Brough’s case study is one more data point confirming Yochai Benkler’s massive study of broadband around the world [pdf], which found that the countries that surpass the US in broadband price and penetration are generally ones with competitive broadband markets.

