June 2008 Archives

Cape Cod is the North Korea of Broadband


Last week I attended the OpenCape Summit, an annual gathering on Cape Cod focused on the growth of the OpenCape project, an effort to deploy alternative backhaul infrastructure that links up local emergency services and, hopefully, attracts new last-mile access providers to deliver residential broadband service.

Despite the many wonderful people I met and the interesting approach they're taking to solve their broadband problems, what stuck with me most was when the project's head, Dan Gallagher, proclaimed that Massachusetts is the North Korea of broadband.

As he said it, projected on the screen behind him was a satellite photo of the Korean peninsula showing South Korea lit up with lights while North Korea sat almost completely dark. It was a telling image of how Cape Cod residents feel about the broadband services they currently receive.

The problems with broadband on the Cape are manifold:

- The eastern part of the Cape has many areas with little to no coverage.

- In that area and elsewhere there's no redundancy, so if one cable's cut they're off the grid.

- The constraints of a shared network with limited capacity become painfully evident in the mid-afternoon when kids come home from school and Internet speeds for all residents on the Cape drop dramatically.

- The lack of sufficient upload capacity threatens local Internet innovators like Genevate, a Cape-Cod-based application developer who learned long ago not to locate servers on the Cape, but who still must live in fear of their connectivity going down, leaving them unable to conduct their business.

- And the two incumbent providers, Verizon and Comcast, have not shared any plans to upgrade their infrastructure on the Cape any time soon.

What's remarkable about all this is not the limitations themselves but the fact that it's the Cape that is suffering from them.

Time and time again when we talk about the challenges of broadband deployment we focus on rural and urban areas. But the Cape is definitely not urban and only parts of it are rural.

When incumbents defend their decisions of where to invest in increasing capacity they often focus on demographics and how they need to focus on areas where there are plenty of customers willing and able to pay for service. But the Cape does not exactly lack for people capable of paying for better service as evidenced by the row upon row of million-dollar homes.

And normally when an incumbent tries saying an area already has sufficient coverage, it can be hard to refute those claims if the community hasn't already come together to compare notes and realize just how bad off they are. But on the Cape, they know how desperate their situation is, which is why they're now pursuing alternatives.

So here's an area that's more suburban than rural, richer than poor, and that desperately wants better connectivity but can't get the incumbents to step up and deliver.

I wanted to point this out lest we continue to talk in absolutes about the problem of broadband deployment. We're kidding ourselves if we try and say it's a rural or an urban problem, that it's a poor vs. rich problem, because as evidenced by Cape Cod, that just isn't the case.

And by accepting this, hopefully we can find a way to break out of the Dark Ages and begin to light up areas like Cape Cod so that we no longer feel inclined to equate any part of America with somewhere like North Korea, whose populace continues to get left behind in the fast-moving Digital Age of Information.

I've been up in Cape Cod for the OpenCape Summit (which I'll write about soon), but for now, despite all the wonderful things I learned at that event, what's sticking out most in my mind happened afterward, as I sat down to catch up with my 18-year-old cousin, Lydia.

Sometimes there seems to be a sense that anyone under the age of 25 knows everything there is to know about broadband, that they're all hardwired with an inherently deeper sense of the what, how, and why of the Internet.

But then there's my cousin. Last night while chatting, she made a startling admission: she didn't know what "broadband" meant.

Now, that's not entirely true. As we discussed it further, she acknowledged that she knew broadband meant faster and that it came from the cable company.

And when I followed up by asking her if she used the Internet, she quickly confirmed that while she may not be a power user she does partake in some of the basics of an online existence as a young adult, like watching videos on YouTube and creating a Facebook page.

But I'm totally fascinated by the fact she didn't know what broadband was.

I had to take her through the explanation that the Internet is a bunch of interconnected fiber optic networks and that broadband encompasses the access networks that allow us to reach the Internet. That made sense to her.

Then I tried tackling the reasons why I support full fiber deployments. When I first started talking about bringing the full power of the Internet to her front door I could tell that didn't mean a lot, but when I began talking about how all the world's Internet traffic can be delivered today over a single strand of fiber, her interest perked up. And what really grabbed her attention was when I started covering how all that bandwidth makes possible those next-gen high-bandwidth applications that once seemed relegated to the realm of science fiction, things like communicating via hi-def holograms.

By the end I felt like I'd gotten her to see the light: not enough that she'll become a champion for broadband, but at least enough that the next time she hears the word she'll know what it means, and hopefully the next time she encounters someone talking about fiber she'll be more predisposed than before to supporting the concept.

But for now, stories like this should show us that we can't make blanket assumptions like "all young people know broadband," just as we shouldn't assume that all old people don't.

And we can't forget that for the vast, vast majority of Internet users today, "broadband" doesn't represent the opportunity to revolutionize society through greater capacity, it just means faster Internet.

The Need for Violent Agreement


Here's the thing I most took away from Jim Baller's event on Monday:

At one point near the end of his speech, he pronounced, with arms lifted above his head and passion filling his voice, that what we need more than anything is an end to petty in-fighting and the beginning of an effort to create "violent agreement" on what our broadband goals should be and the methods to achieve them.

When he said this I had a strong physical reaction: both the urge to nod so vigorously in agreement that I risked injury, and the tingle one gets when the right combination of message, timing, and opportunity falls into place to give you the sense that now is a time when we can at least attempt to effect real change.

As Baller mentioned in his speech and report, accomplishing big goals in broadband is going to take a tremendous amount of effort, time, money, and political will. Wiring an entire country with fiber, covering everywhere with wireless, and setting in place policies that encourage people to embrace the possibilities of broadband are not simple tasks for a smaller nation let alone one the size of America.

The only possible way we'll ever get anywhere near achieving our goals is by marshaling all of our resources and uniting them towards the common goal of doing what's best for America through broadband.

Yet we cannot ignore the reality that we're a long way away from anything resembling "violent agreement" on almost all of the issues that matter.

You'd think at this point everyone would at least agree that broadband is essential, but that's simply not the case. There are many local, state, and federal officials who don't see the same need, or at least the same urgency, as national leaders like Baller and Commissioner Copps. I don't blame them altogether, as for many the issues seem too complicated: they don't know the terms, so they can't understand the concepts, which makes it hard to get on board with the vigor required to constitute "violent agreement." But that's not entirely their fault. So long as they're willing to listen, it's our responsibility to help them understand.

You'd think we might at least be able to unite under the banner of patriotism behind a goal of nothing less than being number one in the availability and use of broadband in the world, but there again we're a long way away from anything resembling "violent agreement." Our incumbent private providers seem satisfied delivering middle-of-the-road speeds and prices, with only Verizon showing any real urgency to deliver the capacity needed to reestablish our position as a leader in the global digital economy. And many others focus too much on the cost and complexity of the work that needs to be done and not enough on its promise and necessity.

You'd think at a minimum we could agree on where we currently stand, but in reality the gap here is daunting even if all you're looking for is mild agreement. This is best exemplified by the contrast between the speakers at Baller's event, who lamented where we're at and exhorted the need to move forward urgently to address the situation, and the speakers at NXTcomm, who brushed off the idea that there are problems and that our current international standing is untenable.

So we've got a need for violent agreement among everyone to achieve the big goals that a great country like America should be setting for itself, but we've got a reality where we're a long way away from agreeing that there's a problem, let alone getting everyone headed towards the same solution.

Getting out of this quagmire won't be easy, but it starts by establishing a clear vision for what's possible and then through education and cajoling beginning to increase our ranks of broadband believers who know that the future is now and if we don't get moving we're going to miss out.

And a great place to start this process is Jim Baller's report.

You can find it here, though be sure you set aside an afternoon before sitting down to read it. If you do so you'll be well rewarded as this report is basically a summation of all the research that preceded it, tied together into a coherent picture of where we're at alongside an argument for where we should be going.

Also, if you want to watch video from Monday's event, it's available here.

On this latest edition of App-Rising.com's VidChat I sat down with Eric Klinker, CTO of BitTorrent.

I've been eager to engage them in a discussion because, in addition to being responsible for driving a whole lot of demand for bandwidth, they're also a company whose business intersects with the heart of a lot of telecom policy discussions.

But too often they're talked about instead of heard from, and as a result I often encounter misconceptions about what they do. I worry that this lack of information gets in the way of informed policy decisions.

So here's an opportunity to get them engaged with the Great Broadband Debate, starting with busting through some of these misconceptions about piracy, bandwidth hogs, and net neutrality.

Enjoy:

Now some followup:

- If we didn't do enough to explain what P2P is and how it works, here's a Wikipedia article that helps flesh things out. And if you want to learn more about BitTorrent in particular, here's the Wikipedia article about it.

- A recent survey by Sandvine, makers of technology that allows network operators to manage bandwidth hogs, showed that P2P is responsible for 44% of Internet traffic. But here's something even more eye-opening: upstream P2P consumes more than twice as much bandwidth as all other traffic combined. (More on this soon as it deserves a post of its own.)

- Eric's point about BitTorrent being a protocol like HTTP is an important one. It alludes to the fact that it isn't just another application; it's a fundamental part of the Internet. (A tiny sketch at the end of these notes shows what being a protocol looks like in practice.)

- If you're interested in trying out a BitTorrent client, here's a list of available ones.

- In our piracy discussion, Eric makes the strong argument that the problem with piracy isn't the protocol, it's what publishers choose to do with it. Just because people also use HTTP for piracy doesn't mean we should go on a witch hunt against its use; instead the focus needs to be on influencing how people use it.

- Also important to note is that BitTorrent is being used to legally distribute all sorts of content. As Eric mentions, its users range from small artists needing an economical way to get their content out there, to large government institutions like NASA moving around large datasets, to big media companies who recognize that P2P can be a great way to reach their audience. Plus, he mentions that the BitTorrent protocol is beginning to get embedded into other apps, like the popular online game World of Warcraft, which uses it to distribute software updates to its users.

- In talking about their users as "bandwidth hogs," Eric voiced a sentiment similar to the one I shared recently in this post: that today's bandwidth hog is tomorrow's average user.

- I also loved his comment on how what was considered a bandwidth hog in 1987 is probably laughable today. And I concur: the thought did make me laugh.

- Then we got into BitTorrent's stance on net neutrality, which is basically that they support the spirit of NN, acknowledging that BitTorrent wouldn't have been created without the principles of an open network in place.

But when I asked him whether BitTorrent needs NN legislation in order to protect its business interests and make its model work, he deferred, first claiming that as a technologist the world of legislation and regulation is a bit beyond him, but then striking a cautionary note.

He postulated that if legislation is the right approach, the question becomes whether or not it can keep up with the pace of technological change. He then went on to state that he believes it to be preferable to let market forces keep working. He bases this stance on his belief that consumers want an open network, so if there is sufficient competition the market will play out accordingly.

So all of this suggests that despite BitTorrent's name being used to justify the need for NN legislation, BitTorrent itself has a more nuanced stance that embraces the spirit of NN but questions the need for legislation.

- Finally, I found the last bit of this interview very interesting, when Eric acknowledged that not all traffic is created equal and shared that BitTorrent has been hard at work developing congestion control technology that allows P2P and more latency-sensitive applications to coexist in harmony.

He went on to share that he sees the future not in making decisions in the network about which bits should have priority over others, but in empowering individual users to decide which bits are more important to them.
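To make that last idea a bit more concrete, here's a toy sketch of the delay-based approach Eric describes. To be clear, this is my own illustration, not BitTorrent's actual code, and the target delay and gain values are made up: the point is simply that a bulk transfer can watch queuing delay and back off before it crowds out latency-sensitive traffic.

```python
# Toy delay-based rate controller (illustration only, not BitTorrent's code).
# The sender raises its rate while queuing delay stays below a target and
# backs off as soon as delay builds, yielding to latency-sensitive traffic.

TARGET_QUEUING_DELAY_MS = 100.0  # assumed target; a real system would tune this
GAIN = 0.2                       # how strongly the rate reacts on each update

def update_rate(rate_kbps, base_delay_ms, measured_delay_ms):
    """Adjust the send rate based on how far queuing delay is from the target."""
    queuing_delay = max(measured_delay_ms - base_delay_ms, 0.0)
    # Positive when there's room to speed up, negative when it's time to back off.
    off_target = (TARGET_QUEUING_DELAY_MS - queuing_delay) / TARGET_QUEUING_DELAY_MS
    return max(rate_kbps * (1.0 + GAIN * off_target), 10.0)

rate = 1000.0
base = 40.0  # lowest one-way delay seen so far, i.e. the "empty queue" delay
for measured in (50.0, 70.0, 180.0, 260.0, 120.0):  # simulated delays in ms
    rate = update_rate(rate, base, measured)
    print(f"one-way delay {measured:5.0f} ms -> send at {rate:7.1f} kbps")
```

And going back to Eric's earlier point about BitTorrent being a protocol like HTTP: one way to see what that means is to look at bencoding, the simple serialization format used for .torrent files and tracker responses. The sketch below is my own minimal illustration (the sample dictionary is hypothetical), but it shows that, much like HTTP's plain-text headers, the protocol is just an agreed-upon way of formatting data that any client can implement.

```python
# Minimal bencoding encoder (my own sketch; the sample data is hypothetical).

def bencode(value):
    """Encode ints, byte strings, lists, and dicts using the bencoding rules."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        keys = sorted(value)  # the format requires dictionary keys in sorted order
        return b"d" + b"".join(bencode(k) + bencode(value[k]) for k in keys) + b"e"
    raise TypeError(f"cannot bencode {type(value).__name__}")

print(bencode({b"interval": 1800, b"peer count": 42}))
# b'd8:intervali1800e10:peer counti42ee'
```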

Inside the Minds of FCC Commissioners


One of the more thrilling parts of attending Jim Baller's event yesterday was to hear back-to-back presentations by FCC Commissioners Copps and Adelstein.

Up first, Commissioner Copps made an impassioned plea about the need for broadband now. He went on to suggest a number of interesting proposals:

- Creating a White House broadband czar. One of his main themes is that accomplishing these goals is going to take a lot of work and a lot of coordination, which can best be provided by the White House, as it effectively did when preparing for Y2K.

- Making sure that every low-income housing project has broadband. He used this as an example of how we need to make sure that every part of government is taking a lean-forward approach in pushing the availability and use of broadband.

- Combining the connectivity in libraries with wireless mesh networks to increase broadband deployment. While I'm not sure how feasible this is given that most libraries don't have enough capacity for their own users let alone those in the community around them, it does suggest that he's eager to think outside the box to find new solutions, which is likely what we're going to need if we're ever going to reach 100% deployment.

He then focused on how to improve the FCC.

First and foremost he spoke strongly in favor of redirecting the Universal Service Fund towards supporting the deployment of broadband, which is such an obviously good idea that I'm still shocked that we need to have a debate about this.

Secondly, he lamented how the FCC isn't doing everything it could to serve as a national clearinghouse of information, collecting best practices, case studies, and technology reports so that individual communities and states don't have to learn everything on their own, as is largely the case now.

And he finished with a passionate plea for us all to approach these issues with a sense of urgency: broadband is a revolution, and like any revolution it will have winners and losers, and we can't afford to lose.

He then warned against allowing ourselves to have this debate hijacked by the age-old liberal vs. conservative, regulation as a good thing or bad thing debate, imploring everyone to "grow up" and recognize that on these issues we must find a way to all work together.

Commissioner Adelstein picked up on a similar theme, advocating that it's not just the White House we need leadership from but all levels of government, from the local on up.

He also firmly stated his belief that we need to make sure that whatever we do benefits everyone, and that broadband is undeniably essential to productivity. He went on to give the interesting example of how, through broadband, we can flip the outsourcing paradigm on its head: instead of having jobs leave rural areas for overseas, we can insource jobs and bring employment back to these communities.

I couldn't have been more excited when he cut through the debate around America's international broadband ranking with the simple statement that there shouldn't even be a debate: the US needs to be number one.

I also really appreciated his pragmatic stance that we can talk until we're blue in the face, but that we need to start working towards implementing real action, and that when setting forth on this journey we need to establish clear benchmarks to help gauge our progress.

At one point, he seemed to contradict himself slightly. While he clearly voiced his belief that we need to rely on the private sector to deploy these networks, at the same time earlier he admitted that the profits a private company generates don't reflect the full societal benefits broadband brings about. How we resolve this tension was not addressed.

Two other good points he made are that there's no point in having broadband if you don't have a computer, which highlights the need to focus on adoption and use alongside deployment, and that the metric he looks at to gauge America's global competitiveness is price per megabit, which I also think is the truest indicator of how we're doing.
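For anyone unfamiliar with the metric, it's simple arithmetic: divide a plan's monthly price by its advertised speed. Here's a quick sketch; the plans and prices below are hypothetical, purely to show the calculation.

```python
# Price-per-megabit calculation; the plans below are hypothetical examples.
plans = {
    "1.5 Mbps DSL at $30/month": (30.0, 1.5),
    "16 Mbps cable at $55/month": (55.0, 16.0),
    "50 Mbps fiber at $65/month": (65.0, 50.0),
}

for name, (price, mbps) in plans.items():
    print(f"{name}: ${price / mbps:.2f} per megabit per month")
```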

He seemed to agree with Copps's exhortations to transform the FCC into more of a national resource for communities and states, pointing out how unfortunate it is that states have largely been left on their own to figure these complex issues out for themselves.

And finally, he put out a call to have a National Broadband Summit, which I think could be a highly effective way to move this debate forward. Let's get everyone in the same room and start hammering out where we're going and how we're going to get there. If we do this right we can take advantage of the opportunity to not only frame the debate over national broadband policy but hopefully we can also cobble together the start of specific policies that can be presented to the next President in such a way to allow him to take the ball and run with it.

The last thought I'll share from their presentations was their reaction to the question I asked at the end: in terms of what's best for America, are we better off trying to encourage competition between last mile access technologies, or establishing a national common infrastructure on which competition between services can thrive?

Unfortunately, I didn't get any great answers to it. Admittedly, it's a complex question, but their responses dealt mainly with the need to increase competition, how we've paid a price for not having more competition, and how wireless will be able to enable us to finally have that third party that can bring about true competition.

I think they were saying that they support increasing competition between last mile access technologies, but I'm hopeful they'll continue to ponder this question, as I believe it's at the heart of what we need to determine as we figure out how to move forward and create the best, most robust, most competitive broadband marketplace possible.

100Mbps Nation Here We Come - Who's Got the Map?


One of the most resounding messages driven home at yesterday's event that introduced Jim Baller's new e-NC report is the growing consensus that our goal in spurring broadband deployment should be to achieve a 100Mbps Nation.

Setting that goal is one of the primary recommendations of Jim's paper.

At the event, EDUCAUSE president and CEO Diana Oblinger spoke, reminding us of EDUCAUSE's recent report that sets the same goal.

Also speaking was Stan Fendley, who lobbies for Corning and was representing the FTTH Council. He cited the fact that both the Senate and the House have had resolutions introduced declaring that reaching a 100Mbps Nation should be a top national priority.

The reason I find this exciting isn't so much the speed of 100Mbps but the fact that we're starting to talk specifics about where we want to be in the next few years. Far too often we seem to talk about policy in action-oriented terms without a destination in mind, but if we don't know where we're headed, how are we supposed to get there?

This push to a 100Mbps Nation isn't limited to ivory tower discussions inside the Beltway and on Capitol Hill, it's already happening out in the real world.

Verizon's FiOS network has proven capable of delivering 100Mbps to the home, and they seem likely to start offering service soon at that speed.

Comcast has claimed that their networks will soon be able to reach 100Mbps, though it'll be extremely asymmetric and still shared.

Some municipal fiber networks, like the one being put in by Lafayette Utilities System, will feature 100Mbps intranets, while others, like Jackson Energy Authority's, have the capacity to turn up speeds of 100Mbps if only they could get cheaper access to the Internet.

And 100Mbps isn't the be all, end all, as the small greenfield fiber deployer Paxio currently offers its customers speeds up to 1Gbps.

So that's all great. Lots of people talking about bringing 100Mbps to the home, but there's also something missing: how are we going to get 100Mbps to every home?

In areas where the market is working, perhaps we don't need to do anything. If competition is sufficient to incentivize the deployment of 100Mbps, then so be it, we've got nothing to worry about.

But what about everywhere else? Will private dollars get us there or do we need public? If public, how should they be spent: subsidizing private providers, building open access networks, and/or becoming competitors to private industry?

And don't think this is just a rural issue. What about the communities where FiOS is being deployed to some but not all of the homes? How do we reach that goal of 100% deployment?

Even in communities that have competition from private providers or municipal networks, there's still the little problem of making 100Mbps affordable to consumers. I, for one, don't get all that excited about 100Mbps to every home if 100Mbps service still costs hundreds of dollars a month.

Again, it's terrific that we're starting to agree on where we're going, but now we face the real challenge: how are we going to get there?

Penetration of Internet Access Not Enough


I just got back from a thrilling event up on the Hill that marked the introduction of a new paper commissioned by the e-NC Authority and put together by Jim Baller and Casey Lide. I'll have lots more coverage of that event and analysis of Jim's paper later today and tomorrow as there was an unbelievable number of interesting things said, but for now there's one point I want to share.

In his remarks, Jim referred to the SETDA report that I wrote about this morning. He made a point that I totally missed in my initial reading of it.

In the report they state that 98% of schools have Internet access, but that that isn't enough.

That shows remarkable foresight on the part of the SETDA. They recognize that simply having access is only one part of the equation, and that what's as important is the need for a strategy to not only reach 100% of schools but to get in place networks that can support speeds that are exponentially faster than the current standard of T-1s.

They don't spend a lot of time proclaiming "Mission Accomplished" because they realize that the goals for their initial mission of connecting schools have expanded.

It's not enough to set one goal a decade ago, work towards accomplishing it, and then wipe our hands and walk away. These goals must evolve over time alongside the demands that new applications are putting on network capacity.

The challenge, of course, is does that mean we'll never reach a state where we can be satisfied with the connectivity we have? To be honest, I don't know. But I do know that if we accept what we have today -- access that is generally universal with speeds that are somewhat tolerable -- that we'll miss out on the opportunity to realign our goals with the challenges and opportunities of the 21st century.

And in schools in particular, we can not allow that to happen.

Meeting the Demands of 21st Century Education


Here's a great new white paper from the State Educational Technology Directors Association.

In it they talk about how important broadband is to education and how much broadband is needed in schools to support the demands and opportunities of 21st century education.

One startling note found herein is that most schools today are getting by with speeds equivalent to a T-1. That means roughly 1.5Mbps. And that's for an entire school.

The high school I went to had 2500 kids in it. I don't think I need to do the math to show how grossly insufficient that kind of speed is for that many users. Of course, the biggest strain comes when lots of people want to go online simultaneously, though unfortunately that's not necessarily a problem in every school, as many are also lacking in other areas, like the availability of computers.

But this report focuses primarily on connectivity, and it has some interesting things to say about the goals that school districts should be setting:

- In the next 2-3 years, schools should have 10Mbps to the Internet per 1000 students, and a 100Mbps intranet between the schools within a district.

- In the next 5-7 years, schools should have 100Mbps to the Internet per 1000 students, and access to a 1Gbps intranet that connects schools together.
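To put those targets in perspective against the T-1 status quo, here's a quick back-of-the-envelope sketch; the 2500-student school is just my old high school used as a hypothetical example, while the per-1000-student goal figures come straight from the report.

```python
# Back-of-the-envelope comparison: a T-1 line versus the SETDA targets,
# using a hypothetical 2500-student school as the example.
STUDENTS = 2500

t1_mbps = 1.5
near_term_mbps = 10 * (STUDENTS / 1000)   # 10 Mbps per 1000 students (2-3 years)
long_term_mbps = 100 * (STUDENTS / 1000)  # 100 Mbps per 1000 students (5-7 years)

def per_student_kbps(total_mbps, students=STUDENTS):
    """Average bandwidth per student if everyone is online at once, in kbps."""
    return total_mbps * 1000 / students

print(f"T-1 today:       {t1_mbps:6.1f} Mbps total, {per_student_kbps(t1_mbps):6.2f} kbps per student")
print(f"2-3 year target: {near_term_mbps:6.1f} Mbps total, {per_student_kbps(near_term_mbps):6.2f} kbps per student")
print(f"5-7 year target: {long_term_mbps:6.1f} Mbps total, {per_student_kbps(long_term_mbps):6.2f} kbps per student")
```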

There are many things to like about these goals.

First off, I love that they're setting near-term goals. Two to three years is not a lot of time to do anything related to the deployment of broadband, so hopefully these goals will help instill a sense of urgency in school districts around the country to get something done sooner rather than later.

Secondly, I'm extremely excited that they're acknowledging both the need for and possibilities of high-speed intranets that connect schools together. If we're going to make things like sharing teachers with specialized know-how between schools via videoconferencing a reality, they can't be constrained by bandwidth. It just won't work if a videoconferencing stream has to fight for bandwidth with what everyone else is trying to do over the Internet. Plus, having 100Mbps or more between schools opens up a lot of possibilities for experimentation, which will hopefully lead to innovation.

But I do have some reservations, mainly over how we actually accomplish these goals.

While the report does a tremendous job laying out the issues at stake, providing some case studies, and offering suggestions at the end for how to pursue these initiatives, I'm not sure there's anything in it that clearly shows a school, "this is how you get all this connectivity."

Now, I'm not blaming them for this at all. One of the biggest challenges any public entity faces in getting bigger, better broadband is that there does not yet seem to be one silver-bullet model for how to get this done. And many will argue that there never will be because the challenges and opportunities of deploying new networks are unique to each and every community.

But if we're going to take these goals and transform them into a key component of a national broadband strategy, then there seems to be a need to establish a highly proactive approach to getting everyone on board and headed in the right direction.

We need to make sure that the early adopters who already have networks are using them to their utmost, so they can serve as the shining city on a hill that guides others who aspire to get there.

We must help clear the way for any district ready to take the plunge so that nothing slows them down from achieving their goals.

And we need to make a concerted effort to educate, equip, and inspire those districts that are dragging their feet, whether it be because of ignorance, disinterest, arrogance, or the inability to muster the resources and know-how to make this happen.

If ever there were a time that we needed a national broadband policy, it's for getting schools wired. I say this because while you can argue that perhaps it's not our responsibility to push late adopting consumers and businesses to embrace broadband if they're not doing so already, we can not allow any school administrators to hold back the availability of broadband to children in schools.

Because when it comes to empowering students, there truly should be no child left behind in having broadband enable richer and more dynamic learning environments.

Hey Mr. Obama - What About Fiber?


Earlier this week I wrote about how the next president has to be the first broadband president, followed by a brief analysis that showed why my vote is going to Barack Obama, though almost by default given McCain's utter lack of awareness about broadband.

In that post, though, I criticized Obama for not being vocal enough about broadband. Well, he must've been listening as he just said something about it in a speech at Kettering University. Let's take a look at the key line:

"So as president, I will set a simple goal: Every American should have the highest-speed broadband access, no matter where you live or how much money you have. We need to connect libraries and schools and hospitals."

There's endless debate about how much bandwidth we really need, how much capacity we can squeeze into copper or over the airwaves, and how important competition between multiple pipes is.

But there is one thing that is absolutely and totally undeniable: if your goal is to "have the highest-speed broadband access" then the means of getting there has to be fiber.

You can argue that copper and wireless will be able to realize tremendous speeds that will support all of our demand. But there's no arguing the fact that nothing has as much capacity as fiber. And it's really not even close.

Copper and wireless are touting their ability to hit 100Mbps. Fiber's already topping 100Gbps, or a thousand times more capacity.

So Mr. Obama, if you're going to call for "the highest-speed broadband access" can you please stop pussyfooting around like just about everyone else and start acknowledging that that means getting a fiber strand strung to every building in America?

I'm proud to introduce App-Rising.com's latest VidChat. This time I sat down with Christopher Mitchell, director of the Telecommunications as Commons project for the Institute for Local Self-Reliance.

In this episode we dive into some of the challenges, misconceptions, and opportunities of municipal broadband, and in particular fiber. Watch on for what I hope you will find to be a fascinating discussion.

Now for the followup:

- Here's the website for the Institute.

- I highly recommend checking out this report Christopher wrote and released back in January about municipal fiber and wireless options. It provides a wealth of information.

- When Chris made the point that "owners make decisions" when it comes to network deployment, management, and so on, I think he really hit on something important. However we decide to best pursue our fiber future, we can't forget that whoever owns the pipes controls the pipes.

- That said, when he commented on how cities have limited power to regulate these networks, I found myself somewhat torn. On the one hand, this is certainly not a good thing when it comes to allowing communities to determine their own future and what's best for them. On the other, I know there are advantages to elevating some of the levers of government to the state and federal level, since otherwise network operators are forced to reinvent the wheel with each individual community, which isn't necessarily a bad thing unless it adds inefficiency to the system that slows down deployment and investment.

- When it comes to the size of the investment needed for fiber, I loved Chris's observation that the billion dollars that Minneapolis is putting into building a light rail system is probably enough money to wire the whole city. That's proof positive that it's not that we can't afford fiber, it's that we aren't willing to make it a priority.

- Another worthwhile observation Chris makes when questioned about government's ability to drive innovation moving forward is that if you have a full fiber network in place, innovation will happen.

- Love that he hit on the point that anyone who's convinced themselves that wireless is a long-term replacement for fiber is probably kidding themselves.

- It was interesting to hear his passion against what he sees as the misinformation being disseminated against muni-broadband. I will say, the general sense out there is that muni-broadband is an extremely risky venture that has led many communities to failure. And to be quite frank, there are more muni-builds I can think of whose success is in doubt than there are ones that jump out as unqualified successes. But even still, I agree with Chris that the FTTH Council's numbers are startling. I think just about any network operator would kill for take rates above 50% within one to four years of deployment. So they obviously must be doing something right.

In the end, I'm still in the same position I was going into this call.

I worry about the expense, complexity, and ramifications of municipal broadband on the overall marketplace, especially as it relates to private operators.

But at the same time I don't see any way we're going to be able to wire the entire country with fiber without some involvement of government dollars.

And I can't see how we can stand in the way of any community willing to make the significant investment in upgrading their infrastructure and their future.

No One Wants Fiber to Every Home


So anyone who reads App-Rising.com on a regular basis knows that I'm a fan of full fiber networks. I just can't get around the fact that no other access technology has the capacity of fiber optics, and ultimately I believe we will have sufficient demand for high bandwidth applications to not only justify but require the biggest possible pipes into every home.

Unfortunately, when I suggest that the goal is realizing a future where fiber touches every home as quickly as possible, too often I get met with disbelief about its feasibility, doubt over its necessity, and, most troubling of all, lack of support from the very people responsible for helping achieve that goal.

The feasibility camp has a point: deploying fiber is expensive, some argue too expensive for rural areas. But that's simply not true. I don't know the specific stats, but I'm guessing that if you take Verizon's FiOS out of the picture, rural areas are getting as much fiber as anywhere else, if not more. Much of this has to do with desperation; these areas are dying off and have to do something. Regardless of why it's happening, the undeniable reality is that it is happening. And with regard to the overall cost, I don't see how we can shirk our responsibilities to the future of this country by not investing a couple hundred billion dollars into something that can help us realize trillions of dollars in savings and growth. It's not that we can't do it, it's that we won't.

The necessity camp is one that frustrates me greatly. Despite these being learned people, they've bought into the hype of a wireless world, the idea that eventually we won't even need wires. Why invest in fiber if 100Mbps wireless is just around the corner? To that I say: wireless is not yet proven at 100Mbps, let alone anything faster, whereas fiber's already capable of handling 1Gbps+, and you can't have superfast wireless without a lot more fiber, as it's the fiber that will make those speeds possible. And to anyone who says copper's sufficient, I'll simply say that when the day comes that 3D TV and HD videocalls become feasible for consumers, I want a pipe big enough to support those high-bandwidth applications, and from everything I've seen that means fiber.

But the main inspiration for this post is the camp that's responsible for our broadband infrastructure but that, for various reasons, has not yet embraced the goal of 100% fiber deployment.

One faction of this camp has obvious reasons not to support it: they want to continue making money off their initial investment in a copper infrastructure. Cable companies and many telcos don't want to see fiber laid to every home because it's not likely that they're the ones who'll be laying it, so if fiber gets laid that means not only do they have to face a new competitor but that competitor will be able to offer a vastly superior network. Making matters worse, now everyone's talking up their "fiber" networks, which muddles the message of why we need the ultimate goal to be full fiber networks.

But there are two other significant groups that you'd think would be supportive of a full fiber nationwide deployment, but aren't necessarily so.

The first are equipment manufacturers. Without naming names, I've had conversations with at least one major manufacturer of fiber optic cable who admitted that they'd prefer a slow and steady deployment of fiber to the home rather than a large upswell in demand. For them, it's simple economics. They worry that if they increase their manufacturing capacity to support a nationwide deployment, what happens once everyone gets their fiber? They may be left with far more capacity to build than there is demand to buy. So while you'd think a manufacturer of fiber optic cable would be one of the biggest supporters of the speedy deployment of fiber everywhere, in actuality, while they do ultimately want fiber everywhere, they'd prefer it take us a while to get there.

Then there's Verizon. I've long applauded their efforts to deploy fiber to the home through FiOS, but I've also been critical of their decision to cherry pick the most well-off neighborhoods instead of covering entire cities. But there's another twist to this that's not often talked about: they don't want too many people demanding FiOS.

My understanding of their situation is that they're deploying FiOS just about as fast as they can. I've even heard at various points that their demand is so great it's gobbling up all the available hardware and manpower for deploying fiber in some parts of the country. Because they can't deploy any faster than they already are, they don't necessarily want everyone in their footprint to start demanding it. In fact, if all the mayors in all the cities in which they operate were to wake up and realize how important having fiber is to the future health and viability of their communities, that'd be one of the worst things that could happen to Verizon as then they'd have to deal with managing expectations in a major way since they're years from deploying in many communities.

While they undoubtedly want everyone in a FiOS community to know about and want the product, elsewhere they'd rather people not go crazy demanding to have their networks upgraded so they can take their time getting to them. Plus, Verizon still seems to have no plans to support 100% buildouts across their footprint.

So here we are. Any network operator not deploying fiber doesn't think we need it. The biggest network operator deploying it doesn't want too many people to demand it and has no plans for 100% deployment. And the equipment manufacturers would rather the deployment of fiber to the home takes its time rather than happening as quickly as possible. Plus many learned people don't see the need for fiber with high-speed wireless on the way.

What does all this mean? For those of us who believe that fiber is our future, it means we've got some work to do. We're all out there trying to convince people that fiber is where it's at, but in some ways we're all alone. And that worries me. How can we create a groundswell of demand for fiber without everyone working together? And when we create that demand, who's going to help satiate it?

It's a question for which I don't have an immediate answer. All I can say for now is that I, for one, do believe in the need for fiber. Others may doubt its feasibility and necessity, but I want us to realize all that a fiber-enabled future can make possible. And in order to achieve that, we're going to have to find a way to flip the script and open the floodgates of demand for fiber. Otherwise, the deployment of fiber will continue to lag behind not only other countries but behind what America is capable of once we set our mind to it.

Who wants fiber? I do!

The discussion over metered bandwidth has officially spilled over into the mainstream media as evidenced by this New York Times article about charging by the byte.

Rather than rehashing the details of what's covered therein, I want to pull out a single quote they pulled from a recent Cisco report: "Today's 'bandwidth hog' is tomorrow's average user."

I believe that this is one of the most important realities in broadband as it alludes to so many aspects of this overall debate.

First off, heavy users aren't "bandwidth hogs," they're just early adopters.

Secondly, all it takes for an average user to become a "bandwidth hog" is for them to start using the Internet and/or more bandwidth-intensive applications, and that's certainly not a bad thing.

Third, the focus shouldn't be on demonizing heavy users today but instead making sure that the networks being built can support the heavy users of tomorrow, which will hopefully be everyone.

More and more I'm finding myself irritated by this term: "bandwidth hog."

It implies that those customers who have adopted the Internet first are greedily gobbling up more than their fair share of bandwidth, but is it really their fault that they love the Internet so much?

Until recently, nothing they were doing was against the terms of their contract for broadband services, so it's not like they're criminals, they just like to use the Internet.

I sometimes worry that using inflammatory terms like "bandwidth hog" is part of what's preventing us from having a more robust dialog about the challenges we face in expanding and enhancing our Internet infrastructure in this country.

Even if it wasn't broadband providers' intent to demonize their heaviest users, they have to be more careful, especially the bigger ones, as everyone already expects them to be evil and to not have their customers' interests at heart, even though in reality one of the biggest things driving this witch hunt is the need to ensure high quality service is received by all.

Of course broadband providers are first and foremost interested in making money, but they also have to ensure they have enough capital to support building out capacity and that they're able to provide a consistent service to all that isn't degraded by the few.

But instead of saying, "if only these early adopters weren't using our service so much," I wish they'd go down the path of saying, "we are facing legitimate issues updating our business model and upgrading our networks to support the heaviest users, especially during this time when there's such a large gap between light users and power users, and because of this we must consider all options, keeping in mind first and always the interests of all our customers."

There...isn't that better?

My final thought on this is that we should never, ever, ever tell someone they're bad for using the Internet too much. What we need to do is discover a way whereby that usage can be encouraged as much as possible.

Because for me, it's not a matter of if the exaflood is coming or not, but instead what can we be doing to bring it about ASAP. And with this mindset the goal becomes not "how do we manage the heaviest users" but instead "how do we increase capacity to support a future where everyone is a 'bandwidth hog.'"

It's inescapable that at least part of the reason the US lags behind other countries in the deployment and adoption of broadband has been the lack of leadership in the White House over the last eight years.

While one can argue that the Bush Administration's hands-off policies didn't hurt anything given the growth that has been realized, it's also clear that they did very little to proactively spur the kind of growth that's been realized in countries like South Korea, Japan, and Sweden.

To some degree I can almost forgive them. When Bush took office in 2001, one could argue that, despite its promise, many perceived the Internet as a passing fad because of the bursting of the dot-com bubble. A few years later, broadband was being deployed widely and applications like BitTorrent and YouTube were emerging, suggesting that the Internet was doing just fine without any need for leadership from our country's highest office.

But today, things have changed dramatically. We live in a time where we can no longer afford to have a void in leadership in the White House on broadband.

Broadband's available almost everywhere, but it's going to be a serious challenge reaching those last homes without it.

Broadband's faster than ever, but we're lagging behind other countries with no real plan for how to catch up.

Broadband's penetrated half of American homes, but it always takes more work to get the second half of subscribers than the first as you can no longer rely on early adopters finding their own way.

Broadband's enabling all sorts of new opportunities to improve healthcare, education, government, and business, with many people relying on it more than ever, yet only through strong national leadership will we be able to take isolated success stories and find a way to extend them all across the country.

These are all situations that will not resolve themselves quickly if left alone. Don't get me wrong, I'm a huge believer in the ability of the market to innovate and find new solutions, and I'm almost never a fan of legislation, but at this point what we need more than anything is leadership.

If we are to reach the goal of 100% penetration, 100% adoption, and 100% use, we need someone to step up and say this is where we should be taking the country in the 21st century. And no one person or office has the potential to do that more than the President of the United States.

Because of this, it's my fervent belief that one of the biggest responsibilities our next president will have in their first term is to claim the mantle of leading champion of broadband in this country, as no technology offers more promise to impact so many facets of society.

Which brings me to my final point: why these thoughts lead my vote to Obama.

To be honest, I've been a bit disappointed by Obama's support of broadband. At least publicly, he's blindly thrown his support behind net neutrality. He's begun to talk about the possibilities of incorporating online tools to make government run better, but he hasn't done nearly enough to talk about broadband's application to improving healthcare, education, business, and so on. In fact, if you go to his website and search for the term "broadband," you find very few specifics, and most of what's there comes from speeches in 2007, with very little said in 2008.

But on this issue, he almost gets my vote by default because of how far out of touch McCain is.

Go to his site, search for broadband, and you get a whopping five results. One is his support for a bill that would build a broadband network for first responders, which is great, but another is text from a 2006 speech of his citing the fact that broadband is impacting all facets of communication. Yet if he believed that to be the case, why hasn't he embraced talking about it elsewhere?

That's actually an easy question to answer: he doesn't know the Internet. He's admitted that he doesn't use computers, that he doesn't use the Internet or even understand it.

Now, I'm not expecting our next president to be available for me to ping via email, and I'm not trying to start a witch hunt for anyone who's left that doesn't use or understand the Internet.

But even still, we're living in the 21st century. There's no greater tool to improve our quality of life than broadband and the Internet. So how can we trust our future to someone who doesn't know about and believe in the transformative power of these new tools?

What this all boils down to for me is that it will be the responsibility of the next president to carry the flag for broadband, to provide leadership so that we may take the greatest advantage of what it enables, and to seize the opportunity to harness its power to transform society. But the only way these things can be realized is if we have a president who has at least some level of understanding about why it's so great.

Now if McCain's staff wants to embrace the opportunity to learn more and incorporate broadband-centric messaging into its campaign -- which they absolutely should -- then I'm more than happy to talk with them, or any politician regardless of political affiliation, about what's possible and what we can be doing to achieve those goals.

But for now, this broadband believer's vote is for Barack Obama.

Real World Sports Meets Virtual Gaming


Here's a really neat story about the future of racing games.

Basically what it says is that we're soon going to see games that allow players to race alongside real-life races in real-time.

I can't imagine anything cooler for fans of racing games, plus it's a unique example of what broadband is making possible in the gaming world beyond creating massive open worlds with increasingly higher quality graphics.

I don't think this model will work for many sports, but between NASCAR, F1, Indy, and maybe even some day horse racing and the like, there's going to be an audience that's plenty large enough to create demand for this, so hopefully we'll see this concept become reality sooner rather than later.

What Can PEG 2.0 Mean in the 21st Century?


PEG channels. Public access. Community TV. While all feature slightly different definitions, they all stem from the same idea: to enable more robust local community communication.

But like the TV medium that distributes this content, the future of local media is both uncertain and filled with unlimited promise. As I've written before, TV isn't necessarily the best distribution mechanism for local media; it's a broadcast technology being used to try and reach a narrowcast audience.

But what could PEG 2.0 mean in the 21st century? If you were given the opportunity to build a new system from the ground up in an environment rich with bandwidth, what would you do?

That's exactly the question that was posed to me yesterday by John St. Julien, a well-known pot stirrer and thought leader in Lafayette, LA, where the local utility is building a full fiber network. He's been asked to help figure out what should be done, and he came to me for advice.

The first thing I thought was how exciting a challenge this is given the nature of the opportunity. The unfortunate truth is that in most communities PEG channels are seen as a burden by the incumbent network operator, something they're forced to do in order to secure franchise agreements. In this case, though, the network operator is the local utility, and while their budget is far from overwhelming, their intent is quite simply to do everything they can to maximize the positive societal impact of the network they're putting into place.

Also worth mentioning is the fact that once built, the LUS network will have all the bandwidth a PEG system could ever want. As covered previously, every user on the network will have access to a symmetrical 100Mbps intranet.

While I'm not an expert in the details of the technology needed to make this happen, I wanted to share some initial thoughts on what PEG 2.0 can mean in a community with plenty of bandwidth and a network operator eager to cater to the needs of its constituents.

- Let PEG onto the intranet. When I last talked with Terry Huval, the head of LUS, he shared that, at least initially, the intranet would be limited to consumers and small businesses. Since local media is created by consumers, I'm hopeful this would already be covered, but if not, then this is an essential first step in allowing producers to reach consumers without any constraints on bandwidth.

- Enable multicast on the network. As I wrote about here, it's my understanding that network operators can enable multicast on their networks, thereby empowering all of their users to stream live from anywhere on the network to an audience of any size. This would mean that video from any live event could be made available for the community to watch without any great expense or complexity. (See the small sketch after this list for the basic mechanics.)

- Find a way for video to reach all screens. A primary part of this would be to get Internet video onto the TV through LUS's set-top boxes, but it also might include making sure video can be viewed on a cell phone.

- Get some storage for on-demand playback. Assuming you can get IP video onto the TV, forgo the need for LUS to put the channels into their traditional TV headend and instead focus on getting some basic IP video storage that can be used to deliver video to all screens. I'm not sure exactly what's needed, but I do know that doing this should save some hassle while simultaneously improving functionality.

- Allow all content producers to make their video available in one fashion or another. There'll almost certainly have to be some sort of filter in order to avoid offensive content making it into the system, but the key is to find a way to not only embrace all current local producers but also to encourage new producers to pick up the camera and join in on contributing to their local media.

- Empower individuals to create channels. Giving everyone their own channel might be a bit excessive, but what about giving people the ability to program their own channel by pulling from a giant library of content? In many full fiber networks, the ability to add channels is near infinite, so there should be room to support a large number of voices.

- Establish an avenue for capturing local creativity in technology development. It's unlikely that LUS will have sufficient resources to devote to developing new, innovative, interactive features that extend the functionality of their service. But that doesn't mean they can't enable that innovation by creating a sandbox and inviting the open source community in to play. By doing this they may be able to realize all sorts of innovations that could extend the functionality of local community media.
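As for the multicast idea mentioned above, here's a minimal sketch of the basic mechanics using plain UDP multicast. It's a toy illustration of the one-sender, many-viewers model, not a production video system: the group address and port are hypothetical placeholders, and a real community-media service would layer something like RTP on top and coordinate group assignments with the network operator.

```python
# Minimal UDP multicast sketch: one sender, any number of receivers on the
# same network. Illustration only; the group address and port are
# hypothetical placeholders.
import socket
import struct
import time

GROUP = "239.10.0.1"  # hypothetical intranet multicast group
PORT = 5004
TTL = 8               # keep packets on the local/regional network

def send_demo_stream():
    """Send a few numbered placeholder packets to the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, struct.pack("b", TTL))
    for i in range(5):
        # A real community-media stream would send encoded audio/video here.
        sock.sendto(f"frame {i}".encode(), (GROUP, PORT))
        time.sleep(1)

def watch_demo_stream():
    """Join the multicast group and print whatever arrives."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    while True:
        data, sender = sock.recvfrom(2048)
        print(f"received {data!r} from {sender}")

if __name__ == "__main__":
    # Run watch_demo_stream() on any number of machines, then run
    # send_demo_stream() on one of them; every watcher receives every packet
    # without the sender needing to know how many viewers there are.
    send_demo_stream()
```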

These are just some ideas to get the conversation started.

Now I want to turn the issue over to you all out there. I know there are a number of great, creative minds focused on the task of making PEG great. What would you do if given the opportunity to build a local community media system on a network with infinite bandwidth alongside a network operator whose top interest is making its city great?

Is there specific hardware you'd recommend? Or are there features and functionality not mentioned above? Or do you have thoughts that build on this initial list?

Whatever mindshare you can add to this equation will be more than welcome and it has the potential to guide decisions being made over the coming months in Lafayette.

I believe we have an incredible opportunity to imagine what the future can be down in Lafayette and make it come to life. And in so doing I believe that the lessons learned there will be able to help solve problems elsewhere as we all work together to make our country even greater than it already is through supporting stronger local community media.

Dallas, We Have a Problem - Too Much Broadband

| No Comments | No TrackBacks

Shocking news in the world of fiber deployment: Verizon is planning to overbuild AT&T's U-Verse fiber-to-the-node network with their own fiber-to-the-home FiOS service.

The implications of this move are staggering.

First off you've got two $100 billion a year corporations fighting over the same wireline customers. While AT&T and Verizon have long battled for wireless subscribers and business customers, this is the first I've heard of them going head-to-head offering triple play services to consumers, which in and of itself is significant.

Secondly, most of the rest of the country must be jealous of the fact that these communities are going to have more broadband options than just about anywhere. I know I am.

Third, it shows how competition is being encouraged by statewide video franchising, which Texas passed and which is what enabled Verizon to make this move.

Fourth, it's potentially devastating for AT&T. In terms of capacity, FiOS trumps everything they're trying to market as being new and improved. It was hard enough trying to convince people that U-Verse was better than cable; now they've got to compete with the fact that someone else is offering a full fiber network. Plus, once Verizon gets that fiber in the ground and a customer signed up, I think it's going to be hard for AT&T to ever get them back.

Fifth, this will be a fascinating situation to watch as it, perhaps more than any other FTTH build in the country, will demonstrate how vulnerable incumbents are to a FTTH deployment stealing their customers. I say this because everywhere else Verizon's deploying they are the incumbent, and most of the other deployments are either in greenfield developments or by municipalities, which offer an inherently different value proposition since consumer decisions are influenced at least in part by individuals' trust or distrust of the government.

But in the end, I hate this news.

Why? For one simple reason: if our goal is a fully wired country, then this is an inefficient use of resources.

Don't get me wrong, I'm very excited to see more communities get wired with a full fiber network. But it's upsetting to see some communities continue to get more investment when many others aren't getting any.

Making matters worse is what is likely to happen as a result of this decision by Verizon: more investment from Time Warner (the primary incumbent cable operator) and AT&T, which will not only widen the gap between the haves and have-nots but may also directly divert money from being invested in less competitive communities in order to help defend the customer base in more competitive ones.

If Time Warner and AT&T don't invest in upgrading capacity, they're likely to lose customers to Verizon. But if they do, we could eventually end up in a situation where there are multiple fiber pipes running to the same home, which is totally redundant.

Fiber optics are so robust that a single hair-thin strand can carry a staggering amount of traffic, on the order of terabits per second with today's equipment. The reason we have multiple pipes today is that they were first put in for separate uses: cable TV and telephone. But once you have fiber, one pipe to the house is enough to support all the video, applications, and services a household could use.
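For a rough sense of scale, here's a back-of-the-envelope calculation using typical dense wavelength-division multiplexing (DWDM) figures. The channel count and per-channel rate are illustrative assumptions, not the specs of any particular system.

```python
# Back-of-the-envelope capacity of one fiber strand lit with DWDM gear.
# Both numbers below are assumptions chosen to be representative.
wavelengths_per_fiber = 80     # e.g., C-band channels at 50GHz spacing
gbps_per_wavelength = 40       # a high-end per-channel line rate
capacity_tbps = wavelengths_per_fiber * gbps_per_wavelength / 1000
print(f"~{capacity_tbps:.1f} Tbps per strand")  # -> ~3.2 Tbps
```

Whatever the exact figures, the point stands: one strand has orders of magnitude more headroom than any home will need, which is what makes overbuilding fiber on top of fiber feel so wasteful.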

Because of this, the fact that we might see some homes get two fiber connections while so many others can't get even one suggests to me that there's something very wrong with this picture.

But I have to admit, this whole line of thought has me really torn.

I always support the further deployment of fiber, especially if it's all the way to the home.

But I also can't help worrying: wiring the country is such a massive, capital-intensive job that I'm not sure we can afford to waste money letting competitive markets continually overbuild the most attractive neighborhoods, especially if it means leaving other communities behind.

Is it possible for a community to have too much broadband? We might soon find out.

A new study just came out based on a survey of 1500 technology workers that shows they'd be willing to accept up to a 10% cut in pay if it meant being able to telecommute and avoid the hassle of coming into the office.

The sarcasm in the title of this post wasn't intended to be too thick; I just find it funny how often we seem to forget the basic truth that, given the choice, more people than not would rather work from home than go into an office.

But at the same time, the fact people are willing to take pay cuts in order to do so was eye-opening.

So what this is basically saying is that employers willing to aggressively pursue telecommuting programs are not only likely to attract more and better applicants, but those job seekers might be willing to take less pay.

But wait, there's more. With employees working from home, employers no longer need to pay for office space for them. Plus, no more leaving the office early to beat the traffic. In fact, one could argue that employers are likely to get more working hours out of telecommuting employees since they don't have to spend that time on the road.

And for the telecommuters, it means less money for gas, less wear and tear on the car, less risk of injury while on the road, and more time at home.

I mean, with all this staring us in the face, how can any company or governmental organization not want to embrace telecommuting immediately?

I know there are still issues to work out, different processes that need to be established, some assurances that people working from home are actually working, and so on. But none of these are insurmountable.

It seems to me like those companies willing to take the telecommuting plunge are going to have a leg up when it comes to attracting new hires and therefore they'll improve their competitive edge. And hopefully studies like this will start to open the eyes of companies of all shapes and sizes about the potential positive impact of adding telecommuting to their businesses.

Giving Credit Where Credit's Due

| No Comments | No TrackBacks

I admit I'm sometimes hard on cable companies, whether it's their P2P shaping, their lack of bandwidth, or the shared nature of their networks. But I have to give credit where credit's due.

While my Comcast cable connection only promises me 768Kbps of upload, it's often been topping 2Mbps recently. In fact, I just uploaded a 50MB VidChat (here's a teaser: it's with BitTorrent about some misconceptions of P2P) in minutes with the network supporting a sustained throughput of 2Mbps+ almost the entire time.

Don't get me wrong, I still have issues with them. Too often during busy times the network feels really sluggish when downloading, let alone uploading.

But 2Mbps upstream really ain't too bad. In fact, it's better than 95% of connections out there. (OK, so I admit I just made that stat up, but the truth still holds as the vast majority of consumer broadband, be it DSL or cable, is asymmetrical, offering no more than 1.5Mbps upstream.)

Also, I've heard sporadic reports from across the country that, at least in areas where someone's deploying a full fiber network, the cable companies are deploying fiber of their own to increase capacity.

Even better is that while cable companies used to pooh-pooh the need for fiber, now they're touting their own fiber optic networks wherever and whenever they can.

While this last part isn't my favorite thing in the world as it muddles the messaging of anyone deploying a true fiber-to-the-home network, at least we've got more people than ever evangelizing for the awesome power of fiber optics.

And I've had discussions with multiple people who point out that if cable companies wanted to, they could move to an IP-based video delivery system and open up a ton of bandwidth for broadband. While it doesn't seem likely they'll do so any time soon, it's still worth knowing that, if some outside force pushes them hard enough, they are at least capable of delivering big broadband speeds.

Though whether that'll ever happen in our lifetime to even the majority of America let alone the whole country, well that's another issue entirely...

The Significance of The Colbert Report on Hulu

| No Comments | No TrackBacks

Big news in the content world: Viacom has made some of its shows available on Hulu.com, the joint venture between major TV networks to offer full-length TV shows online.

In particular for me, two of my favorite shows are both now available on Hulu, The Daily Show and The Colbert Report.

But this isn't a story about how a show I like is available elsewhere. Instead it's a tale of the internal dynamics of the business of online video delivery.

To frame this, I've been watching full episodes of these shows for a while on ComedyCentral.com. In fact, they've been online as long as pretty much any first-run TV show.

But watching them was always an exercise in frustration. The reason for this was the placement of ad breaks. Not only were there ads during the natural commercial breaks, but they'd often appear after every comedic bit rather than after every segment.

Making matters worse was the haphazard nature of the ad breaks. I can't tell you how many times the punchline of a joke gets cut off while watching on ComedyCentral.com.

Now, trying to squeeze in more ads makes some sense, but it always felt awkward and forced. It just didn't make any sense why Comedy Central was sabotaging the online viewing experience of these shows.

Until I read an interesting tidbit somewhere a couple of weeks ago: Viacom had made the decision to cut everything up into short clips so as not to upset cable operators.

It's important to understand that the current cable TV system is set up so that your cable TV provider pays a fee to carry channels like Comedy Central. Needless to say, when media companies make that same video available for free online it doesn't make the cable guys all that happy. In fact, some are starting to wonder why they're paying for content that's being given away elsewhere.

By cutting shows into short clips, the thought was it would help placate these concerns.

But something's changed--my guess is the financial success of Hulu.com--and now you can watch full-length episodes of The Daily Show and The Colbert Report on Hulu.com in their original form with their original ad breaks.

This is a big win for consumers, especially those like me who have foregone paying for cable when so much video's available online for free.

Where this all leaves us I'm not entirely sure. Content owners realize that if they don't get in the game online they're likely to be left behind. But by doing so they're calling into question their relationships with cable TV operators.

The final variable in all this to consider is the financial model of delivering content. On cable TV, content owners are getting paid to have their shows distributed. But online, content owners have to pay to deliver their show. Additionally, on TV they're able to run 6-8 minutes of ads for every half hour show whereas online at best it's 3-4 minutes, so less revenue.

A great piece to read about these conundrums was written by Mark Cuban a few weeks back.

The old model of media distribution is rapidly breaking down, and while I'm happy to report that at least in the battle over The Colbert Report we, the viewers, have won, the future is entirely uncertain.

Andrew Odlyzko is the leading researcher of demand for bandwidth on the Internet. While some have been talking about explosive growth, his research has quietly and consistently shown that while Internet traffic was at one point doubling every year, its growth is now pegged closer to 50% annually.

I just came across another article about a presentation he gave at a recent conference, where he shared that in a place like Hong Kong, where a lot of people subscribe to ultrafast broadband connections, users consume six times the bandwidth we do in the US, yet their traffic growth rate is less than 20%.

That's an interesting thought to consider as it suggests that the presence of big pipes alone will not drive exponential growth in demand for bandwidth.

One question it raises is whether or not these lower growth rates are upper end limits or merely temporary plateaus.

Of course, some of this has to be attributed to the fact that the higher the overall demand the harder it is to realize a high percentage growth rate.

But even still I find the position we're in kind of odd. There's no denying that demand for bandwidth is growing. Yet Odlyzko's studies show that perhaps it's not expanding as fast as we first thought.

But then there's the reality that there are a host of high bandwidth applications on the horizon. Plus the fact that most Internet users today don't use high bandwidth applications. So the possibilities are there to ignite another round of exponential growth.

Not only that, but that growth in demand seems almost inevitable given the presence of ever-larger pipes and increasingly powerful applications.

For example, when a user goes from watching a YouTube video to watching a full-length episode of Lost in HD, that's not just a doubling in demand for bandwidth, it can be a tripling or quadrupling.

Or what happens when we start moving from small-window videocalls to fullscreen? That again is several times more bandwidth, say from 200Kbps to 1Mbps, and I'm not even talking yet about HD or uncompressed HD.
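To put some rough numbers on how quickly these jumps add up, here's a quick calculation. The bitrates are assumptions for illustration only; actual streams vary widely by codec and resolution.

```python
def gigabytes_per_hour(bitrate_mbps: float) -> float:
    """Convert a sustained streaming bitrate (Mbps) into data moved per hour (GB)."""
    return bitrate_mbps * 3600 / 8 / 1000  # megabits/hour -> megabytes -> gigabytes

# Assumed, illustrative bitrates -- not measurements of any particular service.
for label, mbps in [("low-res web clip", 0.3),
                    ("standard-definition episode", 1.5),
                    ("compressed HD stream", 5.0),
                    ("fullscreen videocall", 1.0)]:
    print(f"{label}: ~{gigabytes_per_hour(mbps):.2f} GB per hour")
```

Even at these modest assumptions, a household that shifts a few hours a day of viewing from low-res clips to HD multiplies its usage several times over, which is the dynamic that makes forecasting demand so hard.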

And the reality is we might end up having capacity run ahead of demand, so once more users start waking up to the possibilities of rich media Internet applications they may skip YouTube and go straight to HD; they may bypass small-screen videocalls for full-screen if for no other reason than they can.

Yet at the same time, a theme I consistently hear is that while getting the early adopters on board is easy, convincing the next group of people to join in the broadband revolution takes a lot more work. So even though we've got more people going online and getting broadband every day, equipping them to take advantage of high bandwidth applications won't be easy.

But yet again there's a potential for explosive growth as I'm a firm believer that if you can just sit down and show someone what's possible, a light bulb will go on and they'll start transitioning into the mindset of a power user.

So here we are, in a position where the reality doesn't match up to the promise and an uncertain future lies ahead. What would happen if a bunch of celebrities or politicians started pushing the benefits of video-on-demand online and videocalls? It's quite easy to imagine a tidal wave of interest crashing onto the Internet.

The reality is that any upsurge in demand can take down individual servers quite easily and could even threaten large swathes of networks; as anyone who follows this will tell you, the Internet can't yet support TV-sized audiences.

But at the same time, all trends point to modest growth that will gradually increase over time.

The scary thing is that ultimately we have no control over this. Part of me wants to encourage this huge upswell in interest, as that's ultimately one of my big goals: to get everyone engaged with using the Internet to better their lives. But another part of me worries about what might happen if I'm successful in that goal. The networks today aren't capable of handling unlimited demand, and the long-term trends driving investment don't suggest a near-term need for much more capacity than we have today, making it less likely the networks will be there to support a significant uptick in demand should it materialize.

What troubles me most is that this issue frames the decisions being made about how much to increase capacity, and yet there's a giant gap between the reality and the potential, and it's deeply uncertain how quickly we'll close it. Will it be slow and steady growth from here on out? Or will something happen to ignite a firestorm of demand?

Fundamentally I'm a big believer that the demand for bandwidth is there; it's just bottled up by the fact that most people don't yet understand what broadband can mean for their lives. And I do believe we can soon realize a future where it's not just the early adopters using high bandwidth applications.

But the most worrisome part is we really don't know what the future will hold. Some predict even less than 50% growth, but what if the opposite is true?

What happens when someone decides to put the Super Bowl online and everyone shows up at the same time?

What happens during the next big national emergency and everyone's going online to find out more information?

Will the Internet be able to support these surges in demand? The truth of the matter is no one knows.

And unfortunately the only way we're likely to find out is to have a surge that overwhelms the Internet, leaving us without its power potentially during a key time.

I wish there were a way around this but I'm not sure there is. But I do know that unless we can decide where we're at, either riding a rising tide or poised on the precipice of a crashing wave, we can't properly prepare to survive the aftermath.

And it's part of my life's work to prove that the wave is not only cresting but it's ready to crash in many splendid directions as we all start to realize the full power of broadband in our lives.

Visualize How Obama Won

| No Comments | No TrackBacks

Here's a neat tool from the New York Times that shows how the vote broke down demographically in the recent Democratic primary.

Click on categories or arrows to switch views. Roll over individual states to get their specific numbers.

Not only is this tool informative, it's also kind of cool in the way boxes fly around as you switch from one stat to another.

It's a terrific example of how computers and the Internet are expanding the ways we can visualize information. Just think of how you used to have to look at info like this before: a long list of boring numbers.

Now that info can be presented in a way that's visually appealing and that makes digging down into it easier than ever.

How P2P Piracy Enforcement is Like Iraq

| 2 Comments | No TrackBacks

In reading about the ongoing battle over P2P piracy, I can't help but draw analogies between this devolving situation and Iraq.

First off, both are now littered with hired guns. Iraq has Blackwater, Hollywood has MediaDefender, "the leading provider of anti-piracy solutions in the emerging Internet-Piracy-Prevention industry," according to their site.

Instead of guns, MediaDefender maintains 2000 servers and a 9Gbps connection that they use to seed fake files into illegal torrent tracking sites. These trackers are what help P2P users find illegally distributed content. It's MediaDefender's intent to both frustrate users by introducing fake files as well as to bring down the tracking sites by overwhelming them with denial of service attacks.

But like in Iraq, sometimes innocent bystanders are getting caught in the crossfire of mercenaries, as happened over Memorial Day weekend when MediaDefender's system brought down the website, RSS server, and internal corporate email of Revision3, a legitimate content producer that legally distributes their content via BitTorrent.

Read the article linked to above as the situation was a bit more complicated than this, but the gist is that these overzealous mercenaries attacked a site simply because of the company it kept.

But these Iraq analogies just got deeper in a disturbing way. Researchers at the University of Washington have recently shown that the efforts by content owners to identify pirates may be fundamentally flawed as any user can be framed for copyright infringement today even if they didn't do anything. Even without being explicitly framed or even having ever used P2P software, users can attract attention from these systems. And to top it all off, it's not just users that have to be wary, even things like networked printers have been shown to trigger a response from the systems searching for pirates.

Here too the analogy to Iraq stands, where our armed forces are having trouble separating the terrorists from the average law-abiding citizens.

So in Iraq we have an overwhelmed system of enforcement (the military) that's had to rely on bringing in mercenaries but which can't properly identify the real targets and in turn is resulting in innocent bystanders getting caught up in the crossfire.

Drawing this analogy even further, you've got a prevailing sense amongst those in power (the government and content owners) that regardless of whether these policies are misguided, abandoning them now, the equivalent of withdrawing the troops, would cause utter chaos.

And while the analogy breaks down somewhat when discussing solutions, the spirit seems to be the same, namely that we need to find a way where we don't turn everyone into criminals and where law-abiding citizens can get what they need legally.

My apologies to anyone with family in Iraq or the military if this analogy seems flippant as obviously the issues being dealt with over there are matters of life-and-death as opposed to an industry geared towards creating fantasy worlds to help people escape from the horrors of these real-world situations.

But the analogy still stands, and with both I feel the same sense that there's got to be a better way than staying the course. That we need to find a way to step beyond this status quo and start finding real solutions that not only protect the rights of both consumers and content owners but that are also sustainable over time.

Otherwise I worry that the efforts to fight the threat of piracy will continue to mirror that of our attempts to defeat terrorism, leaving us stuck in quagmires and not focused on directing all our energy towards making the world a better place for everyone.

I just read this Washington Post interview with Steve Ballmer, the head of Microsoft.

At one point he stated his belief that within the next 10 years all media will be delivered over the Internet.

While I generally agree with this prediction, he didn't go on to address the fact that a few other things have to happen first.

The biggest of these has nothing to do with the Internet or new media distribution models. The only way we're ever going to replace paper is with a display technology that mimics it, a concept generally referred to as epaper.

The simple truth is that consumers like the feel of opening up a newspaper and holding a book. And despite promises of the Internet and computers getting rid of the need for paper, the opposite has been true as they've arguably created more paper usage than ever before. Why? Because people like paper.

I think sometimes we online advocates suffer from a bit of conceit, assuming that since the Internet is so wonderful that it can cure all ills. But the truth is that there are almost always other pieces to this complex tapestry that need to be resolved first before we're able to take full advantage of the distribution possibilities of online delivery.

All this being said, even though Ballmer's citing of 10 years is kind of generic and almost a cop out, at the same time within the next 10 years epaper should be a reality.

Already technologies are on the market like the Amazon Kindle ebook reader, and working prototypes exist for bendable screens that can roll up inside of a cell phone, as well as epaper screens capable of displaying color and video.

Eventually epaper promises to be produced almost as cheaply as regular paper, so much so that we're going to start seeing these screens everywhere, from cereal boxes to pill bottles to posters to fruit labels.

We can't just say that the Internet will replace a traditional form of media without acknowledging the role of enabling technologies like epaper, otherwise we risk sounding like we're repeating the over-hyped promises of the Internet that led to the bubble a decade ago.

Tune In To Media Reform Online

| No Comments | No TrackBacks

There's going to be an interesting conference starting today and running through this weekend called the National Conference for Media Reform.

They've got a number of well-known speakers talking about the challenges and opportunities of media in the 21st century.

You can tune in and watch many of the keynotes, some of the sessions, and an unknown quantity of complementary coverage by going to this page and clicking through the links found therein.

Moving forward I'm going to try and do a better job linking to live webcasts. I have to admit, I'm not a big watcher of them as they don't always make for the most compelling video, but at the same time if they're talking about things that interest you and you can't physically be at the event, then there's no better way to be able to stay informed.

On the latest edition of App-Rising.com's VidChat, you're in for a real treat as I sat down to chat with Michael Johnston, VP of IT for Jackson Energy Authority (JEA), to discuss the challenges network operators face dealing with so-called bandwidth hogs.

What's interesting is that while JEA is a municipal utility in Jackson, TN that's deployed a full fiber network, they're facing the same issues as the multi-billion dollar for-profit enterprises we normally hear from on these issues. Additionally, Michael is a dynamic speaker who in addition to understanding the technology also recently got his MBA so he can speak to the business challenges just as eloquently. Plus he's one of the nicest guys I've met.

So enjoy!

Now for the followup notes:

- Here's a link to JEA's website.

- "PON" stands for Passive Optical Network, which describes the kind of technology used in their full fiber deployment.

- He makes the good point that many different types of systems aren't designed to handle everyone showing up at the same time. If everyone picked up their phone at the same time, the phone system would crash. If everyone drove to work at the same time, you'd have massive traffic jams. If everyone turned on their appliances at the same time, it would overwhelm the grid. All of this is analogous to broadband networks, which were built on the assumption that not everyone would be using them fully at the same time (see the quick oversubscription sketch after these notes).

- When talking about high-bandwidth users, he cuts through the clouds and firmly states that it's not gamers or YouTube that's causing strain on these networks, it's purely P2P as those apps are overwhelming the assumptions that not everyone will be using their connections all of the way all of the time.

- But he also clearly asserts that what those users are doing isn't wrong, it's just that P2P apps are at odds with the business models of network operators.

- When Michael mentioned an article of mine coming from left field, he's referring to my riff on tiered pricing from earlier this week.

- He shares the simple fact that the big for-profit service providers are dealing with the same challenges he is as a smaller non-profit entity.

- He then goes on to say that the network they've built is capable of delivering 100Mbps symmetrical to every home, but that he can't afford to deliver those speeds because of the cost of backhaul bandwidth and the fact that P2P users upset the economics of a shared network.

- I was really glad to hear him make the point that the cost of bandwidth within their access network is essentially free, but as soon as you upload a video to YouTube or participate in a global peer-to-peer network they have to pay to support delivering that traffic outside of their network.

- He also makes an interesting point when he admitted that today their only answer to these problems is to squeeze margins, which is especially painful given that they're trying to pay off the massive investment they made in building a full fiber infrastructure.

- One thought I haven't heard much before is that to start metering bandwidth requires yet another large investment and adds a lot of complexity to a system that's normally only interested in moving information through a router as quickly as possible. This makes it a challenge for smaller operators. Plus there are issues surrounding how you prove to users that they owe more money when you send them the bill. Those weren't arguments he seemed eager to get into. And even if the model's successful, it's not like they can turn this up overnight.

- I loved his reaction to my point that everyone's trying to figure out how to get people to pay for things online, which was, "Heck yeah, the network operators are trying to figure it out too."

- He went on to highlight the disparity in investment between network operators and applications developers, citing that while an application developer can put a few million into R&D and then go after a nationwide audience, JEA has had to put in $50 million to only reach 30,000 customers. While he's trying to foster innovation by providing capacity, the reality is he has a lot more invested per customer and they need to get that money paid back. And again, Michael's not talking about this through the lens of trying to drive a profit; he just wants to pay for the network.

- Paxio is a small deployer of FTTH to greenfield developments in the San Francisco area. They're a really interesting provider who I'll be trying to get onto another VidChat soon.

- In talking about net neutrality, Michael voiced strong support for letting the market decide what should happen. He pointed to the fact that the market already seems to be working in instances like the Comcast case as he's seen reports that have shown their subscriber count took a hit following the news breaking that they were interfering with P2P traffic. He's also terrified of what's going to happen once anyone tries to turn this into a law as then it'll be left up to the lawyers and the legal system to determine what the law really means, which is often a messy, expensive, drawn out process.

- You can look forward to many more conversations with Michael. I really appreciate his perspective as someone in the trenches in a smaller community with a full fiber network; it offers insight into the challenges network operators face that's free from questions about how much profit motives are driving the decisions he makes and the positions he takes.
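To illustrate the oversubscription point from the notes above, here's a tiny sketch of a contention-ratio calculation. The access speed and subscriber count echo figures mentioned in the conversation, but the backhaul number is a made-up assumption purely for illustration.

```python
def contention_ratio(subscribers: int, access_mbps: float, backhaul_mbps: float) -> float:
    """How oversubscribed the shared backhaul would be if every subscriber
    tried to use their full access speed at the same moment."""
    return subscribers * access_mbps / backhaul_mbps

# 30,000 customers at 100Mbps are figures from the conversation above;
# the 10Gbps of backhaul is a hypothetical number for the sake of the math.
ratio = contention_ratio(subscribers=30000, access_mbps=100, backhaul_mbps=10000)
print(f"{ratio:.0f}:1 oversubscription")  # -> 300:1
```

That kind of ratio is perfectly workable when usage is bursty, which is exactly why a relatively small number of always-on P2P users can upset the economics of the whole system.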

Metered Bandwidth Not New

| No Comments | No TrackBacks

I've been reading a lot of reactions to Time Warner's decision to test out bandwidth caps with overage charges, and there's one common thread I notice in many of the comments that isn't always included in the articles: the concept of metered bandwidth is nothing new.

You used to get dialup access by buying minutes until consumer demand was found to be greatest for the all-you-can-eat model.

And many satellite providers have daily caps that will even cut your service off entirely for the next day if you go over, though unfortunately there aren't any real alternatives for most of these users, as the only people I know using satellite are doing so because they can't get connectivity any other way.

So while this does set a precedent in the cable world and possibly even the wireline broadband market, it's not new.

[Update: I hit publish a little too early as I just read that while this does set a precedent in the US market, internationally some countries have been using bandwidth caps for quite some time. In the UK, for example, apparently if you agree to a bandwidth cap you can pay less for service. Now that's metered bandwidth I can get behind, where caps are used to lower the cost of service rather than raise it.]

Sometimes I'm too critical of government and its inability to craft effective broadband legislation, so it's important that I give credit where credit's due and celebrate those times when government gets it right.

Here's just such an example: the House passed the Telework Bill, which will afford qualified government employees the opportunity to work from home and telecommute two days every two weeks.

Now the bill is not yet law, but all signs point to it becoming so soon, as a similar bill has already passed through the Senate Committee on Homeland Security and Governmental Affairs.

I love this effort for a lot of reasons. It's helping legitimize telecommuting. It's proving that the government is opening up to the possibilities of using broadband to drive new efficiencies in its operations. And by being proactive I believe government is going to set a tremendous example for the private sector, which will hopefully spur adoption there as well.

One other thought worth noting is the relevance of telecommuting to homeland security. Some might wonder how it's relevant, but there's actually a very good reason: what happens in the event of another terrorist attack, or a natural disaster, or just really bad weather? What happens when the people who help run the government can't come into work? Do the gears of government come to a grinding halt?

Not if everyone's empowered with the ability and know-how to telecommute from home.

Of course, not every government service is effectively administered from home, but if we were to have a robust, decentralized system for working it'd make us much more resilient and flexible when it comes to reacting to large-scale emergencies.

So great job, Congress! Keep up the good work in realizing the power you hold to help drive the broadband revolution.

On Thursday, everyone who signs up for broadband service from Time Warner Cable in Beaumont, TX will be subject to a new metered bandwidth policy.

Bandwidth caps will range from 5GB a month for the low-end 768Kbps service up to 40GB for anyone signing up for service at 15Mbps, with overage charges being $1 per additional GB.

There's been a fair amount of controversy surrounding this topic.

Advocates of metered bandwidth cite the need for network operators to find ways to recoup the costs associated with the heaviest users, or bandwidth hogs, which the all-you-can-eat model doesn't allow for.

Critics claim this is just another example of network operators getting greedy and that metered bandwidth has proven unpopular since the days of dialup.

I've long advocated the position that the business model of broadband does need to be reworked, but I'm hesitant to embrace models like the one described above.

First off, the cap is too low. If all you're doing is email and web surfing, then even 5GB is plenty, but as soon as you start watching, sending, or using video in any sort of intensive way, you'll quickly be butting up against those limits, even at 40GB.

As I've written about before, with my wireless connection, which doesn't have near the capacity of my cable modem, I've managed to top 1GB in a day just by watching an hour or two of TV online. No high-bandwidth P2P apps and no HD, just watching videos on websites.

But my issue isn't so much the size of the caps, it's what happens once you go over.

$1 a GB holds the potential to severely damage a number of online business models, especially those aiming to deliver high quality video.

Take Netflix, which has been pushing a new streaming service that allows you to watch movies on-demand rather than waiting for them in the mail. As of right now, there's no additional cost to watch these movies. But in a metered world, once you're over the cap, every time you watch another movie it's going to cost you at least a dollar or two.

Or any of the many online rental outlets. Renting a movie for a couple bucks may seem like a deal, but what happens if its cost doubles because of overage charges? You're not getting more value, you're just paying more money.

In some ways, this reality may bode best for download-to-own as adding a couple dollars to a $10 purchase isn't quite as painful and you end up owning what you're downloading.

That said, what happens as we move further into an HD world? Even compressed, an HD movie takes 6-8GB, meaning an extra $6-8 every time you download one after you're over your limit.
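To make the math concrete, here's a quick sketch of what a month could look like under these terms. The 40GB cap, $1/GB overage, and 6-8GB-per-HD-movie figures come from above; the base price and viewing habits are assumptions for illustration only.

```python
def monthly_bill(base_price: float, cap_gb: float, used_gb: float,
                 overage_per_gb: float = 1.00) -> float:
    """Monthly cost under a capped plan with per-GB overage charges."""
    overage_gb = max(0.0, used_gb - cap_gb)
    return base_price + overage_gb * overage_per_gb

# Assumed: a $55/month tier carrying the 40GB cap described above.
hd_movies = 10          # hypothetical viewing for the month
gb_per_movie = 7        # midpoint of the 6-8GB figure
usage_gb = hd_movies * gb_per_movie           # 70GB
print(monthly_bill(55.00, 40, usage_gb))      # -> 85.0, i.e. a $30 overage
```

In other words, a fairly ordinary month of HD viewing adds more than 50% to the assumed bill, which is exactly the kind of surprise that sours people on using online video services at all.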

The reason this worries me so is that everyone on the Internet is already having a helluva time getting consumers to pay for anything, and now we're going to be potentially burdening them with additional charges. Even worse is that these charges apply to all traffic, whether consumers are paying for it or not. So now all of a sudden, watching videos on ABC.com or YouTube isn't free anymore.

Soon parents won't be worried about their kids talking too long on their cellphones but instead they'll have to keep an eye on how much video they're watching online.

How exactly is this going to spur innovation, adoption, and use of the Internet?

Now, I don't want to ignore the plight of network operators. They have a legitimate right to make money off of their networks, but I wonder if there might be a better way. Maybe a model where bandwidth caps are priced separately from speeds.

So you can get 1Mbps, 5Mbps, or 10Mbps service for $10, $30, or $50 a month.

And you can get a bandwidth cap of 10GB, 50GB, or 100GB for $10, $20, or $30 a month.

Then let consumers mix and match as they see fit.

That still leaves the question of overage charges unresolved.

Ideally, instead of charging per GB, what I'd like to see is a system that automatically bumps a user up into a higher bracket when they go over. Or alternatively they could be given the option of bumping up to a higher bracket or having their service degraded. You could even have an option to pay by the GB, but make it an option, not a mandate, and offer other avenues as well.
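Here's a minimal sketch of how that mix-and-match pricing might be computed, using the hypothetical tiers listed above. The auto-bump behavior and the per-GB fallback are just one way to flesh out the idea, not a description of any real plan.

```python
# Hypothetical tiers from the post: speed priced separately from the cap.
SPEED_TIERS = {1: 10, 5: 30, 10: 50}          # Mbps -> $/month
CAP_TIERS = [(10, 10), (50, 20), (100, 30)]   # (GB cap, $/month), ascending

def monthly_charge(speed_mbps: int, chosen_cap_gb: int, used_gb: float) -> float:
    """Pay for your speed tier plus whichever cap bracket actually covers your
    usage, auto-bumping to a bigger bracket instead of charging per GB."""
    total = SPEED_TIERS[speed_mbps]
    for cap_gb, cap_price in CAP_TIERS:
        if cap_gb >= chosen_cap_gb and used_gb <= cap_gb:
            return total + cap_price
    # Past the largest bracket, fall back to an optional per-GB charge.
    top_cap_gb, top_price = CAP_TIERS[-1]
    return total + top_price + (used_gb - top_cap_gb) * 1.00

print(monthly_charge(5, 10, 8))    # stayed within the 10GB bracket -> 40.0
print(monthly_charge(5, 10, 42))   # auto-bumped to the 50GB bracket -> 50.0
```

The appeal of the auto-bump is that the worst case for going over is simply the price of the next bracket, rather than an open-ended per-GB bill.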

The key is users must always know where they're at relative to the limit. Time Warner's solution is to have a meter on their website. I'm not sure if that's sufficient as I'd prefer to see them offer a widget that can reside on the desktop as a constant reminder of where you are.

My biggest fear of all about this topic is that wireline network operators will try to make these overage fees into a profit center in the same way that wireless carriers charge for going over your monthly minutes allotment.

There's no reason I should be charged a quarter per minute just because I'm over my limit other than to line the pockets of my service provider.

That can't be what this turns into.

So long as metered bandwidth is put in place simply to balance the scales, making sure light users aren't paying more to subsidize heavy users, we're going to be fine.

But if broadband providers start getting greedy, I worry that we're going to end up not only stifling innovation online but also harming people's trust in the Internet and dissuading their usage, which may ultimately lead to users not just jumping to another provider but potentially walking away from the Internet altogether.

I do believe we need some solution related to metered bandwidth, but we must be vigilant in ensuring that whatever system is put in place not only addresses the needs of network operators but also puts the interests of the user at the forefront.

Telepresence Goes 3D

| No Comments | No TrackBacks

Wow! That's the best word to describe this.

Cisco's Telepresence has always had a high wow factor, but go watch the video on the page linked above to see something really incredible: Telepresence in 3D.

Cisco partnered with Musion, a UK company that has developed a 3D projection technology, to create a version of telepresence that instead of using large HD screens projects the images of the people you're talking to in 3D.

The effect it creates on stage is fantastic. In fact, if you didn't know it was a 3D projection of a remote video stream, you might not know that the two guys on the right aren't actually physically there on stage.

They specifically refer to this demo as being intended for the stage, so I don't know how effective it is today in terms of enabling a robust dialog, but I couldn't help but be intrigued by the main speaker's comment that what we're witnessing here is the future of communications, and that the intent is to take this technology to the mass market some day.

So the promise of having a remote conversation that looks like the person's in the room with you isn't something reserved for the distant future but is already possible today.

What a world we live in!

Video: The Power of the Grid

| No Comments | No TrackBacks

While I've written about grid computing before, sometimes video is the best way to tell a story.

Here's a YouTube video I just found. Though it doesn't go into any great detail or provide all that much new information, it's short, well-produced, and helps get some good thoughts across regarding what grid computing is and why it's significant.

Check it out!

Cool Visualization Makes Me Hungry!

| No Comments | No TrackBacks

I've linked to interactive 3D panoramas before, but never one that made me hungry. That is...until now.

Check out this spread. That's one tasty looking buffet!

Also, you can click on the button in the lower righthand corner to blow it up to fullscreen. (Press "esc" to go back to the browser.)

I'm still not sure if anyone's figured out how to use this technology for anything other than creating gee-whiz demonstrations, but gosh darn it I still can't help myself from saying "Gee whiz!" every time I come across another one.
