August 2009 Archives

Why We Need Fiber For All


A site called FiberforAll.org is holding a contest encouraging people to write blog posts explaining why every American deserves to benefit from the power of fiber. The winner gets a FiOS Triple Play bundle of a TV, phone, and netbook, though just the hardware, no services.

Given my position as one of the leading advocates for a full fiber future, and in particular my strong belief that all Americans deserve equal access to the world-class infrastructure that is fiber, I thought I'd throw my hat into the ring.

For me, the reason we need a fiber pipe to every last home is simple: a full fiber future represents the ultimate realization of the Internet's full power and potential.

Let's think about this for a second. First we must understand that the Internet is a series of interconnected fiber optic networks zigzagging across the country and spanning the globe. Second we define broadband as those last-mile access networks that provide users a way to get onto the Internet at speeds faster than dialup.

Traditionally broadband was delivered over copper wireline (like DSL or cable) or wireless networks, but these technologies all have clear limitations. For DSL it's overall capacity and service that degrades the further from the central office you are. For cable it's the inability to support lots of simultaneous usage and the lack of upstream capacity. For wireless it's an amalgam of all these, including lack of capacity, lack of reliability, and issues with distance.

Fiber, on the other hand, has no such limitations.

When it comes to bandwidth, fiber's unbeatable. It's already delivering 100Mbps and even 1Gbps to homes today, and we still don't know how much data we can fit through a fiber pipe: in the lab, a single hair-thin strand of fiber can support all the world's Internet traffic.

But fiber's advantages aren't limited to bandwidth. Fiber also features the lowest possible latency, which keeps real-time applications free of lags and delays, as well as unmatched reliability, since there are fewer electronic components in the field that could break. Plus fiber can handle more simultaneous usage than anything else.

The arguments against committing to a full fiber future, where every American can benefit from this world-class infrastructure, are twofold:

- Fiber's too expensive.
- We don't need that much capacity.

But let's consider these assumptions a bit further.

First off, today it costs the same to lay a new fiber network as a new copper network. Therefore there's no reason to continue deploying new copper given how much better performance fiber can deliver.

Secondly, it's way cheaper to maintain and upgrade a fiber network than pretty much any kind of broadband, so even if it's more expensive upfront it can be the same or even cheaper in the long run.

Thirdly, even if fiber is more expensive, it provides the peace of mind of having an infrastructure in place that can keep up with the demands of the 21st century. Every other broadband technology can only hope to be relevant for the next 5-10 years before we'll have to make massive investments in upgrading network capacity again. Better to make the right investment the first time.

Finally, the irony in all of this is that every other broadband technology relies on getting fiber closer to users to upgrade their capacity. So why just keep taking incremental steps that permanently leave us behind the curve when we can get out ahead and take a giant leap into the future?

On the bandwidth front, the arguments against fiber are based less on fiber itself and more on the fact that many operators are trying to protect their legacy investments in 20th century broadband infrastructure, and they don't want to believe that so much bandwidth will be required that their old networks become irrelevant.

But the data clearly shows that the future will require fiber. In the Need for Speed report put out by ITIF, they postulated that the average home of the near future will require at least 90Mbps of symmetrical bandwidth. Currently only fiber can deliver that much upstream bandwidth.

And these demands are set to grow exponentially in the mid-term. While the Internet of today can't handle true HD video delivery, HD isn't the top end of video quality. There's QuadHD, which has four times the resolution of HD, and there's UltraHD, which has 16 times the resolution of HD. Within the next ten years that's going to enable an experience where your entire wall becomes a computer, the video fills the whole thing, and it almost looks 3D without special glasses. Delivering even one UltraHD stream, even with a ton of compression, will require at least 100Mbps symmetrical. And if a household wants to use multiple streams, that means hundreds of Mbps are required, which only fiber can deliver.
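To put rough numbers on that, here's a quick back-of-the-envelope sketch. The ~100Mbps-per-stream figure is just the estimate cited above, not a measured requirement:

```python
# Rough household bandwidth math for UltraHD video.
# Assumes ~100 Mbps per heavily compressed UltraHD stream, per the estimate above;
# real-world codec efficiency will vary.
MBPS_PER_ULTRAHD_STREAM = 100

for streams in (1, 2, 3, 4):
    required = streams * MBPS_PER_ULTRAHD_STREAM
    print(f"{streams} simultaneous UltraHD stream(s): ~{required} Mbps symmetrical")

# A multi-stream household quickly climbs into the hundreds of Mbps,
# which is the capacity range that, today, only fiber delivers to homes.
```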

Even if you still question the need for a fiber pipe to every last home, there's no denying fiber's importance to America's broadband future.

For example, all community anchor institutions need to have access to fiber today. Many are currently stuck paying hundreds if not thousands of dollars a month for T1s that don't provide adequate access. Anywhere there are dozens or hundreds of people trying to get online, they need fiber.

Also, the next generation of wireless requires fiber be laid to every last cell tower. Without fiber, we can't have truly next-gen wireless.

And given this reality, there's no doubt in my mind that eventually the demands for bandwidth being realized in bigger buildings will trickle down to households as that's what's always happened in the past.

For me the question isn't if we need fiber or if we can get it, it's when will it happen. Will it take us five years to wire the country with fiber, or fifty? Because some day everyone's going to need it. We just have to decide if we want to be a leader in the development of this next generation of the Internet, or a follower.

And finally, while we're making these decisions, we cannot afford to allow groupthink to lead us down a path where we accept setting lower standards for rural areas, where we throw up our hands saying "This is too hard/expensive" and essentially doom rural communities to a future as second-class digital citizens.

Every American deserves access to world-class broadband infrastructure. And the world has chosen fiber.

So the question becomes: "Is America ready to step up, be bold, and embrace the simple reality that fiber is our future? And if so, are we then prepared to start taking the bold action that's required to transform this from a good idea into reality?"

I say, "Yes! The time has come for us to seize the day and show the world why America has been an economic superpower for the last 100 years so as to lay the groundwork for us to maintain our dominance for the next 100 years."

Because without fiber, our broadband future is bleak.

Let's Be Patient With The FCC


I've been thinking about something recently: I think we're being too hard on the FCC and the deliberate pace they're setting in crafting a national broadband policy.

Take for example their not including a representative from a municipal broadband deployer in their broadband deployment workshops. I, like many others, got worked up into a bit of an uproar over that perceived slight.

And yet just yesterday I learned through a comment to that article from Chris Burns, GM for Burlington Telecom, a municipal full fiber operator in Burlington, VT, that the FCC did in fact invite him to participate in those workshops.

Now, this doesn't completely absolve the FCC, as I still have to question why they couldn't find another municipal representative, I'm not sure why they didn't just say what happened when they started getting criticized, and I still think they should be facilitating a deeper analysis of the good and bad of municipal broadband.

But at the same time this highlights that we shouldn't assume anything. For me, this suggests that it's time we take a step back, give the FCC the benefit of the doubt, and let this process play out a bit further before jumping to conclusions that the issues that are most important to us are being ignored.

Don't get me wrong, I completely understand where this pent up frustration comes from.

For years the FCC has been perceived to prioritize the interests of incumbents over the interests of communities. For years advocates of municipal broadband have felt marginalized by everyone in DC. For years we haven't had a national broadband plan that clearly defines the public interest in this space.

And I certainly have my doubts about the process of crafting a national broadband plan that's taken place so far. I sometimes wonder if these workshops are more dog-and-pony show than substantive discussions, and I worry that we're still stuck with the same dynamic of everyone shouting their positions at government while tearing down the arguments of others and the FCC's left stuck having to play referee and to sort through the static to find underlying truths.

I'm also feeling some pressure over the limited timeframe we have left to craft this national broadband plan. I can only imagine what seeing that clock tick down must be like at the FCC.

Yet, I also can't ignore that we're really only just getting this process started. The FCC's admitted that they're in full fact-finding mode right now and that they will be for at least the next month or so. In other words, they don't even have all the facts on the table yet from which to make decisions. So we're a ways away from them establishing anything concrete in terms of what's going to go into the plan.

And on the upside, I've gotten a clear sense that they truly are open to any and all ideas. Basically they're at a point now where they're willing to listen to anyone who can make a coherent, data-driven, justifiable argument.

So in many ways I see the onus as resting not just on them; we all share the responsibility of crafting a national broadband plan.

Because of this I'd like to suggest we stop gnashing our teeth over perceived inaction by the FCC and put on hold our fears that this process has already been derailed so as to focus all of our energy and attention on developing our own arguments and building coalitions of support behind concrete ideas we want to see included in the national broadband plan.

Let's be patient with the FCC and give them a chance to do right by the country. That's not to say we don't hold them accountable, but instead to suggest that they've really only just begun this process and we should let it play out a bit longer before calling foul on them.

Because we can't afford to waste time lamenting when there's so much work to be done in pushing the intellectual ball forward to ensure we end up with the best possible national broadband plan.

Most People Don't Understand Broadband Or Bandwidth


A couple of days ago I was chatting with a friend when something he said stopped me in my tracks and made me realize something profound: most people don't understand broadband or bandwidth.

Our conversation dealt with plans for this weekend's fantasy football draft, which will be at his place in Minnesota. Given that we were going to have a dozen guys with laptops all wanting to get online at the same time, I asked him what his broadband situation was.

His first response, "What do you mean?"

So I continued, "How do you get onto the Internet?"

Now this friend has gone through periods of his life where the answer would've been by jumping onto any neighboring unsecured network, but I knew he was paying someone for access nowadays, so his answer was somewhat surprising: "Wireless."

I had to inquire further, "Do you mean the Wi-Fi network they're building in Minneapolis? A 3G provider like Verizon or AT&T? Or what?"

His response? "Comcast."

As I know Comcast isn't offering a wireless product in his area, it dawned on me that the "wireless" he was referencing was his Wi-Fi router. But he's actually getting his broadband through a cable modem.

By this point I was able to breathe a sigh of relief as pretty much all cable networks give you at least a few Mbps of downstream bandwidth. I was dreading the possibility of everyone trying to share a basic DSL line at less than 1Mbps.

But I had to ask further: "So what speed do you subscribe to?"

He paused and said, "I don't know. Whatever they give me."

I guffawed, gasped, and asked incredulously: "What do you mean you don't know how much bandwidth you're paying for?"

He says, "Give me some options to pick from."

In other words, he really had no idea.

What's so remarkable about this conversation is that most people would assume my friend would know this kind of stuff, that since he's under 30 he'd be hardwired for the Internet and completely comfortable with talking about it.

And to a degree he is. He's completely comfortable using the Internet, whether it be to watch videos, follow live sports scores, look up information, purchase things, or communicate with others. He's been using the Internet for more than a decade now, so he should be.

Yet, he's pretty much totally oblivious to this language of bandwidth, bits, and bytes. And he certainly doesn't know anything about bandwidth caps or traffic shaping.

What strikes me about this is the ramifications it holds for how much work there is to do in getting the public to understand the significance of the issues we're debating in broadband policy. That matters because it's the public we're all ultimately trying to help by encouraging the availability and adoption of broadband.

This friend of mine is at least in the top 60% of Americans in terms of understanding the value of broadband because he's a paying subscriber.

While I harped on him for not knowing how much broadband he has, he then asked the question: "Isn't there somewhere I can go to test my speed?" The fact that he knew to ask this combined with the fact that he does use the Internet a lot probably puts him upwards of the top 25-30% of households in terms of broadband awareness, especially when you factor in that many households only have one power user and multiple non- or low-users.

So if he doesn't know how much broadband he has, that probably means the vast majority of Americans are completely unaware of the concept of bandwidth and that they're paying different amounts for different levels of access to the Internet, other than the basic distinction between broadband and dialup.

The reason I found this conversation so staggering is that I hadn't realized just how far we have to go in raising the general public's awareness of the principles of broadband.

But it was also a good thing to have to come to grips with, as it's reshaping my perception of the gap in the American public's understanding of broadband and adding urgency to the need to overcome it.

It shows that when making broadband policy, we can't assume that the public knows what we're talking about, or even that they're acting as well-informed buyers in the broadband marketplace. For most people, broadband's just a faster way to get onto the Internet than dialup.

And because the public doesn't understand bandwidth and why they'd need/want more, it makes it that much more difficult to spark a nationwide movement demanding networks be built with greater capacity.

So a key cornerstone of any effort to change America's broadband future must be recognizing how much work needs to be done educating the public about broadband and committing ourselves to overcoming these challenges.

So the deadline for submitting applications for the first chunk of money from the broadband stimulus programs is basically today. NTIA and RUS have also announced that they're giving themselves a Sept 14th deadline for when they're going to unveil the finalists who made it through the first round of vetting. At this point they'll accept input from governors to help determine which projects in a state to fund.

But let's pause for a moment and consider something: is three weeks enough time to read, let alone vet, let alone weigh the relative merits of, what are almost certainly hundreds of applications? Because between today and Sept 14th that's basically all the time that's left.

I'm wondering this in particular because I've had a chance to review a couple of applications and they don't make for lightweight, breezy reading. It seems like par for the course for each application to be at least 50 pages. And therefore I think it's safe to say that it's going to take at least a couple of hours to initially just read through an application.

Now let's do a little math. I don't know how many applications NTIA has received, but I think it's safe to peg that number as north of 500. I say that because I've heard Calix tout that they alone have at least 100 of their customers they know to be applying, and those are just a portion of the fiber projects looking for funds, not even counting what's been reported to be a much larger number of wireless projects. So really the number could easily be 1,000 or more. Heck, I wouldn't be surprised if it ran up into the thousands given that the government's servers couldn't even handle the load of people rushing to submit their applications last week.

But let's just stick with 500 for now. So 500 applications at two hours apiece, just to read through once. Now we need to multiply that by three, as each application is supposed to be handled by three volunteer reviewers. With this in mind we're already at 3,000 man hours. And this number could be much higher given that my guesstimate of 500 total applications is likely woefully low. I could see this being 5,000-10,000 man hours just to handle the initial read-throughs if the volume of applications is as high as I think it is.

Then you've got to add on some amount of time to do some real vetting. I'm hoping that includes things like crunching any numbers that are given to verify that they add up, doing some research both online and on the phone into the deployers' respective histories and to confirm with the communities that they know about the projects and are supportive of them, and then doing some qualitative analysis about whether or not a project passes the smell test and that their financial model appears sound.

To do this right I'd think you'd need at least 5-10 hours. Anything less than that feels like you'd just be rubber stamping these applications without fully understanding what they're proposing. I'm going to give the process the benefit of the doubt in assuming that they're able to do this deeper research with just one person or through the coordination of multiple people, rather than each of the three reviewers having to do their own independent verification and analysis. Even still, we're talking about another 2,500-5,000 man hours, and again, if the total number of applications goes far beyond 500, this time commitment just increases, upwards of 5, 10, maybe even 20,000 hours.

I don't know if they're going to do this here as they may leave this up to the governors to decide, but at some point there does need to be a consideration and valuation of the relative merits of competing projects. I'll leave that out for now, though it will need to be factored into the calculus at some point.

So already we're up to a minimum of 5,000 man hours of work being required to do even the most basic of reviewing on the smallest number of applications I can imagine they've received. But I could easily see that number go as high as 30,000 man hours depending on how many applications they actually receive, how long it takes to actually review, and whether or not they do the deeper analysis just once or multiple times.

To put this into perspective, let's assume each reviewer has 60 hours in his or her week to work on this, which is likely high given that these are volunteers. So 180 hours total available per reviewer over the next three weeks that we have before the deadline. On the low end that could mean we'd only need 30 reviewers to get the job done. But on the high end we'd need closer to 200.
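For anyone who wants to check my math, here's a minimal sketch of the estimate using the same assumptions I've laid out above (500-1,000 applications, two hours per initial read, three readers each, 5-10 hours of deeper vetting done once per application, and roughly 180 volunteer hours per reviewer over three weeks). These are my guesses, not official NTIA figures:

```python
import math

def reviewer_estimate(applications, read_hours=2, readers=3,
                      vet_hours_low=5, vet_hours_high=10,
                      hours_per_reviewer=180):
    """Total man hours and reviewer headcount for one round of vetting."""
    reading = applications * read_hours * readers        # three initial read-throughs
    total_low = reading + applications * vet_hours_low   # deeper vetting done once
    total_high = reading + applications * vet_hours_high
    return (total_low, total_high,
            math.ceil(total_low / hours_per_reviewer),
            math.ceil(total_high / hours_per_reviewer))

for apps in (500, 1000):
    lo, hi, r_lo, r_hi = reviewer_estimate(apps)
    print(f"{apps} applications: {lo:,}-{hi:,} man hours, {r_lo}-{r_hi} reviewers")

# The ~30,000-hour / ~200-reviewer high end I mention above assumes even more
# applications, plus deeper vetting repeated by each of the three reviewers
# rather than done once per application.
```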

This then begs the question: are there 200 people out there who are willing and able to volunteer their time, who have the in-depth expertise in the technology and business of deploying both wireline and wireless networks needed to properly vet these applications, and whose ties to service providers, equipment makers, or others aren't so close as to bias their opinions about which networks should get funding?

Even if there are that many out there, has NTIA been able to find them all and get them committed to this process?

And perhaps even more important, what assurances do we have that these volunteers are qualified and unbiased? How do we know they're going to be capable of doing proper, thorough analysis of these applications?

The reason I bring all this up is because I'm worried that while I, as well as many others, grew quite frustrated at how long it took for government to put out the rules for what projects should qualify for funding, we now seem to be in a position of rushing through the next steps.

This is very worrisome as I think it's as important, if not more so, to pick the right projects as it is to get the rules right in the first place. And yet while it took six months to get the rules written, we're now looking at only three weeks to vet the projects.

Unfortunately, I'm not sure what to do at this point. I'm hesitant to suggest pushing the date back for when they announce finalists as that likely means pushing back even further when projects can start getting awarded funding. And yet I can't help worrying that this vetting process is not going to be as thorough as it needs to be to make sure we're giving money to the projects that deserve it most given the time constraints.

As it stands, I feel like we're stuck having to close our eyes, take the plunge, and hope that everything works out OK in the end. But given that we're playing with billions of dollars of our money as well as having a huge part of our broadband future at stake, this isn't a comfortable position to be in, and I can only hope that we can find a way to do this better for the next round of funding.

While this news was a surprise to nobody, I still found myself shaking my head about it: Comcast is suing over the FCC's ruling against their treatment of P2P traffic.

This isn't surprising as there's been an argument brewing for a while that the FCC doesn't have the authority to deal with specific issues related to network management. Also, it's not like incumbent broadband providers to take any government ruling against them lying down. So in many ways it seemed inevitable that this P2P decision was going to go to court.

And yet I can't help but wonder about the timing. I'll admit some level of ignorance about what it takes to prepare for a lawsuit like this, but given that this decision came out a year ago, that seems like an awfully long time to wait before filing a lawsuit.

Making matters worse, Congress introduced a new net neutrality bill just a couple of weeks ago. Worse for Comcast, that is.

I find it very strange that they've based this lawsuit on an argument that the FCC doesn't have the authority to regulate network management techniques at basically the same time as Congress gets another opportunity to pass a law that would specifically empower the FCC with these responsibilities.

If I were Comcast, I would've tried getting this lawsuit underway months ago, because now as it stands I can't help but feel like this lawsuit is bolstering the case for those in support of passing new net neutrality legislation.

I mean, don't they realize that by saying the FCC doesn't have the legal authority, they're basically telling Congress that it needs to take action to give the FCC that authority? And they're doing so at the exact moment that Congress has a new net neutrality bill in its hands to consider passing!

So does that mean Comcast wants Congress to pass a net neutrality bill? It sure looks like it given the substance and timing of their actions.

In fact, assuming Comcast doesn't really want Congress to pass a net neutrality bill, I'm not sure why they didn't just keep their mouth shut about all of this. It seems like it'd be better to just deal with the P2P traffic shaping limitations than potentially open up the Pandora's box of new wide-ranging net neutrality legislation.

But the opportunity to keep this quiet has passed. Pandora's box is now open. And what happens as a result of this is anyone's guess.

All I do know is that if Congress needed another push to justify creating a new net neutrality law that puts this issue within the FCC's purview, well Comcast just gave it to them with a bow on top.

Also, as a quick add-on, notice how the FCC's first significant net neutrality ruling has been taken to court. One of my biggest fears is that whatever new net neutrality legislation we pass or regulations we create will end up ultimately being shaped by lawsuits. This seems like a very inefficient way to make policy, though, as it's going to waste a lot of time and money while also introducing a lot of uncertainty into this process.

Because of this I think we all need to be thinking seriously about trying to find ways to push the net neutrality ball forward in such a way that can be palatable to incumbents. This doesn't mean watering legislation down, but instead doing everything we can to work out our differences before anything's set in stone up on the Hill. I can't see it as a victory to have net neutrality legislation passed that just ends up stuck in the courts for the next decade.

Of course, this may be inevitable too, but we should at least be aware of this and have this reality inform our decision making moving forward. While we may need to protect consumer interests from the monopolistic tendencies of some broadband providers, let's see if we can do so without having to battle it out in the courts.

Korea Frames America's Wireless vs. Wireline Choice


The more I learn about what South Korea's doing related to its national broadband strategy the more impressed, inspired, and intimidated I become.

Impressed by how long they've been focused on the many issues surrounding broadband deployment and adoption, and how diligent and thoughtful they've been in working through whatever problems they faced to find creative solutions.

Inspired by the example they set for the US about how much progress can be possible when government is ready, able, and willing to set bold goals and have the programs in place to help achieve them effectively and efficiently.

But also intimidated by how far ahead of us South Korea already is in the race to be global leaders in the new digital economy, and how that gap grows wider the longer it takes for us to get into a more forward-looking mindset that goes beyond trying to answer the basic question of how to get everyone online and using the Internet.

That being said, I'm also confident that as Americans we can tackle any challenge and achieve anything we set our minds to. And to make sure that our minds are set properly, it makes sense we should be listening to countries like South Korea that have clearly progressed beyond us.

This line of thinking was inspired by watching today's FCC workshop on international lessons that have been learned, at which Young Kyu Noh from the Korean Embassy spoke.

During the Q&A he laid out an argument that was stunning to me in its clarity and which, if adopted, would have a profound effect on our national broadband strategy.

He started by saying simply that taking a wireless-centric approach to broadband makes sense for developing countries that don't have any existing infrastructure to build off of and that need to use the most inexpensive option available.

But then he suggested that in developed countries there's a clear need for the capacity and reliability of wireline. To exemplify this, he shared how his kids complain about how slow the Internet is in the US, how in South Korea they're used to spending most of their time online watching videos and playing high resolution games, but that in the US they've had to limit their use of these bandwidth-intensive applications because our broadband networks are lacking compared to what they have on the other side of the Pacific.

I had never thought of it that way before: developing countries get wireless and developed countries get wireline, in much the same way developed countries have advanced electricity grids whereas developing countries often must get by with local generators.

This then led me to think about the impact the availability of cheap, reliable electricity has had on economic development in the 20th century industrial economy, and how having access to cheap, reliable bandwidth will create the same enabling force for economic development in the 21st century digital economy.

But then that leaves this thought: by relegating rural areas to a wireless-only future we're basically treating them as developing countries and robbing them of their ability to be competitive in the global digital economy.

That doesn't sit right with me, doesn't feel very American, like our can-do spirit has devolved into a can't-do malaise, where we strive to be good enough but not the best.

Mr. Noh made clear, though, that this is not just a rural issue. He also shared how by Korea's estimation there may not be enough economic incentive for broadband providers to upgrade their networks much past 10Mbps. That private operators are more likely to try and squeeze as much revenue out of their existing networks with only small incremental upgrades for the foreseeable future. So, if we want to have a broadband future that's competitive globally, we need government to step in with the vision and resources to push us beyond what the market alone will provide.

Yet that doesn't just mean government writing checks. Rather, Mr. Noh shared how South Korea created a certification for aspirational broadband networks that lived up to certain standards. With that certification came a government stamp of approval that apartment owners could use to attract tenants and developers could use to sell homes. It's been such a success that if your property doesn't have this certification, you'll be hard pressed to get anyone to buy or rent it.

That seems like a really smart way for government to start using market forces to the public's advantage in order to help drive investment in the next-generation wireline networks that we need.

The more I learn about what South Korea's doing, the more I think we have many lessons we can learn from their experiences that we can apply to formulating our own policies. Not that we should necessarily copy what they've done, but that we can combine their best practices with our own unique circumstances and talents so that we can do it even better than South Korea or anyone else ever has.

Why Aren't We Talking More About Price Per Mbps?


Bridging the gap between the hot topics of broadband availability and adoption is that of affordability. Advocates across the spectrum of opinions point to the need to not just have broadband that's available but also that's affordable, and that a key factor to driving adoption is cost.

But unfortunately, most of the conversations surrounding affordability only deal with the overall cost of service, which skews this discussion in a couple of ways.

For one, those that want to prioritize realistic goals over aspirational ones cite affordability as a key reason not to strive for America to take a quantum leap in its broadband capacity. They say that we shouldn't set goals like a 100Mbps Nation if it means mandating services be offered that aren't affordable for consumers. While I agree with this sentiment in general, I don't think we should discount the possibility that people are willing to pay for top-shelf service. Plus it doesn't do anything to address how we can enable a world where 100Mbps and beyond can be affordable, which is what I'm focused on.

Secondly, I don't think we can solve the adoption problem by just looking at how much broadband costs without any acknowledgment of how much service is delivered for that price. I have a hard time blaming someone in rural America who's unwilling to spend $50+ a month to get "broadband" that's less than 1Mbps. And I don't necessarily think that means that person is unwilling to spend $50+ if they got a bigger bang for their buck.

Put simply, I think it's impossible to identify consumer motivations about what's affordable by only looking at price. We also need to be taking into account what they're getting for what they're paying as consumers are often driven as much by value, perceived or actual, as the overall dollar amount.

That's why I think it's time we start focusing more of the debate over affordability on the metric of price per Mbps.

Let's consider a simple example: one rural community only has a 50% penetration rate, whereas one suburban community has an 80% uptake. In both communities you can buy broadband for $50 a month. So does this mean that rural users can't afford or aren't willing to pay the same as their suburban counterparts? Well let's take this a step further. In the suburban community they're getting 10Mbps for their $50, whereas the rural town only gets 1.5Mbps for their $50. Couldn't that disparity have an impact on adoption rates?
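Run the numbers on price per Mbps and the difference in value jumps out. Here's a trivial sketch of the hypothetical above:

```python
# Price per Mbps for the hypothetical communities above: same $50 sticker price,
# very different value delivered.
communities = {"suburban": (50.0, 10.0), "rural": (50.0, 1.5)}  # ($/month, Mbps)

for name, (price, mbps) in communities.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")
# suburban: $5.00 per Mbps
# rural: $33.33 per Mbps
```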

Beyond just divining what's holding some back from getting online, the price paid per Mbps is vitally important to the health and future outlook of our digital economy.

In this way we can think about bandwidth relative to the digital economy the same as electricity was to the industrial economy. In the 20th century, whoever had access to the cheapest electricity had the greatest opportunities to harness its power to innovate and drive new efficiencies. Having access to the cheapest bandwidth will have the same kind of impact on countries striving to be economic leaders in the 21st century.

But we should add some nuance here as the point again isn't just about getting the cheapest service. What good is an electric grid that can sell service at $1 a month but only deliver power an hour a day, or in an unreliable manner, or with a quota that if exceeded triggers huge overage fees? What's really needed is an electrical grid that can deliver the cheapest price per watt, just like broadband needs to be about the cheapest price per Mbps, not the cheapest overall cost of service.

Looking at affordability through this perspective also helps with better understanding the pros and cons of various broadband technologies. While some technologies may be cheaper to deploy, they invariably deliver less capacity and yet they don't necessarily charge less for service. On the other hand, other technologies cost more to deploy but they deliver a lot more capacity for what they charge.

For a great visualization of this, check out the graphs in this article.

The first shows the average price per Mbps of bandwidth on DSL, cable, and fiber networks. While DSL has been dropping, it's still at $15 per Mbps. Cable's holding steady at a bit over $5 per Mbps. And fiber's in the clear lead at about $1 per Mbps.

The second graph then shows the average speeds of these technologies, with DSL at about 5Mbps, cable at about 12Mbps, and fiber up over 60Mbps.

Now let's take this a step further and multiply the average cost per Mbps by the average capacity:

DSL - $15 per Mbps x 5Mbps = $75/month
Cable - $5 per Mbps x 12Mbps = $60/month
Fiber - $1 per Mbps x 60Mbps = $60/month
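Here's that same multiplication worked through as a quick sketch, using the approximate figures read off those graphs:

```python
# Effective monthly cost implied by the per-technology averages cited above
# (approximate figures taken from the graphs in the linked article).
tech = {
    "DSL":   {"price_per_mbps": 15, "avg_mbps": 5},
    "Cable": {"price_per_mbps": 5,  "avg_mbps": 12},
    "Fiber": {"price_per_mbps": 1,  "avg_mbps": 60},
}

for name, t in tech.items():
    monthly = t["price_per_mbps"] * t["avg_mbps"]
    print(f"{name}: ~${monthly}/month for ~{t['avg_mbps']} Mbps")
```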

Notice how, despite fiber offering at least five times more bandwidth than the second best technology, it's able to do so for roughly the same cost to consumers.

So what this is all showing us is that by looking at the issue of affordability based on price per Mbps and how much service consumers are actually getting we can begin to get a more granular understanding of how much value service providers are delivering.

By going beyond just looking at the overall cost of service, we can start to see how even though laying new broadband infrastructure like fiber may be more expensive to deploy than simply upgrading existing cable or telephone lines, the end result of doing so is that consumers get a whole lot more bandwidth for their buck.

And given the overarching importance of cheap bandwidth to enabling the establishment and growth of our digital economy, we need to make sure we're not making shortsighted decisions based on skewed perceptions of things like affordability that don't take all of the facts into account.

America needs as much bandwidth at as low a cost as we can possibly manage. But to fully understand the issue of affordability, we need to look beyond the overall cost of service and start considering the price per Mbps of broadband.

Hey FCC: Stop Ignoring Municipal Broadband!


The FCC's workshop on broadband deployment featured speakers from across the spectrum of broadband providers, except for one notable omission: they didn't have a single representative from a municipal broadband deployer.

To give them some credit they did include a couple of non-traditional providers who started as public non-profits and eventually evolved into private for-profit operations--namely Hiawatha Broadband Corporation and LARIAT. But in terms of having representation from a city that built its own broadband network, they had nothing.

While I'm not one who necessarily believes that municipal broadband is the ultimate solution to solving our country's broadband woes, to leave them out of the deployment discussion entirely seems irresponsible.

There are communities across the country that have found success building and operating their own broadband networks. Despite the caricature that municipal broadband invariably leads to boondoggles, that's just simply not the reality.

That's part of the reason why I think the FCC needed to include municipal representation on these panels. There's a lot of fear, uncertainty, and doubt that's built up around municipal broadband that the FCC needs to be addressing on a factual basis. By not including municipal broadband on these panels I couldn't help but wonder if either the FCC was buying into these falsehoods or if they just didn't think municipal broadband was a significant enough player to include.

Another problem with not including municipal broadband in these discussions is that it really seemed to skew the discussions that were had. Everything was focused on how private providers can financially justify buildouts and how government can aid in that process. Very little was said about how some areas may never be economically viable for private providers and therefore municipal broadband may make more sense. Also missing was any exploration of how some communities want more bandwidth more quickly than private providers are willing/able to provide, and what these communities can/should do in this situation. And on a related, more general note, what should we be doing if government can't incentivize sufficient private investment to enable broadband networks that live up to the needs of the public good?

The FCC has given an initial response regarding their decision to omit municipal broadband from these panels, but their only excuse is that there will be a workshop on local government issues coming up that will touch on municipal broadband as one of the options available to municipalities. But quite frankly that doesn't seem like enough attention.

For one, it still doesn't rectify the FCC's decision to leave municipal broadband out of discussions surrounding deployment. The only thing I can think of that may justify this decision is that the FCC didn't want their deployment workshops to get lost in the back-and-forth over whether it's fair for public networks to compete with private ones. But even then, I'd rather they just come out and say that than try to gloss over their reasoning.

The other reason I find this response inadequate is that municipal broadband isn't about local government so much as it is about deploying broadband. As we've got a massive challenge ahead of us to get America wired with 21st century broadband, we can't afford to leave any options off the table. And as we're going through the process of crafting a national broadband strategy, I think we'd be remiss if we didn't at least consider municipal broadband right alongside private providers as one of the tools at our disposal to overcome these challenges.

That said, in no way am I trying to say that municipal broadband is the right solution for all of America. Instead what I'm stressing is that it is a solution that has proven viable in many communities and therefore it deserves serious attention.

Because of this, if the FCC's serious about their claims that they're pursuing a fact-based approach towards getting a holistic understanding of what's happening in the broadband marketplace, then they can't afford to marginalize municipal broadband.

Instead of trying to squeeze it into a workshop on local government issues, they should be breaking it out and holding an entire workshop on just the issues surrounding municipal broadband so that we can really drill down into the many opportunities and challenges that it presents and get a better understanding of when it works and when it doesn't. Also, by doing this we can work on identifying opportunities where municipal broadband can be synergistic with private providers rather than only being seen as competitive.

So my message to the FCC today is simple: stop ignoring municipal broadband and start taking it seriously!

Hopefully you already are and both myself and others are reading too much into your decision not to include a single representative from a municipal broadband network on these deployment panels. But we can't be reassured of that until we start seeing the FCC engage with this topic directly. And so far as it relates to these workshops, that hasn't happened.

I have to admit suffering from a fair amount of trepidation heading into yesterday's FCC workshop on broadband deployment. As an advocate for fiber, I worried that we'd end up seeing another discussion where the desire to be technology neutral obfuscates a more specific, concrete discussion about what our nation's broadband infrastructure needs.

So needless to say, I was pleased as punch when what happened instead was a conversation dominated by fiber.

Everyone on the day's first panel talked about how more fiber's a good thing, how having more fiber enables better broadband, and how we need a national plan to get more fiber everywhere.

One constant refrain that bled into the following session on wireless deployment is the need for more, better access to middle mile and backhaul networks. Thankfully, these panels didn't beat around the bush and instead declared fiber the technology of choice to power these networks, except in limited scenarios where microwave's a more appropriate choice.

There were clear calls for the need to get fiber laid to every cell tower, and how, if we want to realize the full benefits of next generation wireless technologies like 4G, T1s for backhaul won't cut it. So if we want better wireless, we need more fiber.

It was also interesting to hear how every wireline technology today touts how it's powered by fiber. Cable guys stress their hybrid-fiber coax networks, with the representative from Cox going so far as to boast that "if you look at the percent of time a bit travels on our network, over 80% is on fiber." Telcos not doing full fiber networks tout their fiber-to-the-node technology, choosing to reference fiber more often than DSL in describing what they offer. So everyone seems to now be acknowledging that the key to our wireline broadband future is getting fiber closer and closer to end users.

Finally, while they weren't able to address the question I posed on this subject directly, I do think there's a growing awareness around the fact that the bandwidth demands of larger institutions necessitates the use of fiber. We can't afford to leave schools, hospitals, businesses, and other institutions stuck paying hundreds of dollars a month for T1s; they need fiber to support lots of simultaneous usage.

What this all leads me to is a new hope that we're going to strip away the fog of technology neutrality and embrace the idea that focusing on fiber has to be at the core of our national broadband strategy. Even if we can't get agreement on whether or not every last home needs a fiber pipe laid all the way to its front door, we should be able to all agree that we need more fiber in middle mile and institutional networks, and that as a core piece of our national broadband plan we also need to be crafting a national fiber plan that can help coordinate the deployment of a nationwide fiber network.

Because the simple truth of the matter is that without a plan to get more fiber available to more people at more affordable prices, we have no future as a global broadband leader. The future of broadband will be powered by fiber.

The debate over net neutrality is set to roar back onto center stage after the recent introduction of a new bill that mandates broadband networks remain open.

But despite the fact that I support the notion of protecting consumers from the anti-competitive practices of monopolistic broadband providers, I think we haven't been addressing one of the core challenges of making net neutrality a reality, namely how do we balance our desire to see more money invested in broadband capacity with private providers' need to turn a profit on that investment?

Unfortunately, too often this argument is completely dismissed by net neutrality supporters. When the industry says that overreaching net neutrality rules will dissuade them from investing they're accused of greedily putting their hunger for private profit over the public good. They're branded as evil, and these claims are ignored.

Yet, let's consider them with a fresh set of eyes. Let's not assume anything and break this argument down a bit further.

First off, we must acknowledge that there's a finite amount of capacity in any broadband network.

Secondly, there's a finite amount of money any private provider is able and willing to invest in upgrading their capacity, and to justify any investment they need to realize a return on those dollars.

Third, broadband networks have two primary purposes: offering access to the public Internet, and supporting the delivery of private, managed services like TV and phone.

Fourth, private, managed services can offer higher profits to operators than just offering bandwidth.

Fifth, we want private providers to maximize their investment in the capacity of their networks.

Sixth, we want to make sure that as much of that capacity is going towards open bandwidth that grants unfettered access to the public Internet as possible.

With these basic ideas on the table, let's do some further analysis.

There are many net neutrality supporters who believe that broadband should just be a big dumb pipe. That there's little need for private, managed services to ensure quality of service. And that there's more harm than good to come out of allowing incumbents to offer prioritized access to their networks.

The challenge with this attitude is this: how can we ask incumbents to invest billions more in upgrading their network capacity at the same time we're limiting how much money they can make off that investment? I don't see how we can spur the private sector to put up the massive investment that's needed to continue upgrading our nation's broadband infrastructure if we limit the broadband business to selling bandwidth. The returns are just too low to justify the big money that needs to be sunk into networks.

But many net neutrality supporters have come to realize this, as evidenced by the fact that this latest net neutrality bill allows for private providers to offer private, managed services alongside broadband. I'm supportive of this idea not only because it retains greater economic incentives for private providers to invest, but also because I think there's a lot of innovation that's made possible in networked applications once we're able to do things like guarantee quality of service, which is currently impossible on the open Internet.

Unfortunately, by doing this we may be creating another problem, namely how do we make sure that private providers don't spend all their money upgrading the capacity of their networks for these private, managed services instead of offering more bandwidth to the public Internet?

This isn't an easy question to answer as private, for-profit businesses want to invest their money in the areas where they'll get the highest return. So if they're going to make more money off private, managed services, then what's their incentive to invest in the part of their network that turns them into a dumb pipe?

In some areas competition from multiple broadband providers has been enough to drive investment in increasing open bandwidth. But that dynamo of competitive investment isn't working everywhere.

At the same time, we can't ignore the reality that without competition driving this investment, private providers actually have a disincentive to invest in open bandwidth as the more open bandwidth that's available the more opportunity over-the-top services have to compete with the private provider's own managed services.

This most recent net neutrality bill does at least touch upon most of these issues, but its only solutions for ensuring private providers are investing in their capacity to offer access to the public Internet are vague calls for providers to promise to invest in delivering as much bandwidth as possible to their subscribers.

At the same time, while it does allow for the possibility of private, managed services, it specifically bans the practice of speeding up some Internet traffic, which gets to the heart of what private providers want to be doing as a way to drive sufficient revenue to justify their investments in new capacity. Again, while I support the notion of preventing providers from slowing down traffic, I'm not quite sure why we're banning them from selectively speeding some up, especially when that marketplace hasn't even been given a chance to work yet as no one's really doing any of this. I fear policy's getting ahead of the market and technology on this issue, which is rarely a good thing.

So while it looks like we're starting to get to the point where we've at least got the issues that matter on the table, it doesn't seem like we're doing enough to acknowledge and address the basic realities of how any profit-driven enterprise works.

At the same time, in the language of bills like this latest net neutrality bill, we're not doing enough to precisely define goals and set strategies for achieving them; instead we're wasting time merely wishing for good things to happen.

But I believe the time has come for change, for us to stop demonizing private providers for making decisions driven by the desire to generate profit--like it or not, that's their nature--and for us to stop writing legislation that's imprecise and more wishful thinking than specific policies.

Notice that I'm not saying we don't need net neutrality legislation. Nor am I saying we shouldn't be holding private providers accountable for how well (or poorly) they're serving America's communications needs.

Instead what I'm suggesting is that if we're going to continue down the path of relying primarily on a market-driven approach to private providers deploying and operating our country's broadband networks, then we need to be crafting policies that are informed by how they will impact the profit motives of these private providers. And then we need to find policies that reward the kind of good behavior we want to see happen with opportunities to generate higher profits.

Because as it stands right now, I'm worried that this net neutrality bill is taking away profit motive rather than adding to it, while at the same time not doing enough to protect the public's interest by ensuring that a substantial portion of that investment goes toward increasing our nation's open bandwidth.

So despite the many hours of hard work that I'm sure went into crafting this latest attempt at net neutrality legislation, I'm afraid that we're currently set up to achieve a lose-lose situation.

That doesn't mean we should scrap the thing entirely, though. Later on this week I'm going to give some suggestions for how we could consider tweaking this to make sure that we're protecting both the public's interests as well as the profit motives of private corporations so we can ensure they continue investing in our nation's broadband infrastructure.

When Is A Community Fully Served By Broadband?


One of the hottest topics related to the broadband stimulus is how to define what it means for a community to be underserved.

But there's been a huge piece missing from that conversation as we have yet to address the question of: when is a community fully served by broadband? When can we say that a community has enough and doesn't need government intervention to get more connected?

Based on the government's definition of underserved, a community's served if half of it can get terrestrial facilities-based broadband at 768Kbps down and 200Kbps up or if any wireless provider advertises service at 3Mbps down.

Apparently that's what it means to be fully served by broadband in today's America.
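Just to make the rule concrete, here's a minimal sketch of that definition as I've paraphrased it above. This is not the full NTIA/RUS rule text, and the parameter names are mine:

```python
def is_served(pct_with_terrestrial_768k_200k, max_advertised_wireless_down_mbps):
    """Sketch of the 'served' test as paraphrased above: at least half the
    community can get terrestrial facilities-based broadband at 768 Kbps down /
    200 Kbps up, or some wireless provider advertises at least 3 Mbps down."""
    return (pct_with_terrestrial_768k_200k >= 50
            or max_advertised_wireless_down_mbps >= 3)

print(is_served(60, 0))    # True: over half can get 768/200 terrestrial service
print(is_served(40, 3))    # True: a wireless provider advertises 3 Mbps down
print(is_served(40, 1.5))  # False: underserved by this definition
```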

But I have a different way of looking at it. Let's break this down a bit.

The first thing to do is acknowledge that, while politically appealing, it's imprecise to base whether a community's fully served simply on how much bandwidth is advertised as available. We need to be technology-aware in determining who's being fully served, especially if the goal is to have a robust marketplace of facilities-based competition.

So let's take a look at how you can get broadband to a home today.

There are three pipes going into most homes that can potentially be used to deliver broadband: cable TV, telephone, and electricity. There's a fourth pipe that can be installed in the form of fiber. And there's a variety of different wireless technologies that either deliver broadband today or could deliver it tomorrow. Plus there's satellite.

That's it. These are all the ways to get broadband.

Now, one could argue a community's not fully served until it has all of these options available to it. There are some communities on the verge of achieving this level of hyperconnectivity, like Lafayette, LA, which already has DOCSIS 3.0 cable available, where the local utility's in the process of building a world-class full fiber network, where there are rumors AT&T's building out its U-Verse VDSL service, and where there is some 3G coverage. The only things missing are BPL (which frankly doesn't look viable as a competitive broadband technology) and more flavors of wireless, which may come in the future.

I would argue a community like Lafayette, LA is fully served by broadband, or at least well on its way to being so.

A similar trend is seen in other places where full fiber networks are being deployed, like anywhere Verizon has FiOS, which are inevitably the communities that are ground zero for cable companies' DOCSIS 3.0 deployments.

And yet on the flip side, those communities not yet getting fiber tend to be stuck with last generation DSL and cable, often with only one provider able to offer 10Mbps down and no one offering more than 2Mbps up.

These communities don't have enough capacity to use all that today's Internet has to offer, nor do they have enough competition to drive the engine of continuous investment and innovation between competitors. Because of this I can't consider them to be fully served.

But if you look back at the government's definition, they're not underserved. So what are they?

It's questions like this that are brought to the forefront when we start with the question of when is a community fully served and work backwards rather than attempting to define "underserved" in a vacuum.

One last comment on this issue for now: I do think we need to have a flexible definition of what fully served means, as we should acknowledge that rural areas will likely never have multiple facilities-based broadband providers since there isn't a high enough density of subscribers to justify building out multiple networks.

But rather than being resigned to rural America becoming a broadband monopoly, I'm intrigued by the possibilities of open fiber networks that not only bring world-class infrastructure to all Americans but also preserve competition by allowing multiple service providers to compete on the same network.

I'd argue a rural community's fully served once it has an open fiber network in place.

If we are to craft an effective national broadband strategy, we have to have some sense of what we're shooting for, what our goals should be. We can't afford to muddle around only worrying about who's not served. We also need a clear definition of when a home is fully served, and then base our actions on plans that can help all homes reach that status.

Because only then can we ensure that all American homes are fully served by broadband.

Death To T1s (And Maybe DSL, Too)

| No Comments | No TrackBacks

I've been meaning to say this for a while, so here it goes: Death to T1s!

Before going any further let me pause to give them some credit. They've been a great way to get reliable access to the Internet for years.

But their time must come to an end ASAP. Quite simply: we have too many people paying too much for T1s that deliver too little bandwidth all across this country.

I cannot tell you how many times I've come to find entire hotels running off of a single T1, schools getting by with only a couple of T1s for hundreds of students, government buildings that are lucky to even have T1s.

I have a hard time fathoming how anyone can do anything with this little bandwidth. A single T1 delivers only 1.5Mbps of bandwidth. While this is guaranteed bandwidth, unlike cable or wireless networks, it simply isn't enough capacity to support the dozens of people in a typical school who may be trying to access the Internet at the same time. And I can tell you from experience that I'm not sure any number of T1s is sufficient to handle a hotel's traffic once everyone's back in their rooms at night, trying to check their email before going to bed.
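The arithmetic here is worth spelling out. Treating a T1's 1.5Mbps as a shared pool, a quick back-of-the-envelope calculation (the user counts below are purely illustrative) shows how little each person actually gets:

```python
# Back-of-the-envelope math on sharing a single T1; user counts are illustrative.
T1_KBPS = 1544  # a T1 carries roughly 1.5Mbps

for label, users in [("school computer lab", 30), ("hotel at 10pm", 120)]:
    per_user_kbps = T1_KBPS / users
    print(f"{label}: {users} users on one T1 = ~{per_user_kbps:.0f}Kbps each")

# school computer lab: 30 users on one T1 = ~51Kbps each
# hotel at 10pm: 120 users on one T1 = ~13Kbps each
```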

But what's even more flabbergasting about the prevalence of T1s is how expensive they are. Businesses and hospitals are paying hundreds of dollars a month for a single T1 line. A big part of the reason why they don't have more bandwidth is that they can't afford to pay even more money just to add bandwidth 1.5Mbps at a time.

T1s were great a few years ago, but the fact that they're the only business-class service available to many buildings is a clear sign of the inadequacy of America's broadband infrastructure.

Every building that houses any number of people needs access to a fiber line, which can deliver not only vastly more capacity but often at roughly the same price as they pay now for T1s. I've heard some of my fiber-providing friends offering 100Mbps service at prices similar to what incumbent providers charge today for a few T1s that together offer less than 10Mbps.
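To see why that matters, consider a rough cost-per-megabit comparison. The dollar figures below are hypothetical placeholders of my own, not quotes from any actual provider, but they illustrate the gap I'm describing:

```python
# Hypothetical cost-per-Mbps comparison; all prices are illustrative, not quotes.
t1_monthly_price = 400.0      # assumed monthly price for one 1.5Mbps T1
t1_count = 4                  # a "few" bonded T1s, roughly 6Mbps total
fiber_monthly_price = 1600.0  # assumed monthly price for 100Mbps over fiber

t1_cost_per_mbps = (t1_monthly_price * t1_count) / (1.5 * t1_count)
fiber_cost_per_mbps = fiber_monthly_price / 100.0

print(f"T1s:   ~${t1_cost_per_mbps:.0f} per Mbps per month")    # ~$267
print(f"Fiber: ~${fiber_cost_per_mbps:.0f} per Mbps per month")  # ~$16
```

Even with generous assumptions on the T1 side, the per-megabit gap works out to an order of magnitude or more.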

It's time to take our heads out of the sand and recognize that when it comes to supporting the bandwidth needs of large buildings, we need fiber to be ubiquitously available. T1s just aren't up to the task of supporting the demands of the 21st century digital economy.

Which brings me to DSL. I'm a little more cautious about calling for the death of DSL, as I fully acknowledge that upgraded DSL networks today are delivering the kind of bandwidth most users need. And there is the potential for them to evolve further to deliver speeds of 100Mbps, though whether they'll be able to support that much throughput to all users or just those located geographically close to the central office is another matter entirely.

But at the same time, I can't help but be frustrated by DSL's impact on discussions regarding our national broadband policy, namely how many people want to limit the bandwidth goals we set to numbers that DSL can reasonably manage.

If you have a thorough conversation about how much bandwidth a typical house of the near future will need, the numbers quickly add up to at least 100Mbps.
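For anyone skeptical of that figure, here's the kind of rough tally I have in mind. The applications and per-use numbers below are my own illustrative assumptions about a near-future household, not measurements:

```python
# Rough tally of a near-future household's simultaneous bandwidth needs.
# Application names and per-use figures are illustrative assumptions.
household_demand_mbps = {
    "two HD video streams": 2 * 20,
    "two-way HD video calling": 10,
    "online gaming and game downloads": 15,
    "cloud backup / file sync": 10,
    "web, email, and everything else": 10,
}

total_mbps = sum(household_demand_mbps.values())
print(f"Peak simultaneous demand: ~{total_mbps}Mbps")  # ~85Mbps before any headroom
```

Add headroom for growth and for applications nobody has dreamed up yet, and 100Mbps starts to look like a conservative target.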

But when DSL defenders talk about how much bandwidth we can reasonably get to most people, they don't think universal 100Mbps is realistic.

Don't get me wrong, I don't blame them for this. How can they be asked to support something that makes their technology irrelevant?

Yet at the same time, I don't see how we can allow ourselves to limit goals that are supposed to be in the public's best interest just to protect a technology that most everyone I know, even some DSL providers, acknowledges will be inadequate in the not-too-distant future. That doesn't make any sense to me.

That said, I wouldn't go so far as to call for the death of DSL, as it does have a lot of utility to provide for the foreseeable future. Instead, what I'd ask is that if policymakers want to remain technology neutral, then let's set goals that are truly technology neutral: benchmarks based on the public's interest, not on protecting DSL's relevancy.

By doing so, perhaps we can spur further evolution of DSL technology to support universal symmetrical 100Mbps across its entire footprint. Just because we set a more aspirational goal doesn't mean we're ruling out technologies that are currently inadequate. We're just clearly stating what level of service the public needs and what these technologies should be striving to achieve.

So let us all say together, "Death to T1s! And death to DSL's current technological limitations as the benchmark for our national broadband goals!"

In conversations I was having last week, something dawned on me: every single community that considers itself unserved or underserved should be working to aggregate demand for broadband among its citizens immediately.

The reasons for doing so are manifold.

For one, it's important to acknowledge that un/underserved communities likely won't have a competitive broadband marketplace, where multiple entities operating multiple last-mile facilities compete for business. If a market hasn't been able to support or attract a single broadband provider, then it likely won't be able to support the deployment of multiple networks.

And in fact I'd argue that it doesn't make much sense for us to waste money spurring the deployment of multiple networks in a market that can't support them. Instead it's better to focus on bringing them one network that can live up to their needs.

But in order for any network to be financially sustainable, it needs sufficient demand for the services it delivers, which is why I'm advocating that all un/underserved communities start aggregating demand immediately.

The best part about doing this is that it can set the stage for encouraging all models of deployment.

Being able to say that a community has X number of people willing to spend Y amount of money on broadband can be enough to entice a private deployer to invest in building out its network to that community.

Or that same aggregated demand can be used as the basis for constructing a business model for a municipal broadband deployment, or it can lead to other models like co-ops.

On the flipside, if a community hasn't aggregated its demand, then it's unlikely a private provider will ever show up to save the day and it's less likely a municipal broadband project will succeed without all local anchor tenants on board with the project.

Another great aspect of aggregating demand is that it's something that can be started today. You don't have to wait to raise tons of funds, draw up big plans, or put together all the pieces needed to start on a deployment. Aggregating demand can be as simple as creating an inventory of every entity in a community and calling them up to gauge their demand for broadband and whether it's being met. With a handful of volunteers the process can get started.

And while the overall process should be more complex than this -- including things like gathering how much people are willing to pay for connectivity, how much bandwidth they need, and working on educating them as to why they need more -- the basic principle is pretty straightforward and is something that everyone could be working on right now rather than waiting around to see if the magical stimulus fairy rains money down on them.
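For what it's worth, the tallying itself is trivial once the calls have been made. Here's a minimal sketch of what the aggregation might look like; the survey fields and sample entries are made up for illustration:

```python
# Minimal sketch of tallying a community broadband demand survey.
# Survey fields and sample entries are made up for illustration.
survey_responses = [
    {"entity": "Main St. Cafe",   "willing_to_pay": 80,  "needed_mbps": 25,  "needs_met": False},
    {"entity": "County Library",  "willing_to_pay": 300, "needed_mbps": 100, "needs_met": False},
    {"entity": "Smith household", "willing_to_pay": 50,  "needed_mbps": 20,  "needs_met": True},
]

unmet = [r for r in survey_responses if not r["needs_met"]]
monthly_revenue = sum(r["willing_to_pay"] for r in unmet)
aggregate_mbps = sum(r["needed_mbps"] for r in unmet)

print(f"{len(unmet)} entities with unmet demand, "
      f"${monthly_revenue}/month in potential revenue, "
      f"~{aggregate_mbps}Mbps of aggregate demand")
```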

So regardless of how your community ultimately gets wired, there's stuff you can be doing now to improve your chances, like aggregating local demand for broadband.

Uh oh, I may be making a mountain out of a molehill, but I can't help worrying that there's a significant disconnect between the FCC and Congress on one of the most important issues related to broadband policy.

Last Thursday the FCC released its schedule for a series of intensive workshops intended to bring FCC staffers and the public at large up to speed on a wide range of broadband-related issues.

Last Friday Congress introduced its latest attempt at crafting a net neutrality bill (more in-depth thoughts on this to come).

And yet, what's missing from the FCC's roster of workshops? Any particular emphasis on net neutrality and/or the related topics surrounding open networks.

First off, I can't understand why the FCC doesn't seem to have devoted any of its workshops to what is arguably the most controversial issue in the world of broadband policy.

Secondly, this omission makes even less sense in light of Congress introducing a new bill. It makes me wonder if the FCC knew this bill was brewing. If they didn't know, then why not? I'd think Congress would want to at least consult the FCC's expertise on matters like this. If they did know, then why aren't they devoting any time to analyzing the issue?

Now I fully recognize that this may not be an oversight but instead could simply be proper protocol. I know that there are some limits as to what the FCC can do in terms of trying to lobby Congress.

But at the same time, a frank discussion about how open broadband networks should be is so crucial to the overall broadband policy debate, touching as it does on so many other issues, that I don't see how we can afford to ignore it.

And despite whatever longstanding limitations exist on how the FCC and Congress can interact, I think the time has come to start taking a more holistic approach to crafting broadband policy. We can no longer afford to have Congress crafting policies it doesn't fully understand and then punting them to the relevant agencies in the hopes they're able to figure things out on their own.

Instead, we should be engaging all affected parties around the same table as we craft policies surrounding net neutrality, so that we can arrive at the best possible policies: ones that protect the public's interest without trampling on the ability of private industry to manage their networks in the real world and make money doing it.

Now, I'm not yet passing judgment in this article on this particular piece of net neutrality legislation (that's coming in a later post), but I do have to call out Congress and the FCC and ask that they try to get on the same page.

To Congress, if somehow the FCC didn't know you were working on this bill then I'd encourage you to do more to make them aware and seek out their guidance in the future.

And to the FCC, I'm sure you all want your workshops to be as productive as possible and may be worried about taking on the most complicated issues like net neutrality, but we can't put our heads in the sand about this topic, especially not at a time when specific legislation is about to be considered. So let's add a workshop about openness in networks and take this contentious broadband policy issue by the horns.
