July 2009 Archives

I've been meaning to take a moment to shine a light on the recent story of AT&T blocking 4chan.org, a story that highlights how absurd some elements of the debate on net neutrality have gotten.

As a quick background, last weekend AT&T was caught blocking its customers from accessing a large portion of the site 4chan.org.

4chan.org is generically referred to as an imageboard website. It's barebones and offers users the ability to see, post, and comment on images of all shapes and sizes. It's also been referred to as the Internet's Sodom and Gomorrah as there tend to be no rules or limits as to the inappropriateness of what's posted. (Seriously, don't go to the site if you're squeamish, easily offended, or on a computer at work.)

When the news first broke that AT&T was blocking the site from its users, there was an instantaneous negative reaction across the web, crying foul over this "obvious" violation of net neutrality. Everyone assumed that AT&T was abusing its position as network operator to clamp down on free speech, to silence the voices of people they didn't want to hear from. To be honest, I assumed the same at first and shook my head in bewilderment that AT&T would make the mistake of poking this hornet's nest of Internet jokers; this situation didn't seem destined to end well.

And yet, that wasn't at all what happened. Turns out, AT&T wasn't alone in blocking this site, and they didn't do it to flex their muscles. Instead, the truth was that a distributed denial of service attack was being launched from 4chan's IP address, and network operators like AT&T decided to cut it off from the rest of their networks to protect their customers from being negatively affected by it.

I find this reality interesting on a couple of levels.

One, it's remarkable how quickly net neutrality supporters were willing to assume that AT&T's actions were evil without much, if any, critical analysis. Don't get me wrong, I'm completely aware of AT&T's history and can see why some might think that AT&T's motto is the inverse of Google's "Don't be evil" mantra. But at the same time I think it's absolutely the wrong thing to immediately assume any company is guilty of wrongdoing, as that attitude makes it difficult to see and acknowledge when they do something good. Even worse, if I were AT&T and everyone thought everything I did was evil and no one gave me credit for anything good, at some point I might just stop trying to be good. I mean, why make any effort to do good if not only am I not going to get any credit but everyone is also going to assume my good deed has an evil intent?

I think we have to be careful about allowing ourselves to paint any company with too broad a brush, to turn them into caricatures where we see evil in everything they do. If we want to encourage good corporate citizenship, we should celebrate when big corporations do the right thing and give them the benefit of the doubt otherwise, at least initially, because otherwise we risk looking overzealous and overly critical, like we're singling out one company. I fear that by doing this we lessen the power of the moments when we do have specific, valid points of criticism.

But back to AT&T's treatment of 4chan. The other level on which I find this interesting is that while the immediate reaction to this event was to decry it as an example of why we need net neutrality, ultimately I think it's an example of why we have to be careful about pursuing a net neutrality regime that's too far-reaching. Do we really want legislation in place that prevents AT&T from protecting its customers from a denial of service attack? Because if we go too far in mandating the opening up of networks, that could be an unintended consequence of net neutrality. This example highlights for me some of the value inherent in having smart networks that are actively managed.

My overall thoughts on net neutrality are ideology free. I want open networks and I want smart networks. I want to make sure consumers are protected both from the selfish profit-driven motives of big corporations as well as from the bad actors that pollute the Internet with attacks like this one that AT&T stood up to protect its customers from.

But in order to achieve policies that can protect the openness of the Internet and the rights of both consumers and corporations, we need to make sure we don't muddy up our discussions with any assumptions about any specific companies. Doing so doesn't help us make any progress as it turns our conversations less precise and encourages the rift that exists between the two sides of this controversial issue to drift further apart.

The debate surrounding what bandwidth goals should be set to guide the creation of America's broadband policies suffers from a gaping chasm between arguments for aspirational vs. "realistic" benchmarks.

Those in the "realistic" category often limit themselves to goals as low as 25Mbps, arguing that going any higher potentially rules out technologies like DSL, which may be easier/cheaper/faster to deploy.

On the other hand, those advocating for the most aspirational goals seem to have no end to the amount of bandwidth they think we need, with some calling for speeds beyond 1Gbps heading upwards of 100Gbps, which while seemingly way out there are actually the speeds being deployed today on our nation's fastest research networks.

Then alongside all this is the argument that says setting goals based on bandwidth reflects too much of an obsession with speed, that what really matters more than the capacity of a network is what it's being used for, and that goals should be based on usage. While I agree with this sentiment in general, I do think there's good reason to set bandwidth goals, as the amount of capacity you have in large part determines what you're able to do with these networks.

Also worth noting is the argument that we need to take a more expansive approach towards setting goals that incorporates other metrics like latency and jitter. I absolutely agree with these notions, but this post is focused on tackling the challenge of setting goals related to bandwidth.

With this range of issues on the table, let's start by looking at some of the considerations that must be made in setting proper bandwidth goals:

Timeframe: New broadband networks should be thought of as infrastructure and therefore the long view needs to be taken about whether or not they're capable of keeping up with future demand. And yet government doesn't seem able or willing to think about the ramifications of its decisions beyond a few years into the future. Because of this, I think we should start with setting a goal for where we want to be in 2015. That's far enough in the future that it suggests we'll be using a very different Internet with very different bandwidth demands from the one we know today, and yet not so far into the future as to stretch beyond the horizon of policymakers' ability to consider them seriously.

Aspirational: A common refrain heard in these discussions is that we can never know what the future will bring, especially when it comes to the growth of the Internet. Taking this a step further, the Internet always seems to outgrow even the most ambitious projections. Ten years ago no one thought we'd be using as much bandwidth as we do today. So while we do need to be realistic in the goals we set, we should also not shrink from being aspirational, as we always underestimate what the Internet will do.

Application-centric: We absolutely need to be able to justify any bandwidth goals we set with an explanation for why that much bandwidth will be needed; otherwise this is all likely to be considered more of an academic exercise than a realistic attempt to quantify where our national broadband policies need to have us headed.

Globally Aware: If the point of having a national broadband strategy is to set America on a path to be globally competitive, to remain leaders in the development of the next generation of the Internet, then our goals must be informed by what other nations are doing, otherwise we may set ourselves up to be permanently trailing the rest of the world.

After taking all these factors into consideration, I think any discussion of what bandwidth goal we should be setting for America to achieve by 2015 has to start at 100Mbps.

The reasons for this stem from the categories listed above:

Application-centric: Earlier this year I proposed that we set the goal of having broadband capable of supporting a two-way HD video stream for every American, arguing that video applications are what's driving demand for bandwidth, that there's a host of uses for video from healthcare to education to business and beyond, and that we need networks robust enough to support lots of simultaneous usage. Turning back to the discussion of bandwidth goals, a true HD-quality video stream is encoded at 15-25Mbps today. More compressed video at lower bitrates may claim to be HD but isn't, and uncompressed HD can get up into the Gbps. Therefore, a family of four needs at least 60Mbps, and upwards of 100Mbps, just so that everyone in the house can use one HD video application at the same time. Plus there are a host of other simultaneous uses of broadband that put further demands on the network. So if you look at what the world will look like in 2015, where the use of HD video apps will be the norm, we need at least 100Mbps to every home to support the delivery of these applications.
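The household arithmetic above can be sketched as a quick back-of-the-envelope calculation. The per-stream encode rates come from the post itself; the four-viewer household and the 10Mbps of non-video headroom are my own illustrative assumptions:

```python
# Back-of-the-envelope check of the household bandwidth argument.
# Assumptions (mine): 4 simultaneous viewers, plus ~10Mbps of
# non-video traffic (web, VoIP, backups) running alongside.
HD_STREAM_MBPS = (15, 25)   # low/high HD encode rates cited in the post
VIEWERS = 4
OTHER_TRAFFIC_MBPS = 10     # assumed headroom for non-video use

low = HD_STREAM_MBPS[0] * VIEWERS + OTHER_TRAFFIC_MBPS
high = HD_STREAM_MBPS[1] * VIEWERS + OTHER_TRAFFIC_MBPS
print(f"Household needs roughly {low}-{high} Mbps")
# prints: Household needs roughly 70-110 Mbps
```

Even with conservative inputs, the total lands in the neighborhood of the 100Mbps goal the post argues for.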

(As a quick aside, some papers have done more in-depth research into the topic of future demands for bandwidth but have come up with puzzling results. Take ITIF's Need for Speed report, which laid out a realistic vision for how a home of the near future could be using next-generation connectivity, concluding that: "This home would easily consume more than 90Mbps of aggregate bandwidth [both directions]." And yet when it came to recommending optimal speeds for bandwidth goals, it set the bar much lower at 50Mbps down and 10Mbps up. Unfortunately this appears to be a case of prioritizing a "realistic" approach to what's possible in terms of broadband supply over acknowledging realistic analysis of how high broadband demand will grow.)

Globally Aware: While some countries like the UK are setting more "realistic" goals like universal 2Mbps service by 2012, those nations that are considered broadband leaders have set the bar at 100Mbps and are already beginning to go past that, like South Korea, which has charged itself with the goal of becoming a 1Gbps Nation by 2012. Because of this I'll be brutally honest in saying that if we set our goal at anything less than 100Mbps, I will consider that a national embarrassment. I just don't see how we can consider ourselves serious about being globally competitive with goals lower than 100Mbps, regardless of how unrealistic some may consider achieving them to be.

Aspirational: Now some might wonder why I'm not pushing for a goal of 1Gbps by 2015, but my reasoning is simple. Networks capable of delivering 100Mbps by 2015 are likely also going to be able to scale to 1Gbps relatively easily, so if we can achieve a 100Mbps Nation then we should be well on our way to achieving a 1Gbps Nation. Also, in the conversations I've had with all but the most aspirational thinkers, support for setting aspirational goals seems to break down somewhere between 100Mbps and 1Gbps. At this time, saying that every home will need 1Gbps within the next five years just seems to be too big of a step for most to consider. That's why I decided to set the aspirational bar at 100Mbps for our 2015 goal.

Now with a goal of becoming a 100Mbps Nation by 2015 in place, we can start working both backwards and forwards to set additional goals, which brings me to the concept of logarithmic bandwidth goals.

Let's start this discussion by simply considering what logarithmic growth in bandwidth, meaning successive tenfold increases in speed, looks like:

1Mbps, 10Mbps, 100Mbps, 1Gbps, 10Gbps, 100Gbps

It's hard to ignore how well this lines up with the evolutionary trends of broadband.

To use most of today's Internet you need at least 1Mbps. (As another aside, I don't see how a decade into the 21st century we're still defining broadband based on Kbps. Talk about embarrassing!)

To use all of today's Internet you need 10Mbps, including watching "HD" video on sites like Hulu.com, which requires 7Mbps.

And to be able to fully participate in the Internet of just a few years from now, where interactive HD video applications are everywhere, households will need at least 100Mbps.

Then looking out further ahead, speeds of 1Gbps, 10Gbps, and 100Gbps line up perfectly with where discussions about future bandwidth goals should be set.

So all that seems to be left is to associate this logarithmic growth in broadband capacity with the years we should set as goals for their widespread availability in the US.

For 1Mbps, we really need to get that everywhere ASAP. In fact, if we really had our act together, I'd argue that having 1Mbps be universally available should have been a goal set for 2006 or even earlier.

For 10Mbps, I'd like to be aggressive and say 2010 but some may see that as being unrealistic given that wireless technologies aren't yet delivering those speeds in the US. If that's the case then we should look at having this benchmark be set for 2011 or 2012 as an intermediary step.

For 100Mbps, I didn't pull the 2015 goal out of my hat as striving to become a 100Mbps Nation by then has already been proposed by Sen. Rockefeller. And the more I think about it the more sense it makes. It gives us an aspirational goal that we can realistically achieve in the next 5-6 years if we set our minds to it.
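As a sanity check on how aspirational this ladder really is, here's a quick sketch of the compound annual growth rate the proposed timeline implies. The 1Mbps-in-2006 and 100Mbps-in-2015 endpoints come from the post; everything else is simple arithmetic:

```python
# Implied compound annual growth of the proposed bandwidth ladder,
# using the post's endpoints: 1Mbps by 2006, 100Mbps by 2015.
start_year, start_mbps = 2006, 1
end_year, end_mbps = 2015, 100

span = end_year - start_year        # 9 years
factor = end_mbps / start_mbps      # 100x growth over that span
annual = factor ** (1 / span) - 1   # compound annual growth rate
print(f"{factor:.0f}x over {span} years = {annual * 100:.0f}% per year")
# prints: 100x over 9 years = 67% per year
```

Roughly two-thirds growth per year is aggressive but in the same neighborhood as the long-run trend in residential bandwidth, which is part of why the 2015 target reads as aspirational yet plausible.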

After that, I'm hesitant to set any additional goals. For one, policymakers will have a hard time seriously considering them. For another, it's somewhat foolish to think we can really guess where things are going to grow to that far into the future. So I'd rather not start making guesses now that will almost certainly underestimate what the future will bring.

Because of this, instead of trying to set out goals that are further into the future, I'd just commit ourselves to readdressing our bandwidth goals sometime in the next few years.

The final point I want to make about this issue of bandwidth goals is the debate over how we should quantify and set goals for coverage. All too often we get stuck in discussions over whether setting goals related to 100% availability are realistic given that even basic telephone service isn't necessarily universally available. But the flip side of this is that if we start setting goals like having 100Mbps available to 90 or 95% of the population, then we're basically saying we're OK with leaving millions of Americans behind (5% of the US population is roughly 15 million people). Plus there's also the twist that some of these goals may suggest wasting money bringing these networks to people who don't want them and won't use them.

So instead of getting lost in this debate, I'd suggest we instead simply focus our attention on making these speeds available to all Americans who want them.

The most important thing to remember throughout all of this is that if we want America to continue to be a great nation, we must be setting goals that put us on a path towards greatness. We can't allow efforts to be "realistic" to get in the way of striving towards aspirational goals. And we can't forget that with enough will, unified effort, and careful planning, any goal we set can be realistically achieved.

As I've said before and will continue to say again and again: We are America, gosh darn it. We can achieve anything we put our minds to. The only things limiting us are our own imagination and self-interest.

So let's acknowledge the clear trends of bandwidth-hungry technologies and our aspiring global competitors to set the goal of becoming a 100Mbps Nation by 2015. Because the sooner we're able to do that, the sooner we can start getting down to the real work of figuring out how to achieve this goal.

Last week I had the opportunity to head back down to Cajun Country for another trip to Lafayette, LA. While there's much to share about my journeys, there's one thing in particular I want to highlight today: Lafayette is fast becoming America's most wired community.

Of course you all know about Lafayette Utility System's deployment of a full fiber network, but it's worth revisiting what they're offering. Residents of Lafayette can get 50Mbps symmetrical service for less than $60 a month.

Compare that to the $70 I pay a month for roughly 10Mbps down and 2Mbps up in our nation's capital (though in fairness Comcast supposedly now offers DOCSIS 3.0 in my area, so I may have faster options available, though they still won't reach 50Mbps symmetrical).

And potentially even more significant is that every subscriber to the LUSFiber network, no matter what level of broadband they pay for, gets access to a free 100Mbps symmetrical intranet, or community-wide LAN. To date Lafayette is the only community in the country with anything like this that I know of.

But the story of how wired Lafayette's becoming doesn't begin and end with LUSFiber. Cox Communications has decided to make Lafayette the first community in which it's deploying DOCSIS 3.0 cable. They've claimed that this deployment has nothing to do with the availability of LUSFiber, but considering that Cox has been keeping its prices lower in Lafayette than nearby communities like Baton Rouge that aren't deploying fiber, that's hard to believe. Plus it makes logical sense to upgrade your network in the areas where you're facing new competition.

The same impetus is likely behind news, which I actually learned from a Cox representative, that AT&T now has plans to bring its upgraded DSL platform, U-Verse, to Lafayette sometime in 2010. On the one hand this news doesn't surprise me at all, as if AT&T were to do nothing they'd likely be run out of town, unable to compete with the arms race between LUS and Cox. And yet on the other hand, I wonder whether, even with this upgrade, they'll be able to compete, as U-Verse doesn't offer the same amount of bandwidth as LUSFiber or even DOCSIS 3.0 cable. But regardless, this makes Lafayette the first community in the country that I know of that will have a full fiber network, DOCSIS 3.0-upgraded cable, and a next-generation DSL network, making it arguably the most wired city in America.

I should add, though, that the one area where they're lagging is wireless connectivity. From personal experience I can share that 3G coverage is somewhat spotty, and as far as I know there aren't any plans by anyone to deploy a next-generation wireless network like WiMAX any time soon. That said, I also want to point out that I continue to be amazed by how many of my friends in Lafayette sport iPhones or some other form of smartphone. So even if the networks aren't being built yet, there's a savvy customer base ready to use them whenever they arrive.

The biggest takeaway from me on this is that competition in wireline broadband can work, but only if there's someone willing to invest in a truly next generation network. Everyone likes talking about how adding a third pipe into homes will supercharge competition, but the reality is that the only third pipe capable of doing that is fiber. Lesser technologies like BPL and wireless don't have the capacity to spur investment in legacy infrastructure. And yet just look at what can happen when that fiber arrives: everyone starts investing in capacity so they can try to keep up.

In starting to peruse the reply comments to the FCC on formulating a national broadband policy, something NCTA said caught my eye, namely that the FCC should acknowledge that one of the many successes in the broadband marketplace has been the hundreds of billions of dollars invested by private operators in building broadband networks.

While I appreciate the sentiment that a deregulatory attitude towards broadband has spurred massive investment in a variety of networks, I'm having a hard time resolving this with the fact that the US is nowhere near the top five in terms of broadband rankings.

Of course, how one ranks broadband is an area of controversy, but I think it's safe to say that based on sheer capacity or price-per-megabit of service, the US lags far behind leaders like South Korea and Japan, regardless of any other factors.

So with this in mind, instead of being impressed by the hundreds of billions spent by private industry, I can't help but wonder if as a country we've spent that money wisely.

To further put this into context, the highest estimate I've heard for what it would take to lay a fiber pipe to every last home in America is $500 billion. Regardless of the merits of DOCSIS 3.0-enabled cable or next-generation DSL service, there's no denying that full fiber networks are the gold standard of broadband as no other technology has the capacity, reliability, and scalability of fiber.

Taking this a step further, I can't help but wonder: what would've happened if instead of focusing on facilities-based competition, which dilutes investment across multiple technologies, we had concentrated all this investment on creating a Full Fiber Nation?

Given that hundreds of billions have already been spent, I'd think that would mean we could be at least halfway, maybe even further, to this goal by now if we'd been able to focus on encouraging competition between services over a common infrastructure vs. between last-mile technologies.

Of course, many of you reading this are likely guffawing at the idea that we could ever unite private investment, but I should mention now that I have a different way of looking at private vs. public investment. In the end, I see all of those dollars as coming from our pockets. Whether it's money we pay a company for services or products that they can then invest, or that we pay in taxes that the government then spends, the money is ultimately all coming from us, the users.

That's a big reason why I'm largely non-ideological about who's doing the investing in building out broadband infrastructure. All I care about is that this investment is done well and with the public's best interests in mind.

Back to NCTA's boasting about hundreds of billions being invested: I can't get excited about that knowing how relatively little we have to show for it. Hundreds of billions of our dollars have gone to creating a broadband marketplace where competition is inherently limited (because of the high cost of deploying new networks) and very few people have access to world-class broadband.

So rather than being impressed by the amount of money put out by private operators, I'm depressed by the results of that fragmented investment. Rather than seeing that big number as a reason to maintain the deregulatory status quo, I'm left wondering if there's a better way to push our country's broadband future forward.

And despite the fact that I respect the hard work and big risks undertaken by the private sector to get us where we are in terms of connectivity, I can't help but ask myself the question: is it time to reconsider our emphasis on facilities-based competition and start looking at how we can concentrate the investment of our dollars on bringing the best possible broadband to all Americans?

I'm not necessarily saying that building a common open infrastructure is the ultimate answer, just that I think given how much we've already invested relative to how much (or little, rather) progress we've made in getting the country wired that it might behoove us to at least consider new possibilities like this.

Now is not the time to allow tired dogma to trump the serious consideration of new ideas. In going through the process of formulating a national broadband policy, all options should be on the table. There should be no sacred cows. Not even that facilities-based competition is the undisputed best way to spur investment in broadband networks.

Because while taking this approach has led to billions of dollars being invested by the private sector in building out all sorts of broadband technologies, when looked at from an international perspective it's hard not to feel as though that as a country we're not realizing the full return on that investment of our dollars.

Hey RUS: What Happened to Loan Guarantees?


When Congress passed the stimulus back in February they allocated $2.5 billion to RUS to distribute as loans, grants, or loan guarantees to stimulate broadband deployment.

When RUS released its NOFA a couple weeks ago it included the ability for rural broadband projects to apply for loans, grants, and loan/grant combos.

This raises the question: whatever happened to those loan guarantees? Why didn't RUS listen to Congress and include them as an option? Why did RUS choose to ignore this important component of its funding toolkit?

The short answer is likely that while RUS already has a loan guarantee mechanism in place, no one's ever used it, for two primary reasons: the 80/20 guarantee isn't enough to entice private lenders to open their coffers, and it takes just as long to get approved for a guarantee from RUS as it does for a direct loan. Since a direct government loan will always carry a lower rate than a guaranteed private loan, there's little incentive to go for a guarantee. So not only does the current guarantee system not free up private capital, it doesn't give applicants any incentive to use that option.

And yet guarantees hold so much promise. They allow government to lower its risk and leverage private capital without having to write a single check.

That's why the Rural Fiber Alliance, an ad hoc coalition of pragmatic rural fiber deployers, has suggested a simple way of tweaking the guarantee program so that it can be used as a viable tool for increasing the capital available for rural broadband deployment.

The two tweaks are as follows:

Have government cover 100% of losses up to 50% of the value of the loan.
Doing this lessens the risk significantly for lenders as they're no longer on the hook for losses starting from dollar one. Plus it can lower the government's overall exposure. We've spoken with a handful of lenders and have confirmed that guarantees that cover 100% of losses up to 50% of the value of the loan would open up the funding floodgates, unlocking and unleashing the frozen credit markets.

Create a fast-track approval process.
With these new guarantees in place, RUS can significantly lessen the amount and depth of vetting it has to do. The reason is that if a private lender is willing to write a check for the whole amount and take on half the risk, then government can assume that that lender is doing the vetting necessary to ensure that only viable projects get funded. This allows for a streamlined approval process where government simply goes through a checklist to confirm projects have all the components needed to raise funds from the private capital markets.
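To make the first tweak concrete, here's a minimal sketch (with hypothetical loan figures) of the proposed loss-sharing split: government absorbs losses from the first dollar up to a cap of 50% of the loan's value, and the lender bears only losses beyond that cap:

```python
# Sketch of the proposed guarantee: government covers 100% of losses
# up to 50% of the loan's value; the lender bears anything beyond that.
def split_loss(loan_value, loss):
    """Return (government_share, lender_share) of a realized loss."""
    cap = 0.5 * loan_value   # government's maximum exposure
    gov = min(loss, cap)     # first-dollar losses, up to the cap
    lender = loss - gov      # lender only loses beyond the cap
    return gov, lender

# A hypothetical $10M rural fiber loan:
print(split_loss(10_000_000, 3_000_000))  # government absorbs the whole $3M
print(split_loss(10_000_000, 7_000_000))  # government $5M, lender $2M
```

This is why the tweak matters to lenders: under the existing 80/20 structure they lose money on every dollar of loss, while here they're fully insulated until losses exceed half the loan.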

Getting a bit more specific, the RFA has proposed in comments it has filed that RUS peel off $500 million of its $2.5 billion to be applied to the creation of a fast-track partial loan guarantee program.

By doing this, that $500 million of budget authority could be used to distribute at least $10 billion in partial guarantees (like a loan, guarantees don't count dollar-for-dollar against the budget), which could free up at least $20 billion in private capital to fund rural broadband deployment. All without government having to write a single check.
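The leverage arithmetic here can be sketched as follows. The 5% subsidy-cost rate is my own assumption, reverse-engineered from the post's $500 million-to-$10 billion figures; the actual rate would be set by federal credit scoring:

```python
# Leverage implied by the RFA proposal, under an assumed 5% subsidy cost
# (the budget cost per dollar of guarantee issued; assumption, not policy).
BUDGET_AUTHORITY = 500e6   # $500M peeled off the $2.5B allocation
SUBSIDY_RATE = 0.05        # assumed budget cost per dollar guaranteed
GUARANTEE_FRACTION = 0.5   # government backs 50% of each loan's value

guarantees = BUDGET_AUTHORITY / SUBSIDY_RATE       # guarantee capacity
private_capital = guarantees / GUARANTEE_FRACTION  # total loans covered
print(f"${guarantees/1e9:.0f}B in guarantees -> "
      f"${private_capital/1e9:.0f}B in private loans")
# prints: $10B in guarantees -> $20B in private loans
```

Under these assumptions each budget dollar backstops $20 of private lending, which is the "without writing a single check" multiplier the post describes.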

Even better is that this money can potentially be freed up in a matter of weeks, allowing these subsidies to move at the speed of the market rather than the snail's pace of government.

In fact, if RUS had adopted this proposal back in the spring when it was made, we could already have billions of dollars flowing into rural broadband deployments, creating jobs and getting the unserved connected. Instead we're stuck waiting for the gears of government to slowly churn out the money.

Fast track partial loan guarantees represent the smartest kind of broadband policy. They maximize how much we leverage government dollars. They can work quickly to turn dollars into deployment. And they reduce the administrative overhead associated with vetting applications.

So why has RUS refused to embrace them? Why aren't we using this intriguing tool in our funding toolkit? Why can't we think outside of the box just a little bit to realize new and better ways of conducting broadband policy rather than just pursuing the same old, same old?

The time for a new day of pragmatic broadband policy has dawned. So let's acknowledge this new reality and embrace what could be the most powerful tool government has at its disposal to spur the deployment of broadband: fast-track partial loan guarantees.

My message to RUS - It's time to step up to the plate and take full advantage of the resources at your disposal. Realize that there are better ways to be utilizing government resources than just writing checks. Try to see past the old ways of doing business to open your eyes to new possibilities.

Rural America's counting on you to make the most of the resources given to you. To get the biggest bang for the broadband buck. And the best way to accomplish these goals is through offering fast-track partial loan guarantees.

One of the most troubling aspects of the NTIA/RUS NOFA is its requirements for unserved vs. underserved.

Not only do the definitions limit subsidies to the most rural areas, cutting out viable urban and suburban projects, but the data applicants need to prove an area is un- or underserved is expensive and time-consuming to gather. Plus, the rules seem to prioritize protectionism over progress by giving incumbents 30 days to respond to and refute any application that touches upon their service territory.

But at the same time, the data needed to determine what's un- and underserved doesn't necessarily have to be recaptured, as it's already readily available in the hands of incumbent carriers. Additionally, we know the incumbents are very interested in preventing government subsidies from going toward projects that will introduce significant new competition to their legacy networks.

Now, combining these two threads, I'd like to propose an alternative solution: why not force incumbents to show which areas are already served by treating as unserved any area for which they don't produce verifiable data showing they offer service there?

By doing this we'll save applicants from wasting a lot of time and money collecting data that already exists, plus we'll also enable them to know if the area they're putting together a project for qualifies for subsidies before submitting their application. Additionally we'll be able to take this data and use it to inform the broadband maps we're charged with creating.

On the carrier side, we give them a clear reason to give up their data on the availability of their services: ensuring we're not subsidizing duplicative investment. And at the same time we can remove any appearance of prioritizing the protection of private service provider interests over making real progress in the deployment of broadband.

So there you have it, a simple way to improve how the un/underserved dynamic works in the NOFA: consider all areas unserved until an incumbent carrier provides the verifiable data needed to prove otherwise, either that an area's already fully served or merely underserved.

Top Ten Ways The NTIA/RUS NOFA Fails America


It's kind of surreal to know that today NTIA and RUS began accepting applications for broadband stimulus grants and loans. After months of waiting for signs of progress all of a sudden we're entering the next stage of the process started when the ARRA passed in February.

And yet I, like many others, can't help but be disappointed by the many ways in which this effort has already been letting us down: frustrated by the lack of vision to set America back on course to be the Internet's global leader, and worried that despite the good intentions expressed in speeches and interviews, the Obama Administration's first opportunity to push forward its broadband agenda is fast becoming a boondoggle.

So what's so wrong with the broadband stimulus? Well, let me count the ways. Here are the top ten ways (in no particular order) in which the NTIA/RUS NOFA has failed America:

1. It's taken too long to get capital flowing so networks can start deploying.
We've already squandered the opportunity to spur any deployment of broadband in 2009. The stimulus passed in February. There was enough time to get something done this year other than paperwork. But unless the RUS embraces our proposal for creating a fast-track partial loan guarantee program in the next few weeks to apply $500 million of its budget authority to free up $20 billion in private capital, no broadband will be stimulated this year.

2. Now things are moving too fast.
After waiting around for five months with nothing to do but comment and speculate, now everything's been compressed. The NOFA came out the first week of July, less than two weeks ago, and they're already starting to accept applications. Potential applicants now have hundreds of pages of guidelines to read through, consider, and address in a matter of weeks. The workshops intended to help advise applicants just started last week and will run through the end of July, putting anyone attending one of the last workshops at a potential disadvantage since applications are due Aug 14. And there's been no time to consider whether the NOFA's any good, as now everyone has to scramble just to understand and apply to it. I like the suggestion of extending the NOFA deadline by 30 days.

3. The workshops are disorganized and pointless.
While I haven't attended one myself, I've heard from others that these workshops are basically just people reading from the website and application. There's no great insight being shared, so many who attend feel like it's a wasted day. Plus they've been handing out workbooks that weren't immediately available online. In other words, whoever was able to physically be in DC for the first workshop got a leg up on everyone who couldn't.

4. Asking volunteers to vet applications invites fraud.
I first read about NTIA's plans to use volunteers as their first stage of vetting applications in this article by Craig Settles, and since then many others have picked up this thread, asking the simple questions: Who can we find that's qualified and willing to donate their time? How do we guarantee they're qualified? How do we ensure they're not being influenced financially to accept or reject applications? The only way this works is if there are sufficient qualified people able to volunteer their time for free and there are mechanisms in place to weed out and punish attempts to game the system. But I have doubts about every part of that last sentence being substance rather than wishful thinking.

5. Basing the definitions of unserved and underserved on advertised speeds encourages lying.
If I'm a broadband provider and I know I can potentially derail a government-subsidized project that may be moving into my territory simply by advertising higher speeds, that may be too tempting to pass up. I'm not saying all broadband providers are evil, just that most are self-interested, and I'm not confident there are sufficient safeguards in place to prevent this practice.

6. Giving incumbents right-of-refusal provides opportunity to falsify information to derail projects.
If there is an incumbent provider anywhere in the service area of an application, they'll have 20 days to respond to it. Essentially this gives them an opportunity to try to derail the project by "upgrading" their speeds or being less than honest about their service areas, which is something I've heard incumbents are known to do, such as drawing a circle around a central office and claiming service is offered everywhere therein even if it's not. And again, it's not that I think all incumbents are evil; it just seems like we're giving them too many opportunities and incentives to lie while doing seemingly nothing to balance that out with potential punishments for bad actors.

7. Too much emphasis is placed on connecting rural unserved areas and not enough on supporting innovative testbeds.
Every time someone from the Administration talked about the broadband stimulus in the last few months they used the word "testbed" to describe the kinds of projects they wanted to see funded. Alongside that they acknowledged that there's not enough funds to solve the rural broadband problem entirely. So instead the point was supposed to be about funding showcase projects we can learn from to inform future investment decisions. But some of the best testbeds may be in urban or suburban areas, which are essentially ineligible the way the rules are written. Because of this, potentially great testbed showcase projects, like what San Francisco has planned out, will be overlooked to instead focus the stimulus on only deploying broadband where it isn't.

8. The minimum threshold for the definition of broadband is woefully inadequate.
I've spoken out strongly against this before and will continue to do so until we change our mindset about how much broadband capacity is sufficient. The question to ask isn't "Is this enough capacity for today?" It's "Is this enough capacity for five to ten years from now?" The reason is that when you subsidize a broadband network, you expect it to be operational for at least that long. And given the growing demand for bandwidth and the number of new bandwidth-intensive applications being developed, we cannot afford to spend precious taxpayer dollars on networks with insufficient capacity. Not only is doing so wasteful, as we'll have to go through this whole process of subsidizing deployment all over again in a few years, but it also relegates rural citizens stuck with these inferior networks to the status of second-class digital citizens.

9. The requirements for projects don't reflect the realities of building networks.
One overarching theme I'm hearing from a lot of my deployer/operator friends is how disconnected this NOFA is from the realities of building networks. For example, some projects were originally slated to build out an entire city but now have to be scaled back to deploy only where no one is already offering 768/200Kbps service. Not only does this mean many people won't receive the benefits of fiber because they have the "privilege" of being stuck with DSL, but the most efficient way to build these networks is all at once rather than piecemeal. So by forcing builds to break themselves into pieces that fit the NOFA's criteria, we're potentially adding to the ultimate total cost of wiring every home with fiber.

10. In the end, this NOFA fails America in its lack of vision and aspiration.
In large part this stimulus is business as usual for America's broadband policy, or lack thereof. We're continuing to muck around, dragging our feet while the rest of the world races forward. It's not just that this NOFA isn't aspirational enough; it's that it doesn't seem to aspire to anything at all. There's no ultimate goal for what it's setting out to achieve other than getting some people some broadband. And seemingly little is being done to even use this as a learning experience we can build from to help guide future investments. It feels like NTIA and RUS just took the safest route, followed the same steps that have failed us in the past, and at best only marginally improved the approach. Because of this I can't help but feel pessimistic about what the ultimate impact of the broadband stimulus will be.

Take for example this comment emailed to me from Ernie Bray of US Metro Net about a project his company has in the works and his outlook on their odds of getting funded based on the rules of this NOFA:

"My current project has me working with the Idaho Commerce Department, Panhandle Area Council and the local governments and EDC's on an initial two county project. We have the support of the Governor, both U.S. Senators, their Congressman, all of the local governments, schools, hospitals, public safety, etc. and have the support of the power companies who have provided all their GIS data to the effort. In addition, we have one of the top engineering firms in the nation (they engineered and supervised all the FiOS builds in Washington and Oregon, plus others and are currently engineering Manhattan for Verizon). I have two MBA's on my team, one who is also a CPA with 25 years in private equity and venture capital, two of the top attorneys in the nation (one is Jim Baller) on the team, modeling tools that have taken thousands of hours over the last six years to develop, and which have been fully vetted over that time. And still it will be a miracle if we can meet their requirements.

That's how bad this whole NOFA is."

Note that including this quote is not an endorsement of Ernie's project. I've actually never met him before. But I felt like it really summed up the feelings many hard-working people like Ernie share.

There are a host of truly shovel-ready, truly innovative, true testbed showcase projects for us to be supporting through this broadband stimulus. But based on how things are looking so far, I can't help but feel like this NOFA is already a massive failure and the money hasn't even gone out the door yet.

If I've missed some way in which you think the NTIA/RUS NOFA has failed America, add it as a comment below. There is some hope in that the rules can change for the second and third rounds of funding. But we need to work hard to identify how they can be improved so that the entire broadband stimulus doesn't end up as good intentions wasted by poor execution.

In continuing to review the NTIA/RUS NOFA I'm getting a little confused: wasn't the broadband stimulus supposed to be about jobs, jobs, jobs?

That's what we'd been told over, and over, and over again: that its number one priority and intent was to facilitate the retention of existing jobs and the creation of new ones.

And yet a quick search of the 100+ page document shows that the word "jobs" appears only six times:

- As the first purpose of the Recovery Act
- Then five instances where applicants must estimate the number of jobs retained and created by their projects

But here's the funny thing: in the scoring system they laid out for determining which projects get funding, there aren't any points given for job creation. Not a single one. Not even a mention of jobs. It makes it seem like the only reason they're even asking for numbers related to job creation is as inconsequential window dressing, like they're just going to put a checkmark next to an application saying, "Yep, they're going to be creating jobs."

How is it that this NOFA is completely ignoring the Recovery Act's primary purpose? Why aren't we rewarding projects that promise to create the most jobs?

But there's an even more glaring omission in this NOFA: the lack of focus on innovation.

The thing I heard Administration officials cite the most about the purpose of the broadband stimulus wasn't to try and get as many unserved homes connected as possible, but instead was an emphasis on supporting testbed/pilot/showcase projects that we could learn from to help enable more efficient deployments and guide a future wave of government subsidies.

And yet when you search for the word "innovation" or "innovative" in this NOFA, the only references you turn up relate to the broadband adoption piece of the stimulus. Putting a finer point on it, this NOFA awards no points to applicants offering innovative solutions to problems related to broadband deployment.

I have to admit, I'm really confused by this. Every time an Administration official talked about the stimulus it was in terms of testbed projects, and yet nothing in this NOFA supports that idea. It makes me wonder why more wasn't done to craft the NOFA to favor innovative models for deployment.

I truly can't understand where this disconnect comes from. After saying repeatedly that this broadband stimulus was about jobs and testbeds, we end up with a NOFA that's all about getting the unserved online.

Now, perhaps these issues of jobs and innovation will play a larger role in weighting projects in a less formal, more subjective way, but even then I'm not sure why they wouldn't incorporate these priorities into the scoring system. It doesn't seem like it'd be that hard to do. Give some bonus points to projects that create the highest ratio of jobs to dollars. Then give some more bonus points to projects using innovative new business models to help fund network deployment. It doesn't even have to be a lot of points, but at least something to prove NTIA and RUS's commitment to what had been officially stated as the two primary purposes of this broadband stimulus.

While I know most people are now focused on filling out their applications, and many don't want to criticize this NOFA lest they upset those who wrote it and thereby lessen their chances at getting funding, I can't help but continue to point out shortcomings when I see them.

And the fact that this NOFA essentially ignores the two most important goals of the broadband stimulus is something I cannot ignore. We can and should be doing better.

VidChat: Introducing MuniNetworks.org


Recently, Christopher Mitchell and the Institute for Local Self-Reliance launched a new website: MuniNetworks.org.

This site aims to serve as a repository of information about municipal broadband with two primary purposes:

1. To speak out forcefully against misconceptions about municipal broadband.
2. To provide a resource that can allow municipal networks to learn from each other.

To help get acquainted with the site's mission I conducted a VidChat with Chris Mitchell last week in which we discuss both the site specifically and municipal networks in general. Enjoy!

As a quick followup, the thing I like most about what Chris is doing with this site is that he's not going to just be another cheerleader touting municipal networks as the most perfect thing in the world. He's going to be honest about the ups and downs these networks have experienced.

I also like that while he and his organization do fervently support the rights of local communities to build their own networks, there aren't any other underlying agendas that they're trying to drive forward. Too many other efforts like this on related subjects aren't really about what they say they are, but instead have their messaging skewed by hidden agendas.

I've known Chris for a while and he's one of the straightest shooters I've ever met. And he's already producing some compelling content, like a comparison of the prices and speeds of municipal networks vs. private providers.

I'd recommend that anyone who wants to learn more about municipal broadband check out his site. While it's still new, there's already good stuff to be found there, and that should only increase over time. And Chris would love to have this site become a forum that welcomes multiple contributors. So if you want to participate in helping create content for the site, email him at: [email protected]

On Wednesday NTIA and RUS released their long-awaited, eagerly anticipated Notice of Funds Availability (NOFA) for the BTOP and BIP programs, respectively, which aim to spur the deployment of broadband through making available billions in grants and loans.

While there's lots to be discussed about what's contained in it, I want to start with the part that's already drawing the most ridicule: how they define broadband.

Rather than setting a more aspirational minimum threshold, this NOFA simply adopts the FCC's definition of 768Kbps down and 200Kbps up. So long as a proposed project delivers that much capacity, it's eligible to receive government subsidies.

To give a sense for how the Internet's thought leaders are reacting to this, let's consider this quote from a man who's often referred to as the "father of the Internet," Vint Cerf: "The definition of broadband sucks so badly it should be used to sequester carbon dioxide."

Now before I get into my critique of how inadequate this definition of broadband is, let's first give credit where credit's due and acknowledge what they got right, which can be summed up by quoting a single paragraph:

"Networks will be graded on a sliding scale with higher end-user speeds receiving a higher score. Proposed networks with high latency will be viewed unfavorably. Applicants may gain additional consideration if the applicant can demonstrate a clear and affordable upgrade path for the network."

So higher-capacity networks and those with clear upgrade plans will get higher scores, and those with higher latency will get lower scores. That's all great, but in the end it's still inadequate.

Let's take a look at why they chose 768/200:

"RUS and NTIA favor this broadband speed threshold because it leverages the FCC's expertise, utilizes an established standard, facilitates the use of many currently common broadband applications (e.g., web browsing, VOIP, and one-way video), allows for consideration of cost-effective solutions for difficult-to-serve areas, and is the most technology-neutral option (because it encompasses all major wired and wireless technologies)."

So first off, "leverages the FCC's expertise" and "utilizes an established standard" mean essentially the same thing: that it's easier and less risky for them to use the FCC's definition than to try to set out their own. Basically, I read this as them not wanting to stick their necks out while giving pundits someone else to blame for the inadequate definition.

Next up, I have to admit being totally flabbergasted that they claim this definition "facilitates the use of many currently common broadband applications" while completely ignoring entire classes of currently common broadband apps: two-way video calls, uploading video to YouTube, remote computer backups, webcasting video out to the world, P2P applications, and more. They're basically saying that their definition's adequate because the Internet's primarily a one-way medium. Has no one been paying attention to what's been happening on the Internet over the last five years?! It seriously feels like they're writing this definition as if it were 1999. The craziest thing is that they didn't even need to define broadband as symmetrical to be reasonable; if they had just bumped the upstream threshold to 500Kbps, there'd be enough capacity to handle uploading low-quality video. Instead they chose an embarrassingly inadequate minimum and ignored the many apps that need more upstream bandwidth.
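To put those upstream numbers in perspective, here's a back-of-the-envelope sketch of what the 200Kbps floor means for uploading video versus the 500Kbps floor suggested above. The 100MB clip size is a hypothetical example for illustration, not a figure from the NOFA.

```python
# Rough upload-time math, ignoring protocol overhead and assuming
# decimal units (1 MB = 8,000 kilobits). The clip size is hypothetical.

def upload_time_minutes(file_size_mb: float, upstream_kbps: float) -> float:
    """Minutes to upload a file of the given size at the given upstream rate."""
    file_size_kilobits = file_size_mb * 8 * 1000  # MB -> kilobits
    return file_size_kilobits / upstream_kbps / 60

clip_mb = 100  # a short standard-definition video clip (hypothetical)
print(f"At 200 Kbps: {upload_time_minutes(clip_mb, 200):.0f} minutes")
print(f"At 500 Kbps: {upload_time_minutes(clip_mb, 500):.0f} minutes")
```

At the NOFA's minimum, that hypothetical clip takes over an hour to upload; at a 500Kbps floor it drops to under half an hour, which is the practical difference the upstream threshold makes for two-way uses of the network.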

Then they make the claim that this inadequate definition "allows for consideration of cost-effective solutions for difficult-to-serve areas." I can't help but read "cost-effective" as "cheap." I can't help but read between the lines and think they're saying, "It's too expensive to get rural America real broadband so we'll just give them the cheap stuff." I can't help but read this as them saying that rural communities don't need 21st century connectivity, that it shouldn't be a priority to strive to provide equal access to all Americans. I don't see how anyone who cares about the future of rural America can find this adequate.

Finally, the militant adherence to a regime of technological neutrality is starting to get a bit comical. They're basically saying that they couldn't set the bar too high as that might exclude some broadband technologies, that it's more important that all broadband technologies have a fair shot at stimulus dollars than it is to deliver the kind of connectivity Americans need to participate -- let alone compete -- in the digital economy. Why is it we refuse to acknowledge that some technologies may not be worth investing in? Why do we continue to prioritize protecting the rights of technologies over promoting the rights of citizens? This mindset is absurdly inadequate.

If only these were the only aspects of how they're defining broadband that were inadequate, but unfortunately that's not the case.

For example, check this out: "Applications will be scored for the extent to which the advertised speed for the network's highest offered speed tier exceeds the minimum speed requirement for broadband service."

Notice how they specifically cite "advertised speed" rather than actual speed. This is bad on multiple fronts. It rewards gamesmanship in how much speed a network says it can deliver versus what it can actually deliver. Since additional points are granted to networks that can offer higher speeds, there's now an incentive to over-promise and no penalty for under-delivering. Even worse, it opens the door for lesser technologies to promise, but not actually deliver, the minimum of 768/200. By not looking at the capacity a network can actually deliver, it renders bandwidth thresholds irrelevant.

But putting all this aside for a moment, the biggest reason I find this definition inadequate is because of how unbelievably short-sighted it is. Any investment in broadband infrastructure needs to be considered with a horizon of at least five to ten years, and preferably longer. We don't want to have to subsidize the expensive and time-consuming process of deploying broadband again in a couple of years because we invested in inadequate technology today.

So regardless of whether or not you think 768/200 is adequate today, what about in 2019? The reality is that whatever networks we subsidize today are likely all these rural communities are going to have until we subsidize them again. So by setting the bar too low we're essentially consigning underserved communities to permanent underserved status.

Unfortunately this acceptance of mediocrity, this unwillingness to aspire to greatness, infects the entire document, even at a meta level. A quick search shows that Kbps is mentioned 24 times, Mbps only 11 times, and Gbps not once. The fastest speed cited anywhere is 100Mbps, for middle-mile networks. How is this adequate when some countries are setting goals of bringing 1Gbps of connectivity to the majority of homes within the next five years? How is it that in the 21st century, when communities elsewhere in the world are realizing 100Mbps to the home today, we're spending more time talking about Kbps than Mbps? When did America become so afraid of striving for greatness?
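For anyone who wants to check these counts themselves, the quick search described above takes only a few lines against the NOFA's text. The sample sentence below is invented for illustration, not quoted from the document.

```python
import re

def speed_unit_counts(text: str) -> dict:
    """Count case-insensitive mentions of each bandwidth unit in a body of text."""
    return {unit: len(re.findall(unit, text, re.IGNORECASE))
            for unit in ("Kbps", "Mbps", "Gbps")}

# Hypothetical sample, standing in for the full NOFA text.
sample = ("Applicants must advertise at least 768 Kbps downstream and "
          "200 Kbps upstream; middle-mile projects may cite 100 Mbps.")
print(speed_unit_counts(sample))
```

Run against the actual document, this is the tally that shows Kbps outnumbering Mbps more than two to one, with Gbps absent entirely.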

I know a lot of hard work was put into this NOFA, and I respect the energy and long hours expended by everyone involved with compiling this document. And there are other things in here that I do like that I'll write about later. But I also know that I'm not alone in being incredibly disappointed by the lack of ambition that their definition of broadband shows. I fear that what's driving much of this is the desire to be politically correct and inclusive, to make sure that everyone has a seat at the table.

But now is not the time for milquetoast attempts to make everyone happy. Now is the time for a bold plan that can catapult America back to the top of the broadband rankings. And we can't have an adequate plan for achieving this if we're still talking about our goals for broadband in terms of Kbps.

About this Archive

This page is an archive of entries from July 2009 listed from newest to oldest.
