December 2009 Archives

What Happens To The Stimulus After The Money Runs Out?


With so much buzz built up around the stimulus, especially now that awards are finally being made, there's disturbingly little being said about what happens after the money runs out.

The only recent discussion I've seen is this great call to action from Charles Benton and Dr. Kate Williams about "Why We Must Measure the Results of the $7.2 Billion in ARRA Broadband Funding."

They make many salient points about how significant an opportunity the stimulus is for gathering all sorts of new data about what kind of broadband deployment and adoption programs and models are most effective. They also highlight how we need good data to make sure the stimulus is a success if we want any hope of getting more money from Congress in the future for broadband.

But I think this issue of what happens after the stimulus well runs dry goes even deeper than that.

As far as I can tell, the way things currently stand there are no funds appropriated or serious accommodations being made for the ongoing oversight of stimulus fund recipients. This is bad on multiple levels.

It means there won't necessarily be anyone making sure that the money's being spent wisely by recipients, that they're spending it on what they said they would, and that they're making sound investment decisions. Stimulus recipients have all sorts of requirements for reporting data like this, but who's going to be actively monitoring it? Also, what are the consequences of misspending money? Other than the possibility of not getting the full amount of funds requested, I've heard of no other ramifications for a recipient abusing its taxpayer support, and I worry that if we do end up with some bad apples, simply turning off the faucet could mean stranding network assets.

Also, without ongoing oversight, it's that much less likely that we'll actually learn anything from these billions of dollars being invested in broadband. How can we accept government talk of learning lessons from the impact of the broadband stimulus without anyone actually set up to capture this information in a way that can be shared with others? This feels like something that's been mistakenly de-prioritized in the rush to get money out the door.

But perhaps most troubling of all, I've seen almost no talk about oversight of the actions of government-subsidized broadband providers that extends beyond when the last stimulus dollar is spent.

This leads me to wonder: What good will the stimulus have done if the networks it funds go belly up in five years? Or, even more troubling, in situations where the stimulus is enabling the establishment of private monopolies, what's to stop them from raising prices indiscriminately in the future to drive greater profits? How can we ensure stimulus recipients continue to be good stewards for their communities into the future?

I'm not trying to suggest that private companies shouldn't get funds, or that any particular recipients will do anything other than their best to serve their communities. I'm just trying to point out the absurdity of spending billions of taxpayer dollars without adequate oversight.

I know this might not be the way things are normally done in DC, but the lack of attention on these vitally important issues makes me wonder if we shouldn't be seriously considering going back to Congress and asking for more time and flexibility to rework how the rest of this broadband stimulus plan plays out.

I know the original point of the stimulus was to get money into the economy quickly and stimulate job growth, but that ship has largely sailed. Given the long-term significance of any investment we make in our country's broadband infrastructure, it seems more prudent to make sure we do this right rather than quickly.

Before we attempt to spend any more taxpayer dollars, we should take a step back and make sure we're protecting their interests sufficiently and that we're learning all that we can from these massive investments. Because otherwise we open ourselves up to too much risk of things going wrong, and we'll miss too many opportunities to do better in the future.

Let's not allow misguided momentum and arbitrary deadlines to prevent us from doing this right. It's time we wake up to the fact that what happens after the money runs out is every bit as important as what's happened leading up to and at the beginning of the broadband stimulus.

More Bandwidth Means Less Friction Means Greater Usage


A few weeks ago I met Patrick Chung, managing director for SK Telecom Ventures, at Supernova, a terrific conference in San Francisco focused on communications technologies and their ramifications.

SK Telecom Ventures is the venture arm of SK Telecom, one of the leading telecommunications companies in South Korea.

Any time I meet someone involved in telecommunications who's connected to South Korea I like to ask the question: So what cool things are happening on fiber over there?

Patrick ran through the usual suspects of video-on-demand, videoconferencing, and the like, but I had to stop him. All of the things he was listing could be delivered over lesser broadband technologies. So I asked again: What cool applications are specifically leveraging the power of fiber?

His answer was fascinating. He suggested that it wasn't about any one application, but that the capacity of fiber was providing enough bandwidth to remove all friction that might otherwise prevent users from taking advantage of Internet applications.

I'd never heard it described this way before, but it's elegant and makes perfect sense. Lack of bandwidth leads to poorly performing applications, which creates disincentives for usage. Having adequate bandwidth removes this friction. And if apps perform the way they're meant to, then users are more likely to adopt them.

Sometimes those of us who advocate for fiber get caught up in trying to identify the killer app, the single application that justifies the need for fiber. But in South Korea it's not about any one bandwidth-intensive application but rather that the availability of fiber is making it easier for everyone to use anything that's currently available.

This also presents an interesting new angle for our argument on behalf of fiber.

Too often we get stuck on the question of "Why do we need all this bandwidth when no individual apps require it?"

But now we can answer: "Because if we want people to adopt broadband then they need to have enough capacity so as not to introduce friction when they're trying to use today's apps let alone tomorrow's."

I'd also like to suggest three other sources of friction: price, service, and usability.

The price of bandwidth is the primary problem most people focus on. It's not new to suggest that the cost of broadband is a barrier to adoption and usage. But it's usually discussed as a standalone issue, not as something that presents the same problem as the lack of capacity, namely that it introduces friction which can limit use.

Along these same lines, I think we have to acknowledge the role that service plays in creating friction. If there are times my broadband connection doesn't work at all or as well as usual, then that's going to dissuade me from relying on it. Similarly, if I feel like I'm not being respected as a customer by my provider, then I'm less likely to keep paying for service, especially if I'm questioning the need for broadband in the first place.

Finally, usability is another key source of friction. The less intuitive a service or app, the less likely people are to use it. This issue extends beyond broadband to the usability of computers and apps themselves, but it's important to note in this conversation as well.

All of these factors can create the friction that suppresses adoption and use of broadband. But what's important to realize is that these last three don't really matter if the friction of inadequate bandwidth isn't removed.

If there isn't enough bandwidth, then it doesn't matter how much it costs, or what quality of service I'm getting, or how usable the interface is. There's nothing more fundamental than having adequate bandwidth. If I can't run the apps I want to smoothly, then I'm not going to use broadband.

So when we talk about why we need fiber, let's not focus so much on finding killer apps and instead put a greater emphasis on the friction-fighting power of fiber.

An Early Christmas And A Good Start For The Stimulus


Yesterday NTIA and RUS announced the first awardees in the first round of the broadband stimulus.

I have to admit, I was starting to wonder if they were going to be able to actually get anything announced before the end of the year, especially with the holidays right around the corner, but NTIA/RUS delivered and now 18 projects across 17 states get to enjoy an early and very merry Christmas.

I should also point out the obvious: I had some severe reservations about the direction the stimulus was heading, but after this first set of winners I'm feeling more confident that, despite the bumpy road to get here, the broadband stimulus is ultimately going to be a big win.

The majority of the awardees sound like solid projects. They all seem to represent coalitions of local stakeholders, which is vitally important to the future success of a network regardless of whether it's a public or private entity doing the deploying. For the most part, they all seem to be part of coordinated, forward-looking plans. And there weren't any obviously egregious choices made. All of this leaves me much more optimistic about how the rest of this process will go.

That said, that doesn't mean I'm without reservations.

The first I want to point out is the bizarrely opaque way they released the full list of winners. In following my Twitter stream I know I was not alone in having trouble finding a full list of all the winners yesterday. Most news articles cited only a handful. And yet it appears the White House released a full list to reporters on Wednesday. But for some reason that list didn't get distributed very widely. There wasn't even a link to it on BroadbandUSA.gov yesterday, though I just checked and that's changed today. This is a very minor quibble, but I still think it's important to point out, as I'd think NTIA/RUS would want everyone to be able to easily access their progress so they can show the world they're on the right track. Moving forward, they may want to do more to make sure this information is accessible right away when they announce the next winners. (Here's that list, by the way.)

My next concern might be a bit more serious, namely one of the major projects seems to primarily involve the deployment of a 4G network with satellite backhaul in Alaska. What worries me is whether or not satellite backhaul will be sufficiently capacious, especially in terms of supporting the potential capacity of 4G wireless. I'd hate to see a robust 4G network be put in place that ends up bottlenecked because of the limited capacity of satellite. I also wonder if these 4G networks will be able to support real-time applications like VoIP or videocalling because of the latency that's inherent to satellite transport. It may be that satellite backhaul was the best or only option in this part of rural Alaska, but I can't help but be concerned that while this network may be great for short-term needs, in the long-term it won't be sufficient.
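To put that latency concern in perspective, here's a quick back-of-the-envelope sketch. These are my own illustrative numbers, not anything from the award documents, and they cover only the physics of the path:

```python
# Back-of-the-envelope propagation delay for geostationary satellite backhaul.
# Illustrative numbers only; real latency adds processing and queuing on top.

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in km/s
GEO_ALTITUDE_KM = 35_786        # altitude of a geostationary orbit

# One-way trip: ground -> satellite -> ground (up and back down)
one_way_ms = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000
round_trip_ms = 2 * one_way_ms

print(f"One-way latency:    {one_way_ms:.0f} ms")     # ~239 ms
print(f"Round-trip latency: {round_trip_ms:.0f} ms")  # ~477 ms

# The common guideline for interactive voice (ITU-T G.114) is roughly
# 150 ms one-way, so VoIP and videocalling over GEO backhaul start out
# well over budget before any network congestion enters the picture.
```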

Another thought I had was wondering whether some states are going to get shortchanged. What I mean is that I get the sense NTIA/RUS are focusing first on making sure every state gets its one project and then doling out whatever money's left in the first round to other meritorious projects. What I wonder is what this might mean for somewhere like my home state of Minnesota. It was included in this first batch, receiving a $2.9 million public computing center grant. But does this mean it won't necessarily get an infrastructure grant or loan in the first round? I don't know if they're going to try to split the money evenly between the states, and I actually don't think they necessarily should, but I'd hate to see a state like Minnesota, which I know has a number of worthwhile infrastructure projects, miss out because someone else in the state got money for computers.

The final disappointment for me in this first round is the general lack of imagination when it comes to the adoption and public computing center projects that are getting funds. They seem to be largely of the "buy computers and teach people how to use them" variety. Now, I'd admit that it's hard to criticize this. For one, the executive summaries likely don't capture the full scope of projects so some of these might be more innovative under the surface. And I can also respect NTIA's probable desire to make sure their money's spent on proven projects they know will work vs. something that's new and might not. But I think we'd be remiss if we didn't use the stimulus as an opportunity to fund some out-of-the-box ideas because we may never get the opportunity to experiment and learn lessons on this scale again. Of course, this may also reflect a lack of imagination across the applicants, but either way I hope we see some more innovative ideas in these areas getting funded as we move through the rest of the first and into the second rounds of awards.

All in all, I'm relatively happy. While I'd love to see more full fiber projects get funded, those could still be coming, plus it was reassuring to see how the majority of the infrastructure projects were focused on fiber in one way or another. So I say kudos to NTIA and RUS! The process to get us here has been much maligned, but in the end all that matters is that the right projects get funded.

Moving forward, I'm going to be spending more time diving into the winners over the coming weeks to try and discern what are the common characteristics NTIA and RUS are looking for. I'm also going to continue exploring if we're administering the stimulus right and if there are any ways we can be doing it better.

Net Neutrality: The Internet's Achilles' Heel


With the FCC formulating new net neutrality rules, the debate around network management is coming to a head. At this time I think it's important to acknowledge the significance of these issues as in many ways net neutrality is the Internet's Achilles' heel.

I mean this in at least two senses.

The first is that if network operators were allowed to slow down or speed up traffic anticompetitively, it could destroy the very fabric of the Internet. Without clear rules of the road, net neutrality advocates fear that this Achilles' heel of the Internet won't be sufficiently protected.

But there's an important second way that net neutrality is an Achilles' heel, namely that the open, unregulated nature of the Internet is also one of its greatest weaknesses. The fact that the Internet is totally neutral is a large part of what creates security vulnerabilities and leads to performance degradation as data goes skipping uncontrolled between networks.

That's what so many net neutrality supporters either miss or refuse to acknowledge: the Internet as we know it today isn't all that great. Sure, there's been a ton of innovation in this open marketplace, but there are also still a lot of issues with the Internet's performance. While many if not most of these issues stem from the inadequacies of last mile broadband infrastructure, even infinite bandwidth in access networks wouldn't necessarily fix all of them.

What sparked this line of thinking was the argument net neutrality supporters keep making that the key to the Internet's success has been open innovation at the edge. But that fails to consider the possibility of innovation in the network being a key driver for the next generation of the Internet.

I know I may be thinking about this too simplistically, but why wouldn't we want innovation in the network if it means protecting the Internet's Achilles' heel of performance?

Let's step back from the ledge for a second and realize that we can't be too single-minded about this. Does innovation at the edge need to be protected? Yes! Is the key to this innovation access to open bandwidth? Yes!

But will there also be opportunities to support continued open innovation within the network itself? Yes! To suggest otherwise ignores the basic idea that we should be supporting innovation everywhere.

To put this in more specific terms, many advocates of net neutrality have argued that network operators should not be able to discriminate between traffic. This means prohibiting them from selectively slowing down or speeding up data. So instead of allowing some data to go faster, the position is that no data should have access to a fast lane but rather that networks should only deliver open bandwidth that everyone can access equally.

While I don't disagree with the notion that we need to be encouraging the deployment of more open bandwidth, I don't understand why we'd want to prevent innovation from happening within the network, why we'd rule out the possible benefits of smart networks over stupid networks. Why can't there be a fast lane created for performance-sensitive applications that's open to everyone equally?

Don't get me wrong, the advent of smart networks raises a host of questions about fairness, privacy, competition, and beyond. But I've come to think that this militant attitude towards opposing smart networks is actually the Achilles' heel of the net neutrality movement.

I just don't think it's credible to suggest that we should be preventing innovation from happening anywhere on the Internet. I'm not even sure we can say that innovation at the edge is more important than innovation in the network. The point is that we shouldn't be limiting ourselves.

Now that all being said, do we need rules to guide and potentially control the evolution of this innovation to protect consumers and promote open innovation? Quite possibly yes. We may even need to strengthen these protections today.

But let's not gloss over the fact that the Internet itself has an Achilles' heel that smart networks may be able to protect. Let's not leave that area unprotected by taking options off the table. Let's not let the specter of big brother corporate interests getting too much power prevent our progress forward as a nation.

Instead let's look at the Internet's shortcomings with open eyes and find solutions to protect its Achilles' heels, wherever they may lie.

Too often in broadband policy circles we allow personal biases to cloud our thinking. We let preconceived assumptions get in the way of the truth of what really matters.

One area where this habit is especially egregious is in our perception of so-called municipal or public broadband as being anti-competitive, as being unfair to private enterprise, as government intervening when the market should be left to work alone.

But what this outlook fundamentally misses is the basic reality that whenever a community invests in building public broadband infrastructure it creates numerous opportunities for private enterprise to partner with people and work with government to get these networks built.

This isn't about public networks unfairly competing with private enterprise. Instead we must recognize that public broadband can be a boon for private companies.

It starts with the feasibility studies, design and engineering of the network, which are typically handled by outside private firms.

The project's funds then come from bonds sold to private capital, allowing investors to earn guaranteed interest on their money.

That money's used to purchase a wide variety of equipment from a vibrant marketplace of private manufacturers and suppliers.

The actual build-out, operation, and maintenance of the network can be handled by public or private means, but even if the government handles a lot in-house that still means they're going to be hiring private consultants to help them as they learn the ropes of being broadband providers.

The best example of how public projects can create private opportunities, especially for private service providers, is an open fiber network.

In this model, the public pays for and owns the fiber. Some public or private entity builds and manages the network infrastructure. And then you've enabled a dynamic marketplace of private providers offering services on the network.

For incumbents it means getting a brand new state-of-the-art fiber network without having to invest any capital, and it simultaneously lowers the barrier to entry for new service providers thereby increasing competition and encouraging innovation.

It's also worth noting that public broadband can create tremendous opportunities for local private businesses to gain access to services that previously may have been unavailable. With this capacity they can grow their businesses and compete in the global economy.

What I'm trying to highlight here is that it's misguided to assume that public broadband projects hurt private enterprise. The reality is that no matter how involved any level of government gets in spurring the deployment of new broadband infrastructure, by being a proactive advocate for its community's future, government creates many opportunities for private enterprise to turn a profit.

So let's not assume that public or municipal broadband is inherently flawed just because it involves government playing a hands-on role. No matter how involved government may be, by investing in our future we're creating opportunities in the present for private enterprise.

We Should Count All Bandwidth Equally - Up And Down


One of my biggest pet peeves in broadband debates is the over-emphasis on download speeds and the lack of sufficient attention being paid to upload speeds.

When talking about bandwidth goals, it's download first, upload second. When providers advertise the capacity of their networks, it's the download number in big font, with upload hidden elsewhere. Oftentimes this devolves into people referring to broadband only by its download capacity.

This causes a serious problem for consumer awareness about upstream capacity. If providers with inadequate upstream capacity aren't talking about it then the average consumer may not realize the difference in the value they're receiving for their broadband buck, which calls into question the efficacy of a market where customers aren't making informed decisions.

I'm not the first to point out the need to strengthen consumer protections around truth in broadband advertising, but I want to make a suggestion that should help resolve this specific issue of the marginalization of upstream capacity: I think we should count all bandwidth equally when defining the service level that broadband delivers.

What I mean by this is simple. Rather than allowing providers to tout their downstream speeds in bold and hide their upstream, they should be required to most prominently display the total bandwidth they're providing. So if a provider offers a service that promises 50Mbps down and 5Mbps up then they'd have to say their broadband service is offering 55Mbps of total bandwidth.

Providers could still have the 50Mbps down number in bold and try to hide the 5Mbps up, but at least this way consumers would start to have a better understanding of the true overall value they're receiving.

Doing this should help networks that offer symmetrical or near-symmetrical speeds to differentiate what they have to offer. For example, in Lafayette, where Cox is offering a DOCSIS 3.0-enabled cable broadband service at 50 down and 5 up, or 55Mbps total, LUS would describe its top end product of a 50Mbps symmetrical package as offering 100Mbps of total bandwidth.
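To make the arithmetic concrete, here's a minimal sketch of how this total-bandwidth labeling would work. The function name is my own invention; the tier numbers come straight from the Cox/LUS example above:

```python
# Minimal sketch of "count all bandwidth equally" labeling.
# Tier figures are the Cox and LUS numbers cited in the post.

def total_bandwidth_label(name, down_mbps, up_mbps):
    """Label a broadband tier by its combined down + up capacity."""
    total = down_mbps + up_mbps
    return f"{name}: {total} Mbps total ({down_mbps} down / {up_mbps} up)"

print(total_bandwidth_label("Cox DOCSIS 3.0", 50, 5))    # 55 Mbps total
print(total_bandwidth_label("LUS symmetrical", 50, 50))  # 100 Mbps total
```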

While I'm not one to rush to have government intervene in private markets, I do think there are times when government can play an important role in setting the rules of the road to protect the interests of consumers.

And one specific, small step government can take in the right direction is to mandate that all bandwidth be treated equally and require broadband to be defined by the total capacity it delivers, not just what it delivers downstream.

Bandwidth Hogs Are Real, They're Just Misunderstood


The term "bandwidth hogs" has been much debated in broadband policy circles over the last couple of years. Most recently, Benoit Felten, someone who I respect tremendously as a clear-thinking industry expert, has raised the question of whether or not bandwidth hogs are actually a myth.

I wanted to take a moment today to make the case that bandwidth hogs do exist, and they in fact represent the heart of the evolutionary transitional period that broadband providers face.

It used to be that no one used their broadband connections much. Broadband providers counted on this minimal usage so they could oversubscribe their networks. They also offered unlimited service to gain another competitive edge over dialup Internet access. And this all worked fine so long as no one was consuming that much bandwidth.
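For anyone unfamiliar with how oversubscription works, here's a simplified sketch, with made-up numbers, of why the model held together as long as average usage stayed low:

```python
# Simplified oversubscription (contention) math. All numbers are invented
# purely to illustrate why minimal usage made the model work.

subscribers = 1_000
sold_speed_mbps = 10            # what each customer is sold
shared_capacity_mbps = 1_000    # capacity actually provisioned upstream

ratio = subscribers * sold_speed_mbps / shared_capacity_mbps
print(f"Oversubscription ratio: {ratio:.0f}:1")  # 10:1

# The model holds only while average concurrent demand stays below capacity:
avg_usage_mbps = 0.5  # per-subscriber average in the light-usage era
demand = subscribers * avg_usage_mbps
print(f"Average demand: {demand:.0f} of {shared_capacity_mbps} Mbps")  # 500 of 1000

# If power users push the average toward the sold speed, the shared link
# saturates long before anyone gets the bandwidth they were promised.
```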

But over the last few years the demands on these networks have risen dramatically. This increase stems from the dual factors of power users having more ways to use more bandwidth than ever--like P2P video sharing, Hulu and YouTube, hosted applications, and the use of broadband to enable telecommuting--and more people finding reasons and ways to become power users.

This growth in demand is straining traditional broadband business models, forcing operators to invest in increasing capacity without necessarily adding any new revenue to offset this cost.

This reality is what's leading operators to explore options like metered bandwidth, so they can recapture their investments and drive additional profits from their heaviest users. This is also what's leading operators to use things like traffic shaping and application blocking.

While I disagree with the notion that bandwidth hogs don't exist or aren't a problem, Felten's more specific question of whether it's a specific small group of people who are stressing the network is still worth more detailed consideration.

But regardless, there's no denying that a small group of people are using broadband networks far more than the average person, and that these heaviest users are straining the capacity and affecting the performance of broadband networks.

That brings us to the real question at the heart of this matter: is it the users' fault?

I've long recoiled at the label "bandwidth hogs" as it implies that users are being overly greedy, that they're doing something they shouldn't, when really they're power users: people taking full advantage of what the Internet has to offer today.

Put another way, I think so-called "bandwidth hogs" actually represent the leading edge of the next generation of broadband usage and demand. They're bringing to light how in the Internet of the future, users won't just be consumers of data, they'll be producers, and they'll be using applications that cause their computers to act like servers.

To support this evolution, we need broadband networks that can support these trends in usage, that will encourage rather than dissuade this growing demand. Because it shouldn't be a bad thing when people want to use more of a service.

At the same time, I'm respectful of the fact that we can't expect private operators to continually invest in upgrading their network capacity without some way of recouping those costs. Right now, the only incentive to invest is competition. Without a competitor forcing their hand, or a way to drive additional revenue off of offering additional capacity, they have no reason to invest in expanding and upgrading their networks.

This raises an interesting conundrum that I think can be highlighted with the analogy of a buffet. Whose fault is it that the crab legs keep running out on the broadband buffet? Is it the hungry customers for eating too much, or the buffet owners for not refilling the bin fast enough?

It may be easy to blame the buffet owners in this scenario, but what if their customers' appetites went up tenfold? What are they supposed to do? Supplying this increase in demand without realizing a subsequent increase in revenue would likely put the buffet out of business. And yet if a customer's paying for an all-you-can-eat buffet, then they should be allowed to eat all that they can.

Resolving this tension is a key to unlocking a better tomorrow for American broadband. My hope is that we can move past demonizing power users as bandwidth hogs and network operators as bandwidth misers to instead focus on how we can spur the deployment of networks with enough capacity to supply the demand that's there today and that will continue to grow on into the future.

Because if we don't get this resolved, we're going to end up with barren broadband buffets and hungry bandwidth hogs at a time when we should be living in a land of plenty, with Americans free to consume all the bandwidth they desire.

The foundation of American broadband policy to date has been that the best way to stimulate deployment is to leave it up to the market, to let private, profit-maximizing companies compete to lower prices, improve service, and expand networks.

While this model may be working in some areas--like those with the choice of Verizon FiOS, Comcast DOCSIS 3.0, and a handful of wireless providers--it isn't working in all areas.

In particular, we must come to accept the fact that rural broadband can't and won't be solved through profit-maximizing means. The economics just don't work out.

If I'm a carrier with X dollars to spend, then I have an obligation to my shareholders to invest that money where it has the greatest chance of making the best return, and that's almost always going to mean picking a city over a remote town or an even more rural area.
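To see why the math pushes carriers toward cities, consider a toy payback calculation. Every figure here is invented purely for illustration:

```python
# Toy payback-period comparison for a profit-maximizing carrier.
# All costs, take rates, and prices are invented for illustration only.

def payback_years(cost_per_home, take_rate, monthly_revenue):
    """Years to recoup the build cost per home passed."""
    revenue_per_home_year = take_rate * monthly_revenue * 12
    return cost_per_home / revenue_per_home_year

city  = payback_years(cost_per_home=800,  take_rate=0.40, monthly_revenue=50)
rural = payback_years(cost_per_home=4000, take_rate=0.40, monthly_revenue=50)

print(f"City payback:  {city:.1f} years")   # ~3.3 years
print(f"Rural payback: {rural:.1f} years")  # ~16.7 years

# With the same dollars and the same prices, shareholders will demand
# the three-year market get built first, every time.
```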

This becomes especially acute when we're talking about getting our most rural Americans online as it may be impossible to profitably deliver to them high quality service at a price they can afford.

But that doesn't mean rural broadband can't be profitable. A company like Hiawatha Broadband has proven as much, having profitably delivered fiber to rural towns as a private company for years.

What makes them different, though, is that while profitable, their first commitment is to serving their communities, not maximizing profits for their shareholders. This is in part because 40% of their shareholders are local non-profits.

And that's the common thread running through pretty much all of the rural broadband success stories in the US: they've all been projects that grew out of community interests first.

Whether the network ended up private like Hiawatha, or as a public project like in Bristol, VA or Pulaski, TN, achieving sustainable, high quality, reliable, and affordable success in getting rural America wired requires deployers that put people over profit.

It means supporting projects that make the decision to connect people in less dense and therefore less economically attractive areas because it's the right thing to do, even if it may not be the most profitable thing to do.

And yet to date, all of our government efforts have been implemented with a profit-maximizing mentality in mind.

RUS primarily gives out money to private and largely profit-maximizing entities. In fact, during the first round of the BIP, I heard many complaints from my fiber friends leading public projects that the information RUS was requesting on past revenue and profit wasn't even relevant to their non-profit models.

We've also allowed the massive amount of money the government does spend on telecommunications every year to go largely to propping up outdated, private, and profit-maximizing networks, rather than investing in the build out of new public network infrastructure.

Yet I am more optimistic than ever that we are on the verge of a new age in American broadband policy, where public dollars will be spent on public projects that can truly strengthen our broadband ecosystem for the 21st century.

The stimulus, for example, has attracted dozens of innovative but proven, sustainable yet dynamic models for how rural broadband can be solved.

The key is to remember that rural broadband can't and won't be solved by profit-maximizing means alone.

There is some gold in them thar hills, but more importantly there are a whole lot of people. So let's focus on people-centric projects first, and recognize that rural markets need solutions that prioritize building communities over maximizing profits.

I spent this week enjoying one of the most remarkable conference experiences of my life at Supernova in San Francisco.

Organized by broadband policy luminary Kevin Werbach, Supernova brought together a remarkable group of like-minded people.

There were network types from Orange, BT, and SK Telecom; major Internet companies like Google, Amazon, and Microsoft; a number of startups with a heavy emphasis on social entrepreneurship; VC and finance types; social media experts; industry observers and pundits; top DC technology policy advisors and leaders like Larry Strickling and Alec Ross; and leading community broadband advocates like Joanne Hovis and Esme Vos.

Never before have I been to an event with such a rich tapestry of interests and expertise coming together to talk about the present and future of our world in the Network Age.

Reflecting the diversity of attendees was the dynamism of the wide-ranging agenda. I got to jump from sessions on broadband policy to the business of startups to the future of real-time communications.

Even more impressive is that pretty much every single session captured and held my attention for its entirety. That rarely happens at conferences, where most sessions cover stuff you already know, say things you don't agree with, discuss matters too technically or too simplistically, feature poorly delivered presentations, and/or fail to engage the audience. This all-too-common failing was almost nonexistent at Supernova.

Not surprisingly, the combination of good people participating in an interesting dialog during sessions led to many great conversations in the hallways during breaks. That tends to be the true value of any conference, and Supernova supplied it in spades.

I'm starting to feel a little embarrassed by my excessive gushing about Supernova in this post, but I really did have that good and productive a time attending. Even the food was good; there were some terrific sandwiches for lunch.

Part of what I think helps create the Supernova environment is the fact that it's located in San Francisco, which has to be one of the most dynamic, innovative areas on the planet. The creative entrepreneurial energy in this city crackles, driven in large part by the willingness of its innovators to share openly and honestly with each other, and the desire by its community to do right by the world.

This isn't just about closing deals and making money. There's a real sense that there are higher ideals we should be aspiring towards. And yet Supernova in particular also contained a terrific concentration of people with those ideals who are pursuing them through pragmatic, sustainable, and even profitable endeavors.

And this isn't about coming up with an idea and holding it close to the vest. It's about an open spirit of community and collaboration, one that embraces the idea that by sharing resources we can all achieve more than by trying to do everything on our own.

There's an amazing collection of great people doing great things in this town, and at an event like Supernova it feels like that energy achieves critical mass, setting the stage for even greater things to become possible.

I don't yet know what great things will come from having attended Supernova other than having had the opportunity to meet a lot of great people, but I can't wait to see. And I'm officially adding Supernova to my list of must-attend broadband events.

While I've been critical of NTIA over the course of the stimulus to date, I've had limited opportunities to engage in a direct dialog with them, so my criticisms were really about the perception of what NTIA's doing rather than the reality.

In the last few weeks I've had a couple of opportunities to listen to and speak directly with NTIA Administrator Larry Strickling, and I've come to realize that there have been a number of misunderstandings that I've been perpetuating.

Today I want to clear the air on these misunderstandings, giving credit to NTIA for the good they're doing while also bringing up new issues I've encountered and found troubling.

The Good
- The first major misunderstanding dealt with the role of states in the BTOP review process. The impression I and many others shared was that NTIA's review process had short-circuited, causing them to dump all the unreviewed applications on the doorsteps of states that were utterly unprepared to vet them. In reality, what NTIA intended was for states to simply say which parts of their state were most in need of help, not to determine which specific projects warrant funding. While this makes a lot of sense, unfortunately not only did observers of NTIA like myself not realize it, but I get the sense that many states didn't either. I'm still labeling this an overall good, though, because at least NTIA's intent was positive.

- The second major misunderstanding was the status of BTOP's volunteer review process. In following along with the saga of Mike O'Connor as an aspiring volunteer reviewer, I and others grew increasingly concerned about whether this process was working at all. Mike seemed like the best possible kind of reviewer--highly qualified and yet unbiased--but after being invited to the initial orientation he never heard another word from NTIA. My concerns have been assuaged recently, though, as I've had a chance to talk with a couple of reviewers who did successfully go through this process. While they described it as unsurprisingly bureaucratic, they also felt the review teams had a good mix of expertise that allowed for a thorough review of all applications. So while there's been a lot of doubt about the efficacy of this volunteer review process, it sounds like it has been working.

- Another misunderstanding has been less about NTIA and more about America's disjointed broadband policy planning, namely that the stimulus has been handcuffed by the lack of a national broadband plan and therefore would not be as focused or effective as it could be. In talking with Mr. Strickling, I get the sense that despite the lack of formal guidance, through his conversations with broadband advocates and experts he's come to adopt a clear and comprehensive vision for how to get the most bang for his broadband bucks. While I still have concerns that I express below, I'm feeling much better that despite the lack of a plan, NTIA does have a proper vision for the future of American broadband.

- This is a very small thing, but at the Supernova conference this week in San Francisco I had the chance to ask Mr. Strickling: "You talk a lot about how many good applications you've received and how they won't all be able to be funded, but will you also acknowledge that a number of bad applications have been submitted?" His response was that he assumes there are some but that they don't make it to his desk. It was a good answer that, while not as direct as I would've liked, didn't involve the government two-step of insisting that everything's always coming up roses.

The Bad
- The most troubling thing I heard came in response to my second question to Mr. Strickling. I asked him if he planned on living up to the initial promises that the stimulus would be used in part to fund testbed deployments we could learn best practices from. His response was, and I paraphrase: if I can fund a middle mile network that covers half a state, I'd rather do that than spend the same money on an unproven last mile model that serves a limited area. While I agree with the intent of trying to maximize the impact of BTOP, I find it quite troubling that the idea of supporting innovative testbeds has been thrown out the window. NTIA has some tough decisions to make, as there are lots of good applications to pick winners from, but I hope they don't pass up this potentially once-in-a-lifetime opportunity to plant the seeds of innovative, high quality, sustainable last mile deployments across America.

- The other troubling thing I heard was something I hope was just a slip and not indicative of government's overall thinking. Mr. Strickling shared that he's focused on supporting projects that can point the way towards a sustainable universal broadband future. That's something I support wholeheartedly. But then he said that since there's likely no more money coming from Congress, that what he's trying to set the table up for is private industry to pick up the slack once BTOP money runs out. What was missing from this statement is the possibility of communities stepping up to pick up that slack. I'd thought in this new administration we'd evolved past the notion that the only way to solve our broadband problems is through market-driven private entities, that there's value in allowing communities to determine their own broadband destinies. Again, my hope is that Mr. Strickling just forgot to mention public broadband, but if he didn't then this is even more troubling than the decision to not support the concept of testbeds.

The Uncertain
- The most surprising revelation came when Mr. Strickling shared that NTIA is in the process of setting itself up as the lead Internet policy shop in DC. When asked how this would interact with the FCC, Mr. Strickling suggested that NTIA's actions were supported by FCC Chairman Genachowski, and that the FCC would continue to take the lead on regulatory issues while NTIA would serve as the coordinating entity on Internet policy. I love that there's going to be more focus put specifically on Internet policy, and NTIA is pretty well positioned to do this given its role with ICANN. I also think it makes some sense to have the entity charged with leading Internet policy creation sit outside the FCC, as we need to find ways to start some of these policy conversations over from scratch rather than trying to make the round peg of Internet applications fit within the square holes of existing FCC regulations. But at the same time I wonder how turf wars will be avoided on Internet policy. I'm particularly interested in what role will be defined for NTIA's Internet policy shop in the FCC's national broadband plan. While this could ultimately be a good thing, I'm withholding judgment until we actually see it up and running, as I'm still a little wary that this will lead to more confusion than progress.

Conclusion
With all these thoughts in mind, I'm both more and less optimistic about NTIA. Overall I'd say these revelations have been a net positive, and after a couple of opportunities to speak directly with Mr. Strickling, the sense I'd gotten from others who know him--that he's the right man for this job--has been confirmed. But at the same time the jury's still out, as NTIA hasn't started releasing the tough decisions it's in the process of making about who deserves broadband stimulus bucks. Only time will tell. In the meantime I'm committed to engaging in a more robust dialog with NTIA to make sure these misunderstandings don't take hold and negatively influence the perception of the good work Mr. Strickling and his hard-working team are doing.

American Broadband Policy Planning Bass Ackwards


Many people have pointed out that in an ideal world America's broadband planning would be sequenced differently. For example, it makes sense to have a map of where broadband is and isn't before spending money to stimulate deployment to the un/underserved.

Yet it just dawned on me today that we're not just a little off in our sequencing, we're actually doing things in precisely the opposite order they should be.

The way things are currently aligned we're providing money for deployment from the stimulus, then we're putting out a national broadband plan, then we're going to have a national broadband map.

The problem with this is that none of these steps is able to leverage the others. Stimulus funding is being done in a vacuum. The plan's going to be put out after most of the money has been spent (or at least after the rules for how it should be spent have been determined). And we're not going to know where the holes in our broadband infrastructure are until after we've spent all our money and determined a plan to achieve universal broadband.

What we should've done instead is flip this process around 180 degrees.

Then we would've started with a map to identify where there are holes. Then we'd put in place a plan for how to fill those holes. Then we'd fund those solutions and plans for achieving the goal of universal broadband.

Unfortunately it's probably too late to adjust this sequencing in any significant way. And there is an argument that this was the only way things could be done given that we spent the last eight years without doing anything to push forward the broadband agenda in America.

But even still I think it's important to realize that how we're sequencing things isn't just a little bit off, it's totally bass ackwards. I just hope that this reality doesn't impede us from making the progress we need to be relevant let alone competitive in the digital economy.

There's been a lot of buzz around the benefits and relative viability of wiring all community anchor institutions (schools, libraries, hospitals, etc.) with fiber as the way to get the best bang for the broadband buck. But recent conversations with my fiber-deploying friends have led me to worry that doing this could be a big mistake.

Before going any further, let me say that I 100% support the notion that we need to get all of our community anchors wired with fiber. As I've argued before, these buildings need the capacity and performance of fiber today. Too many are stuck with networks that are inadequate and too expensive.

But with any broadband policy it's dangerous to assume that an idea is universally good as the devil's always in the details.

A cornerstone of the argument in favor of focusing on community fiber is that once these networks are built they can then be used to extend broadband all the way to homes and businesses. In reality, this isn't necessarily the case.

If we were to just go and build fiber networks to meet the needs of community anchor institutions then it's actually more likely that these networks won't be useful for future deployment. The reasons for this are manifold.

There might not be enough capacity designed into the network, whether in dark fiber or spare conduit. There won't necessarily be an easy way to physically access this capacity even if it's there. The network likely won't be engineered with a layout that facilitates universal broadband, as community anchors aren't evenly distributed, especially in rural areas. And even if the network is designed the right way, there needs to be a fair system with clear rules for how deployers can get equal access to it.

We need to realize that we've been making these mistakes for years. For example, government's been subsidizing rural healthcare networks that, though they may serve the needs of hospitals, aren't doing anything to help support universal broadband buildouts.

We also have to understand what's at stake as it goes beyond these networks just not being helpful. In fact, if we focus too narrowly on community fiber we could actually hurt our attempts to achieve the goal of universal broadband.

The reason is simple: if you build a network to serve community anchors, then those institutions won't be available to serve as anchor customers for a community-wide deployment. Without those community anchors as customers, the economics of deployment, especially in rural areas, becomes much harder and may actually make robust, sustainable broadband impossible in some areas.
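Here's a toy revenue model, with invented numbers, to show just how much losing anchor customers can matter to a community-wide build:

```python
# Toy model of rural deployment revenue with and without anchor customers.
# Every figure is invented for illustration, not drawn from any real project.

homes = 500
home_take_rate = 0.35
home_monthly = 50

anchors = 6               # schools, libraries, clinics, etc.
anchor_monthly = 1_500    # anchors buy much bigger circuits than homes

home_revenue = homes * home_take_rate * home_monthly * 12
anchor_revenue = anchors * anchor_monthly * 12

print(f"Annual revenue with anchors:    ${home_revenue + anchor_revenue:,.0f}")  # $213,000
print(f"Annual revenue without anchors: ${home_revenue:,.0f}")                   # $105,000

# If the anchors are already served by a separate network, roughly half
# the revenue base for a community-wide build disappears in this scenario.
```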

So we could spend $10-20 billion to wire the country's community anchors with fiber and end up hurting rather than helping the future of universal broadband.

This all being said, that doesn't mean we shouldn't do this. The point of this post is to shine a light on these issues so that if we do this, we make sure we do it right.

We can't afford to take the easy route on this issue. We need to be taking a holistic approach to how we spend taxpayer dollars so that we can truly maximize the bang we get for each broadband buck.

Because otherwise what should be a terrific way to push our country's broadband agenda forward could turn into a huge mistake that may even do more harm than good.
