August 2010 Archives

Why Subsidizing Satellite "Broadband" Is A BAD Idea


In the latest batch of the second round of the broadband stimulus, RUS announced $100 million worth of grants to satellite "broadband" providers. To some degree it isn't news that RUS is subsidizing satellite, as it had previously announced plans to set aside this amount. But this news does provide an opportunity to explore a fundamental question: is subsidizing satellite service a good idea or a bad one?

On the surface one can see how some might consider it a good idea. On a cost-to-connect-a-subscriber basis it can seem like a great deal, as no fiber has to be laid and no towers have to be installed. Just point a dish at the sky and you can be online, which also makes it potentially one of the fastest ways to get people connected.

But let's dig a bit deeper.

First off, there's no getting around the fact that, at least as of today, satellite service can't cost-effectively deliver broadband as defined by the FCC. Upstream it can't even reach 1Mbps, and as an illustrative example, the highest speed package from Hughes Net delivers 5Mbps down and 333Kbps up for $329.99 a month. That's why I call it satellite "broadband": it's not really broadband, which raises the question, "If it's not really broadband, why are we subsidizing it?"
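
To make the comparison concrete, here's a minimal sketch (purely illustrative, not any official FCC tool) that checks an advertised tier against the FCC's baseline broadband definition of 4Mbps down and 1Mbps up, the same benchmark cited later in these posts:

```python
# Rough sketch: check an advertised service tier against the FCC's baseline
# broadband definition of 4 Mbps down / 1 Mbps up.

FCC_MIN_DOWN_MBPS = 4.0
FCC_MIN_UP_MBPS = 1.0

def is_broadband(down_mbps: float, up_mbps: float) -> bool:
    """Return True only if a tier meets both the downstream and upstream minimums."""
    return down_mbps >= FCC_MIN_DOWN_MBPS and up_mbps >= FCC_MIN_UP_MBPS

# The top satellite tier cited above: 5 Mbps down, 333 Kbps (0.333 Mbps) up at $329.99/month.
print(is_broadband(5.0, 0.333))   # False -- fails on the upstream requirement
print(is_broadband(4.0, 1.0))     # True  -- the FCC baseline itself
```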

Secondly, satellite operators claim that next generation satellites will deliver significantly greater performance, but I have two arguments against this. The first is that it's not like these satellites are ready to shoot into orbit yet, so we don't really know when this service will be available. The second is that we don't really know how they're going to perform.

One big question related to this is how much of that additional capacity is going to be used to accommodate more subscribers rather than higher speeds per subscriber. My understanding of current satellites is that part of the reason their service sucks is that they're oversubscribed and some are out of capacity. What guarantees do we have that the same thing won't happen with the next generation? Which brings us back to the question: "If it's not broadband today, and might not be broadband tomorrow, why are we subsidizing it?"

A third, more minor quibble is that satellite doesn't deliver truly universal service: if your property doesn't have a clear line of sight to the southern sky in which to point your dish, you're out of luck. That limits satellite's usefulness in America's more mountainous rural areas.

A fourth major area to consider regarding the merits of government subsidizing satellite "broadband" is the fact that these broadband stimulus dollars were only supposed to go to areas that don't currently have broadband available, and they were supposed to be used to subsidize the cost of deployment.

Well, on the first part of that equation, the subsidies given to satellite companies are being used to offset the cost of installing satellite dishes and sometimes to lower the cost of ongoing service. If that's the case, doesn't that mean satellite service is already available in all of the areas being subsidized?

The only way this wouldn't be true is if the subsidies were going to pay for launching new satellites covering new areas, but if that's the case then it's rather suspect, both because satellites cost a lot more than $100 million and because the next wave won't be ready to launch for a while. So this would seem like the exact opposite of the technology-neutral approach they claim to be taking to subsidizing broadband.

Back to the more likely case, that these subsidies are being used to offset the cost of satellite installation and subscription fees: if service is already available in the areas being subsidized, then aren't we just subsidizing a broken business model? If service is already available but people aren't signing up because it's too expensive and too slow, then why are we going out of our way to prop it up with public dollars? Shouldn't we let the market decide that satellite providers aren't delivering a compelling enough value proposition?

Making this investment even more questionable is the context of the broadband stimulus itself, which was supposed to be as much about creating jobs and building infrastructure as about getting people connected. How do we justify putting this much money into projects that won't build up any infrastructure and that will create the minimum number of jobs?

What scares me the most about all of this is what happens if the federal government is able to get its act together enough to make the additional significant investments that will be needed to get rural America connected. Will the areas that receive subsidies for satellite "broadband" be ruled ineligible for subsidies for real broadband because they already received government money? I wish I could say that there's no way something this asinine could happen, but it's likely the satellite companies would cry unfair competition if new subsidies did become available, and it's not unlikely that government will continue to think in terms of simply getting people connected rather than how to build up our nation's communications infrastructure.

Now with all this being said, I'm actually not against subsidizing satellite service altogether. I fully recognize the value of satellite service as a way to get the most remote areas connected and to get online quickly wherever it's needed. So let's focus our subsidies on its strengths.

Rather than frame satellite subsidies as a way of delivering broadband to rural America, let's admit that satellite is only a stopgap solution until a better, terrestrial connection can be made. Let's figure out some way to make satellite affordably available to everyone who doesn't have access to terrestrial broadband today, and simultaneously work on a plan to make sure that they all have that terrestrial option ASAP.

Let's also make sure anyone involved with emergency management in rural areas has satellite available to get access wherever it's needed, no matter how remote, so they don't have to be isolated when dealing with crises.

Let's stop subsidizing satellite as if it were broadband or infrastructure and start subsidizing it for what it is: a quick, cheap, and preferably temporary way to get people connected to the Internet.

On a larger note, I think these mistakes in how we're subsidizing satellite really stem from our government's continued embrace of the notion that it should be technology neutral in its broadband policymaking. If we were to start acknowledging the strengths and limitations of the various broadband technologies, I think we'd start to realize that they can all play a role in our broadband ecosystem because they all bring their own benefits. But if we try to treat them all the same, we can end up in situations like this one with how we're subsidizing satellite.

A hot story in the world of communications policy this week is that a federal court shot down the FCC's spectrum auction rules as they relate to when and how spectrum won in that auction can be resold.

While I'm not a wireless guy, this story did bring back up an issue that I can't quite get my head around.

For starters, this latest round of spectrum auctions raised $30 billion.

At the same time, the national broadband plan called for investing about $15 billion in a public safety wireless network while also claiming that there was no new money coming to subsidize broadband buildout.

Now the FCC has to try to convince Congress to pony up that $15 billion while at the same time attempting to fix America's broadband crisis with no new money.

Does anyone else see the disconnect here?

For some reason that $30 billion raised from selling public airwaves goes back into the general fund of the US government, never to be seen again.

Why is it we're not reinvesting those proceeds to build up and out our communications infrastructure? Regardless of what rules may be in place today, why can't we change them so that when the FCC sells off public assets it can use those funds to pay for the critical broadband projects that Congress seems unwilling to support financially? Does that just make too much sense or what?

Think about if we had been able to keep that money focused on communications.

First off, we'd already have that $15 billion needed for the public safety wireless network.

Secondly, we'd have additional funds to establish an ongoing mechanism to subsidize middle mile buildout in areas that need it.

Thirdly, we'd likely also be able to throw a couple of bucks into some new way of subsidizing last mile buildout to unserved and underserved areas. (A rough tally of these figures follows below.)
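
As a purely back-of-envelope illustration using only the two figures cited above (the middle mile and last mile amounts are left unspecified, so they aren't filled in here):

```python
# Back-of-envelope only: just the two figures cited in these posts.
auction_proceeds_billion = 30   # raised by the latest spectrum auctions
public_safety_billion = 15      # public safety wireless network cost per the national broadband plan

remaining_billion = auction_proceeds_billion - public_safety_billion
print(f"Left over for middle and last mile subsidies: ${remaining_billion} billion")
# -> Left over for middle and last mile subsidies: $15 billion
```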

Instead we're in a position where there's no guarantee we'll get that $15 billion for public safety, and we're getting bogged down in the fight around how to reform the Universal Service Fund, which means we're likely going to have little to no money available to subsidize last mile network deployment once the broadband stimulus runs out.

I'm sure there must be some regulatory or legal reason why the system works this way. But if the FCC is to effect any real change it needs to stop accepting the status quo and start acknowledging that there are better ways to do business.

And perhaps a good place to start is to make the case that whenever it auctions off public assets, it needs to retain some or all of those proceeds so it can fulfill its mission of making sure America has a world-class communications infrastructure.

RUS Doubles Down On DSL Among Other Mistakes


A few weeks ago RUS released the first batch of winners for the second round of the broadband stimulus program.

While I was critical of how they handled the selection process and of how they chose primarily private, closed networks in the first round, I had to give RUS credit for focusing most of its support on networks that will put rural Americans on a level playing field with the rest of the world.

Yet their focus on full fiber networks was met with a bit of an outcry from many in the broadband industry, who decried RUS's investment in technology they claim is too expensive and not needed.

Unfortunately, it appears as though those misguided, self-interested criticisms have gotten through to RUS, as evidenced by the fact that they decided to focus much more heavily on subsidizing "cheaper" and less capable networks, in particular DSL, in the second round.

In total, RUS decided to fund about 50 projects in this first batch of second round winners that will use these federal dollars to build out DSL networks.

The problem with this is that DSL is an outdated technology, and I'm not alone in saying that. If you look at market trends, cable networks are eating DSL's lunch in basically every competitive broadband market. If you look at the world's broadband leaders, most have moved past DSL entirely to focus solely on fiber and wireless buildout.

So what this means is RUS is spending millions of our dollars to saddle rural communities with networks that are not only outdated but may become functionally obsolete in the next few years as more high bandwidth apps and services become available.

I use the word "saddle" quite specifically, as the reality is that whatever networks we subsidize in rural America today are all they're going to have for the foreseeable future.

Unfortunately, RUS's ineptitude doesn't end there. For example, on the technology front about a dozen projects in this first batch of the second round were for WiMAX. Well you can't throw a rock in the tech press today without hitting a story about how the industry's moving past WiMAX to pick LTE as its next generation wireless standard of choice.

While WiMAX may not necessarily be technically inferior to LTE, the reality is that there's going to be a much stronger and more diverse ecosystem of devices that can leverage LTE networks than WiMAX ones, so yet again RUS has ignored clear trends in the evolution of broadband and stuck rural America with inadequate networks.

Another glaring example of how RUS is subsidizing inadequate networks is that some of these award winners are only promising to deliver 3Mbps of total bandwidth. Notice how this doesn't even meet the FCC's baseline standard of 4Mbps down and 1Mbps up. Now, I don't blame RUS entirely for this as they had to put out their rules prior to the release of the FCC's national broadband plan, but is our government really so inept that these two agencies couldn't have gotten together and made sure they were on the same page to guarantee that RUS was only subsidizing networks that met the FCC's minimum standards?

In talking about how RUS has handled this first batch of the second round of the broadband stimulus, I'd be remiss if I didn't also call them out for how they released the information about this round's applicants.

For some reason, RUS chose not to include this list in the Applications Database at www.broadbandusa.gov. Instead they released it as a nearly 850-page PDF. Not only is this a bear to search, but it actually contains less information than was originally available, as no links to any executive summaries were included. Of course, most of these summaries were fluff, but they still afforded the public an opportunity to get to know who's applying for what.

Making matters even worse, when they released the list of winners, all we got about each winning project was a single paragraph. Many of these paragraphs say nothing about what technology the project plans to use or any details about who the applicant is.

Where's the transparency? Where's the effort to allow the public to engage in this process? Is it really that hard to upload these into the database?

While the more important issues are related to RUS sinking taxpayer dollars into outdated technologies, these last points are important too as they raise additional questions about RUS's stewardship over the realm of rural broadband.

Trust me, I'd like nothing more than for RUS to fulfill its promise and become a champion for rural broadband and an effective agent for change. But despite my belief in some of the individuals in that organization, as a whole I can't help but start wondering if RUS is up for the monumental task of finding ways to get all Americans wired with world-class broadband.

On Wednesday I wrote about the Google-Verizon net neutrality kerfuffle and the outcry over its exemption of wireless broadband from most open Internet regulations.

Today I want to address the other major sticking point among net neutrality supporters, namely the idea that network operators be allowed to offer prioritized levels of service on their networks.

If all you do is listen to the true believers of net neutrality then you'd think that these so-called "managed services" are a sign of the apocalypse, that the very existence of the Internet as we know it is threatened, and that we must save the open Internet from the self-centered anti-competitive instincts of major corporations.

Well I'm here today to say that while some of these fears are legitimate they're also clouding some fundamental truths by painting managed services as the devil.

The most important thing to realize in this discussion is that broadband networks are more than just gateways to the Internet. These networks already deliver private managed traffic in the form of TV and telephone services.

The question now is what other kinds of services could be enabled by the availability of prioritized access? Let's take a look at a few examples.

- A remote backup service where the backup server is connected directly to the broadband network. With prioritized access, customers could ramp up their in-network speeds to 100Mbps and beyond on a fiber network to complete the initial backup in much less time than it'd take to send the data over the open Internet (see the rough back-of-envelope calculation after this list). And in fact, in this model the only time that data would even touch the Internet is if you were accessing it while on the road outside of your community.

- A virtualized desktop service for students who take home a thin client laptop and tap into the unused computing power of desktops at schools after hours. With prioritized access this remote access could be essentially latency free, delivering an experience as if you were sitting right in front of the computer. This would allow more students to gain access to online resources from home by leveraging the underutilized capacity of the computers already in our schools. Again, another example of an application that may never touch the Internet if the student's only using desktop applications.

- An in-home medical care service for the elderly, who could visit with their local nurses and doctors without the hassle of having to physically be in the doctor's office. With prioritized access this could be a smooth, stutter-free experience video-wise, and it could also mean establishing a more secure connection than is possible over the open Internet so as to better protect patients' sensitive health information. A service like this would likely lower healthcare costs and improve care by allowing patients to be seen in a more comfortable environment, while also lowering the stress associated with having to leave home to go to a healthcare facility. Yet again, this service could happen entirely within a broadband provider's network and never touch the Internet.
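
To put rough numbers on the backup example above, here's a back-of-envelope sketch. The 250GB data size and the 5Mbps best-effort Internet speed are purely illustrative assumptions; only the 100Mbps in-network figure comes from the scenario itself:

```python
# Purely illustrative: how long an initial remote backup might take at different speeds.
# The 250 GB size and 5 Mbps open-Internet speed are assumptions; 100 Mbps is the
# prioritized in-network speed from the scenario above.

def transfer_hours(data_gb: float, speed_mbps: float) -> float:
    """Hours to move data_gb (decimal gigabytes) at a sustained speed_mbps."""
    bits = data_gb * 8 * 1000**3
    return bits / (speed_mbps * 1_000_000) / 3600

for label, mbps in [("5 Mbps best-effort Internet path", 5),
                    ("100 Mbps prioritized in-network path", 100)]:
    print(f"{label}: ~{transfer_hours(250, mbps):.0f} hours")

# 5 Mbps best-effort Internet path: ~111 hours
# 100 Mbps prioritized in-network path: ~6 hours
```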

The problem with much of the criticism of the Google-Verizon net neutrality compromise is that people are denouncing these types of managed services as a universal evil. What they seem to fail to realize is that there are many examples whereby applications and services may be better delivered in-network via prioritized access.

Because of this, what should matter more is not demonizing the tool but instead focusing on how it can and should be used. We shouldn't be advocating against the use of a tool with so much potential for good just because we fear its potential for evil.

Instead what we should be discussing is what's needed to encourage innovation in this next generation of networked communications technologies while at the same time protecting the public's interests.

There's no denying that these managed services have the potential to create an unequal fast lane for the Internet that's only accessible by the biggest corporations willing to spend the most money. We also do need to be worried about broadband providers focusing all of their resources on building out capacity for these managed services to the detriment of investing in greater capacity for the open Internet.

Yet I'm just as worried, if not more so, about the possibility that we may forgo the many opportunities managed services present simply because we're worried about what might happen. What I think will be more productive is finding the right regulatory framework to encourage investment in this capacity alongside investment in open Internet capacity, and figuring out how this new prioritized access can be made available to all innovators.

I think it's particularly important that we cut through the fear, uncertainty, and doubt surrounding managed services because there are many important and difficult questions that need to be addressed to make these managed services a reality.

For example, how do we deal with the fact that bigger players should logically get better deals when it comes to this priority access since they'll presumably be buying in larger quantities? This is a fundamental truth of capitalism, yet some might paint it as being unfair to the little guy. How we resolve this won't be easy.

Another issue is the idea of exclusive arrangements between broadband providers and app or service providers. Can a broadband provider pick a certain video security provider to partner with? If so, can they sell exclusive access to their prioritized service to them and leave other providers out?

I have reservations about this, as it's arguable that there isn't sufficient competition to serve as a check and balance against these kinds of relationships. If one video security provider became dominant and established deals with the major broadband operators, it might prevent the next great video security company from establishing itself and growing.

Yet at the same time, how can we justify preventing private broadband operators from establishing business relationships with partners that can deliver value-added services to their customers?

These are the questions that really matter, but they're also the questions that can easily get glossed over if the debate continues to paint managed services as the devil.

For me, I see managed services not as the devil but as having the potential to establish a new paradigm for how the next generation of the Internet can work, where customers can get access to the bandwidth they need as they need it, where real-time applications can have low-latency access, and where sensitive services can have greater security.

In an ideal world, I see managed services as the natural evolution of the Internet, as potentially delivering a level of access that's currently unavailable over the open Internet but that is likely needed to enable the next generation of networked applications, a new class of apps and services that may not even require the Internet to start impacting people's lives in profound ways.

So as we continue to debate the many nuanced issues surrounding net neutrality, let's not fall prey to attempts to demonize the very idea of managed services. Instead let's acknowledge their potential to do good and focus on what's needed from a regulatory perspective to make sure that that good is ultimately realized by broadband providers and the public alike.

In all the lamenting about Google and Verizon's attempt at finding an industry-based solution to net neutrality, a very important point has been made whose significance has not been acknowledged: wireless broadband does not equal wireline broadband.

In the Google-Verizon proposal, they specifically state that wireless broadband should not be held to the same net neutrality standards as wireline. While many have cited that as a loophole in the plan, I see it as exposing the falsehood that wireless can be a substitute for wireline access to the Internet.

To start this analysis it's important to acknowledge that there are technical differences between wireless and wireline networks in terms of their capacity and performance. Because of this it makes sense that they'll need to be treated differently from a regulatory perspective.

Google and Verizon are not alone in this thinking, as the FCC has hinted that it also believes the rules for net neutrality need to be different for wireless vs. wireline.

The results of this to me are simple: if we want every American to have equal access to the open Internet, then every American needs a robust wireline connection.

Now, to many people in the know this is not much of a revelation. Certainly anyone who advocates for full fiber networks believes that wireless is not a substitute for wireline.

But when it comes to making policy, this is a profound divergence from the status quo.

I'd be willing to bet that most policymakers in DC think that wireless is good enough. Our national broadband plan goes so far as to basically say that rural America should be fine with wireless alone. And RUS, the agency tasked with addressing rural broadband, is investing in wireless networks almost as much as wireline.

One of the biggest debates in DC these past few years is that of wireless vs. wireline, with wireless advocates claiming that we don't need wires any more and that all our connectivity needs can be solved through the airwaves.

Well, putting aside issues of performance and capacity (where wireline dramatically outclasses wireless), how can anyone claim with a straight face that wireless can replace wireline when the current thinking at the policy level is that wireless can't be counted on for the same level of openness as wireline?

Given that wireless networks are operated differently from wireline, how can we preserve the open Internet without a wireline connection to every home?

For me, this issue throws into stark contrast the fact that wireless networks, while invaluable, can only be thought of as complementary to rather than a substitute for wireline networks.

What this gets to the heart of is the wrongheadedness of our policymakers' attitude of preserving a technology neutral perspective at the expense of making good policy. Our policymakers have taken the position that all bandwidth is created equal and therefore all that matters is which broadband network can get people connected in the most cost effective manner.

These issues surrounding how to preserve the open Internet over mobile networks show that this is not the right approach. Instead, we need to make sure we get people connected to the Internet the right way, and if we think the right way is through an open pipe, that means we need to get a wireline connection to every American.

I hope we're all able to take this moment in time to highlight this issue to our local, state, and federal policymakers. We can't afford to let this stupid debate over wireless vs. wireline continue when it's so obviously not a valid argument.

Wireless obviously has its place as a key cornerstone of our broadband ecosystem. But the time has come to acknowledge that if we want an open Internet, it can't be the only answer. To preserve the open Internet, every home needs a wire.

Why Can't Net Neutrality Debates Be Public?


With net neutrality officially back on the FCC's front burner, there's been an incredible uproar over the closed-door nature of the FCC's negotiations with the major players in broadband policy in their attempts to hammer out a compromise.

While I actually agree with some of what's come out about those and related discussions between Verizon and Google (namely the idea that it's OK to prioritize different types of traffic differently, but it's not OK to prioritize one provider of the same type of data over another), I can't help but wonder: why can't these net neutrality debates be public?

What's so remarkable about this whole issue is that throughout its five-year history I'm not sure there's ever been a real, comprehensive public debate about it.

Sure there've been public forums at conferences large and small, but those panels are either skewed or at the very least incomplete, with only some players represented. Even the FCC's own public discussions have only been half steps in the right direction.

Every time there's been an attempt to dig into this issue in a public forum it's either been artificially constrained by having limited time to talk or by rules that prevent a true debate from happening. Very rarely have I seen a net neutrality debate go beyond "I'm right and you're wrong."

That doesn't mean it hasn't or can't happen, though. In fact, I think nothing would be more beneficial to the FCC's, Congress's, and the public's understanding of the many nuances of the net neutrality debate than if we had a true debate with all the major players in attendance.

Block off a whole day for this. Make sure it's open to the public and being webcast. Find ways for the public to comment, ask questions, and be involved. Give all the players a chance to say their piece, as well as the opportunity to refute what others have said. Make sure to include third parties whose opinions lie in the middle and who have the technical wherewithal to call bologna when one side or the other tries to obfuscate or twist the truth. Pull in experts from other countries to give their feedback.

What this will do is force all parties involved to make their case and justify it in front of the world and their opposition. Part of what's holding the net neutrality debate back is that both sides are able to get away with making incomplete arguments that don't respect the truth that exists on both sides of this issue.

Now, I understand that putting this debate together and facilitating a real dialog may be easier said than done. Keeping the tenor constructive may be a challenge as these things can quickly devolve into shouting matches. Also, some of the major players may be hesitant to speak what's really on their minds if they have to do so under the bright lights of public scrutiny.

But to go so far as to actually be crafting new net neutrality regulations without having that direct public debate seems more than a little disingenuous to me.

And even worse, keeping the real debates private may have a detrimental impact on America as the only way that can work is if we presume the FCC has the independence and expertise to serve as a check and balance against corporations fighting for their own interests. I wish I could say that I have confidence in the FCC to play this role, but their track record to date has not shown this to be the case.

On this most fundamental of issues to determining our country's broadband future, we can't afford to leave it up to corporations, which, rightly or wrongly, are focused primarily on furthering their own interests, and to regulators, who seem to lack the spine to truly fight for the public's interests.

Of course, it can be argued that reaching a compromise of any sort that brings even a modicum of resolution to these issues is in the public's best interest. But it's hard to say if this is actually the case if the public doesn't really know what's being debated and how.

So if the FCC wants to have its compromise be taken seriously and to have any shot of garnering support from the public at large, it must step up, pull back the curtains, and finally facilitate what we've needed for a long time: a comprehensive public debate about net neutrality.

About this Archive

This page is an archive of entries from August 2010 listed from newest to oldest.
