A Framework to Reform FCC Competition Policy

Principles to Rationalize Broadband Competition Policy

The broadband market presents a series of regulatory challenges. Broadband competition is vigorous, facilities-based, and intermodal, yet the relevant law remains largely siloed. While past regulatory choices have led to a robust broadband market, laws governing the communications and technology sector need modernization.[1]

Three broad observations should be kept in mind when considering how to conceptualize competition. First, the technical features of the broadband market make onerous regulation unwise. Generally speaking, the Federal Communications Commission (FCC) should strive for regulatory humility, identifying harms only as they occur and imposing appropriate remedies. Second, broadband has flourished because it has been subject to light regulation and intermodal competition. As evidenced by speed increases, usage, and prices, both wireless and wired broadband are competitive markets that in turn compete with each other. Third, any restructuring of the FCC’s approach should be consistent with these market dynamics and should take a more comprehensive view that ensures the continued development of high speed Internet.

Principles to Guide Competition

Two features of broadband set it apart from other industries. First, convergence has fractured the traditional notion of “the” market. High speed Internet is only as useful as the applications that run on top of it; what really makes the Internet useful is the ability to send email, watch videos, or make a voice call. As these applications proliferate and are substituted across various networks, they compete with each other in non-traditional ways. Advancements in voice communications recently put copper service in direct competition with voice-over-Internet protocol (VoIP) services and cellular. The change has been swift; since 2000, nearly 100 million service lines have been lost.[2] As these kinds of substitutions continue with different applications, competitive pressure will be placed on incumbents, and the legal regime should reflect that.

Second, the development of Internet access infrastructure is best handled at the local level. No one is sure what form the Internet will take in the future, and what works for the development of wireless in New York City cannot be similarly applied to wireline in Topeka. Prescriptive rules like network neutrality, made in the name of competition, are likely to distort market preferences, stunting development and leading to stranded investment.

The two features suggest that the best principle to guide decisions is regulatory humility. Regulators are simply not capable of knowing what the future will bring or what kind of competitive entrant will topple the next large player. Privatization spurred the Internet’s initial growth, and this approach was written into “A Framework for Global Electronic Commerce.” The government’s official position then, as it should be now, was that the private sector should lead. Moreover, governments should avoid undue restrictions on electronic commerce, getting involved in the market only when needed to support and enforce a predictable, minimalist legal environment.[3]

Two strands of regulatory humility are present in the Framework. Regulators need to be conscious that they do not lose sight of long-run goals by focusing on immediate solutions. In so doing, they will avoid politically contentious projects and extended regulatory costs that consumers ultimately bear. The current discussion surrounding broadband reclassification exemplifies this concern. In the desire to impose strict network neutrality rules, a chorus of voices has demanded that broadband be placed under Title II of the Communications Act, which would subject these companies to common carrier regulation. Currently, broadband is regulated under Title I, which carries far fewer price controls and requirements than Title II. This light touch approach has resulted in the vibrant broadband ecosystem we now enjoy. Beyond the protracted political fight that would take place, countless freely negotiated business contracts would have to be thrown out and renegotiated under the FCC’s rules if Title II were imposed. Additionally, the Googles and Facebooks of the world would be subject to telephone regulation, which could make their current business models illegal. Title II reclassification is a short-term solution with serious long-term costs.

Another related but distinct form of regulatory humility involves the tendency to underappreciate markets. A widespread “tendency to underestimate the benefits of the market mechanism” undervalues the role of competition and entry.[4] Contrary to what many said at the time, the AOL-Time Warner merger never ended competition, simply because the winds of consumer preference shifted. It is a lesson in competition that the FCC needs to take to heart.

U.S. regulators should adopt a three-step analysis for competition policy and new regulations:

  1. Prove the existence of market abuse or failure by documenting actual consumer harm, following the approach set by the Federal Trade Commission;
  2. Explain that current law or rules are inadequate, and that no alternatives, including market correctives, deregulatory efforts, or public/private partnerships, could solve the market failure; and
  3. Demonstrate how the benefits of regulation will outweigh its implementation costs and other associated regulatory burdens.

What the U.S. Gets Right with Broadband

Today, the United States ranks as the 10th fastest country in the world for wired broadband, up from a low of 15th just two years ago.[5] Even though it surpasses many European countries like the United Kingdom and Denmark, it still trails densely populated, urban countries like Japan and South Korea, which face much lower costs in connecting neighborhoods. Nevertheless, average speeds of fixed broadband in America have ticked up by about 20 percent every year, according to the FCC.[6]

As of January 2013, 99.5 percent of Americans have access to some form of broadband, including both wired and wireless options. When wired broadband is considered by itself, 90 percent of the population has access to a wireline technology with download speeds in excess of 10 Mbps. Just under 18 percent of the population can hook into a superfast fiber network now, up from just 11 percent a couple of years ago.[7] While a lot of attention is paid to the largest firms, there are also nearly 2,000 providers of broadband service across the U.S.[8] It is no surprise that the U.S. adds broadband subscribers at among the highest rates in the world and is faster on average than many similarly industrialized countries such as Canada, New Zealand, Austria, France, and even Australia, which has dumped billions into a massive fiber project.[9]

While merger discussions between Comcast and Time Warner Cable have placed the focus on fiber, DSL still commands over a third of the fixed broadband market.[10] DSL technologies being adopted now will give consumers faster speeds over a variety of networks, while AT&T’s $6 billion network upgrade will bring the entire telephone network onto the Internet. Even though there is widespread interest in the superfast speeds offered by Google Fiber and Verizon FiOS, a more practical and less costly upgrade for many Americans will be faster DSL, which is uniquely situated to serve consumers because it covers just shy of 90 percent of the households in the United States.[11] Google’s announcement of a fiber project in Kansas City sparked a new wave of interest in fast broadband, and CenturyLink and AT&T have also entered the fray with fiber projects.

Compared to Europe, the U.S. tends to have cheaper broadband access on initial tiers below 12 Mbps, which helps draw new subscribers into the market.[12] However, U.S. broadband is more expensive at higher speed tiers, which is consistent with the fact that the average U.S. user consumes twice as much data as her European counterpart.

The speed increases and cheap entry prices in turn are leading to the adoption of Internet video services, which are cannibalizing traditional TV and placing further demands on Internet providers to upgrade networks. One survey found that 23 percent of Netflix subscribers have canceled their premium TV service, which is reflected in subscriber losses.[13] In 2013, cable companies lost 1.7 million video subscribers, while the telecommunications firms Verizon and AT&T picked up 1.4 million.[14] Intermodal competition has been hugely successful as a de facto policy of the FCC. Both Verizon and AT&T have seen strong growth in their broadband services, suggesting that consumers are switching for bundled TV and Internet. The top five cable companies stand to lose around 10 percent of their customers to cord-cutting or carrier-switching in the next 12 months.[15] While TV is seen as a separate market, it clearly has an important competitive effect on broadband.

The wireless space is even more impressive. Last year alone, mobile data consumption grew 81 percent, while speeds doubled.[16] Texts, too, have become more commonplace, jumping over 1,100 percent in the four-year period from December 2005 to 2009. Spectrum auctions led to early adoption, putting the U.S. at the top of total global 4G connections with a 23 percent share. U.S. mobile data traffic is projected to grow 3 times faster than U.S. fixed IP traffic from 2013 to 2018.[17]

The explosion in data use has been driven by the widespread adoption of the smartphone, probably the first technological adoption in history to occur on every continent simultaneously. The introduction of the iPhone didn’t just usher in the current cutthroat handset market; it also upset a balance of power that favored the wireless carriers. Ubiquitous handsets and access to the pipes gave telecommunications firms the upper hand before the smartphone, but in a change of course ushered in by Apple,

Carriers are learning that the right phone—even a pricey one—can win customers and bring in revenue. Now, in the pursuit of an Apple-like contract, every manufacturer is racing to create a phone that consumers will love, instead of one that the carriers approve of.[18]

For AT&T, the introduction of the iPhone was a game changer. From 2007 to 2010, data traffic increased over 8,000 percent, requiring vast upgrades to its network.[19] The kind of investment needed to keep up with smartphone use is costly to be sure, but even as consumers have increased their use of data, prices have dropped. Wireless prices, according to the Bureau of Labor Statistics, are considerably lower than when collection began, meaning prices have declined in real as well as nominal terms.[20]

Taken together, prices have declined, handsets now have more technological features, and the quality of the networks has advanced. By conventional standards, the markets seemingly have become more concentrated, yet far from competition being harmed, consumers are clearly seeing huge improvements.

Because of the swift advances in technology, the smartphone has become the broadband choice for some. Half of cell Internet users ages 18-29 mostly use their cell phone to go online, forgoing computers and tablets. Even the FCC noted in the 16th Wireless Report that there is huge potential in smartphones,

Mobile wireless Internet access service could provide an alternative to wireline service for consumers who are willing to trade speed for mobility, as well as consumers who are relatively indifferent with regard to the attributes, performance, and pricing of mobile and fixed platforms.

As speeds tick up and applications continue to flourish, the differences between fixed and wireless broadband will diminish considerably, leading to even more substitution between them and even more competitive pressure.

Putting Together the Pieces for Regulatory Reform

The past two decades have been a time of immense change for the Internet ecosystem: intermodal rivalry between cable and telecommunications stretching across both TV and broadband, the rise of content owners over the net, new business constraints on wireless carriers, and substitution between fixed and wireless have all placed new competitive pressures on these industries. Definitions of competition need to incorporate these changes by considering technology substitution and quality changes more carefully.

The lighter touch regime afforded to broadband companies under Title I has been part of the reason for the rapid deployment of these networked technologies. In contrast, the old regulatory style of Title II is exactly the wrong option. AT&T’s multiyear process of upgrading its old telephone networks to an Internet-based architecture exemplifies just how problematic Title II regulation is. While changes have been made throughout AT&T’s networks, the heavily regulated last mile still runs on legacy technology, some of which has not been manufactured in decades. Ensuring that consumers continue to see these benefits requires reconsidering the title classification system. That means pursuing technology-neutral regulation that treats the market as converged and regulates after harms occur. Clearly, then, we are talking about moving the FCC to an enforcement role.

Moving in this direction presents its own set of challenges, namely, that the FCC operates in the “public interest, convenience and necessity.” This phrase, which was never meant to have the power that it does, has never been defined in its 70-some years of use and has been the subject of much debate. Moving away from this standard to something more like the Federal Trade Commission’s consumer harm standard would be preferable; however, it would create duplicative regulatory agencies. That raises a bigger question: What exactly should be the FCC’s role in the future?

Some have suggested that the FCC be folded into the FTC.[21] Even though it would be a laborious task, the current competitive environment requires fresh thinking. Spectrum functions could be handed off to the National Telecommunications and Information Administration (NTIA). Public safety concerns could be housed under the Department of Homeland Security, while the Universal Service Fund could be transferred to the Department of Education. Such a move should be an option on the table. At the end of the day, consumer harm is the standard by which we need to gauge business actions, not the public interest. The FTC has a long legal history with this standard and a bureau dedicated to economic analysis to back up its work.

A clear example of the difference in approaches is the issue of network neutrality. While the FCC has spent nearly a decade trying to grab power to regulate, the FTC instead,

…recommends that policy makers proceed with caution in the evolving, dynamic industry of broadband Internet access, which generally is moving toward more – not less – competition. In the absence of significant market failure or demonstrated consumer harm, policy makers should be particularly hesitant to enact new regulation in this area.[22]

Simply put, network neutrality threatens to derail investment all in the name of public interest.

Considering that a merger between the agencies is unlikely for political reasons, the Commission would benefit from adopting a multi-stakeholder approach to broadband problems. In his dissent from the most recent network neutrality rules, Commissioner McDowell explored the general outline of such a program,

In lieu of new rules, which will be tied up in court for years, the FCC could create a new role for itself by partnering with already established, nongovernmental Internet governance groups, engineers, consumer groups, academics, economists, antitrust experts, consumer protection agencies, industry associations, and others to spotlight allegations of anticompetitive conduct in the broadband market, and work together to resolve them. Since it was privatized, Internet governance has always been based on a foundation of bottom-up collaboration and cooperation rather than top-down regulation. This truly ‘light touch’ approach has created a near-perfect track record of resolving Internet management conflicts without government intervention.[23]

By leading in this way, the FCC could more efficiently solve the problems that afflict consumers. It would also provide guidance for future developments and bring the FCC and the FTC together to stand as the U.S. government’s unified voice on technology regulation.


Three broad themes provide an intellectual grounding for sound policy in the coming years. First, broadband is a quickly changing market, which makes onerous regulation unwise. We should be agnostic about how these networks develop because no one is sure what the Internet should look like; thus, the Commission should strive for regulatory humility and regulate after problems occur. Second, the market has flourished due to intense intermodal competition and smart regulatory practices, as evidenced by prices, speeds, and quality improvements. This light touch legal regime needs to continue. Third, any restructuring of the FCC should be consistent with these market dynamics to ensure the continued development of high speed Internet. The FCC has the power to move huge network industries with its regulatory regime. Time has shown that when it keeps its hands off and lets consumers decide, everyone wins. When the Communications Act is updated, it needs to incorporate these lessons.


[1] The House Energy and Commerce Committee decided to focus upon competition for one of its #CommActUpdate whitepapers.

[2] Craig Moffett, The State of the Net: 2012, Advisory Committee to the Congressional Internet Caucus.

[3] The Framework for Global Electronic Commerce, The White House.

[4] Bryan Caplan, The Myth of the Rational Voter.

[5] Akamai’s State of the Internet Q4 2013, Akamai.

[6] A Report on Consumer Wireline Broadband Performance in the U.S., Federal Communications Commission.

[7] Broadband Statistics Report: Access to Broadband Technology by Speed, National Broadband Map; superfast here is 50 Mbps or more.

[8] Broadband Statistics: Number of Providers by Speed Tier, National Broadband Map.

[9] Household Download Index, Ookla.

[10] Internet Access Service: Status as of December 31, 2012, Federal Communications Commission.

[12] Christopher Yoo, U.S. vs. European Broadband Deployment: What Do the Data Say?

[13] Erik Gruenwedel, Survey: Nearly a Quarter of Netflix Subs Cancel Pay-TV Service, Home Media Magazine.

[14] Jon Brodkin, Comcast and Time Warner Cable lost 1.1 million video customers in 2013, Ars Technica.

[15] Brian Fung, ‘A soup of misery’: Over half of people say they’d abandon their cable company, if only they could, Washington Post.

[16] Cisco, Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2013–2018.

[18] Fred Vogelstein, Weapon of Mass Disruption, WIRED.

[19] Marguerite Reardon, Is AT&T considering throttling heavy data users?, CNET.

[20] Databases, Tables & Calculators by Subject, Bureau of Labor Statistics.

[21] Richard Bennett, Jeffrey A. Eisenach, et al., Comments on Communications Act Modernization, Social Science Research Network.

[23] Dissenting Statement of Commissioner Robert M. McDowell, Preserving the Open Internet, GN Docket No. 09-191; Broadband Industry Practices, WC Docket No. 07-52; Report & Order, FCC 10-201.