Anatomy of Overreach: The Obama Regulatory Agenda

Executive Summary

The U.S. has witnessed an explosion of federal regulatory activity. This expansion of the size and role of government is at odds with the U.S. tradition of reliance on the private sector, and is surprising given a regulatory apparatus designed to stop regulations whose costs outweigh their benefits.

The regulatory onslaught is the result of a government-wide, systematic effort that begins by setting performance standards for the private sector that simply cannot be met and/or by mischaracterizing the functioning of markets. Having “established” that the private sector is failing, the next step is to turn to increased government intervention.

This short paper documents the progressive playbook for regulatory expansion and shows four specific examples in action from four different federal agencies. Of course, there are many more examples of regulatory overreach throughout the federal government. These are merely straightforward and direct instances of regulatory creep.


The regulatory explosion during the presidency of Barack Obama is daunting. By the day of his final State of the Union address, the administration had finalized more than ten new rules per day, on average, for a total of 25,155. In the process, it had generated (for those rules that were quantified) $727 billion in new burden costs and added more than 460 million new hours of paperwork. Indeed, government-wide paperwork hours are now at an all-time high.

While many associate the expanding regulatory reach solely with one area of interest or expertise, the reality is that it has occurred across all agencies and policy issue areas. This is quite striking. One might imagine that, over the last several years, a few areas merited a stronger government presence. By the same token, one would expect the facts to have argued for scaling back the government’s reach in other areas. How is it that the private sector managed to universally “fail” during the past seven years?

The short answer is that it never had a chance. The progressive approach to policy analysis begins by “proving” that the private sector has failed to meet the needs of American families, thereby leaving government as the only option and the enormous regulatory expansion as the inevitable result.

In the remainder, I document the regulatory overreach, characterize the progressive playbook for policy analysis and illustrate how it has led to dramatic overreach by the Department of Labor, the Federal Communications Commission, the Environmental Protection Agency, and the Treasury Department.

The Progressive Approach to Policy Analysis

The Obama Administration’s regulatory overreach is stunning in both its depth (see above) and its breadth. To get a flavor of the latter, consider this list of regulatory actions across the agencies:

• Federal Communications Commission – Special Access Regulation and Title II Regulation of the Internet
• State Department/White House – Rejecting the Keystone XL pipeline
• Energy Department – Energy efficiency standards
• Environmental Protection Agency – the Clean Power Plan
• Health and Human Services – the Medicare Part D rule
• Treasury Department – the designation of MetLife as a Systemically Important Financial Institution (SIFI)
• Education Department – the gainful employment rule
• Department of State/Defense – The Iran nuclear deal
• Health and Human Services – the Affordable Care Act rules
• Labor Department – the overtime rule
• Homeland Security Department – deferred action for illegal immigrants
• White House – repeated violation of release dates for the Unified Agenda

One could identify even more. I turn now to a more detailed description of the patterns underlying the regulatory explosion.

Labor Market – Overtime Rules

President Obama in March 2015 directed the Department of Labor (DOL) to expand the number of salaried workers covered by federal overtime standards. Under the Fair Labor Standards Act (FLSA), employees who work more than 40 hours per week must be paid 1.5 times their usual pay rate for each overtime hour.

However, there is a so-called “white-collar exemption” for executive, administrative, or professional employees who are ineligible to receive overtime pay. The Secretary of Labor can change who is entitled to overtime pay by modifying the requirements for the white-collar exemption.
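The FLSA formula itself is simple arithmetic: time-and-a-half for each hour beyond 40 in a week. A minimal sketch of the calculation (the hours and wage in the example are hypothetical, chosen only for illustration):

```python
def weekly_pay(hours_worked: float, hourly_rate: float,
               threshold: float = 40.0, multiplier: float = 1.5) -> float:
    """Gross weekly pay under the FLSA overtime rule: hours beyond
    the 40-hour threshold are paid at 1.5 times the base rate."""
    regular = min(hours_worked, threshold) * hourly_rate
    overtime = max(hours_worked - threshold, 0.0) * hourly_rate * multiplier
    return regular + overtime

# Hypothetical example: 45 hours at $20/hour
# 40 * 20 + 5 * 20 * 1.5 = 800 + 150
print(weekly_pay(45, 20))  # 950.0
```

Expanding the white-collar exemption changes who is subject to this formula, not the formula itself.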

But why should the Department of Labor be dictating pay rates to begin with? What problem would that solve? Progressives frequently assert that growth in labor productivity is outpacing growth in compensation. The primary piece of evidence is shown in Figure 1. It is intended to suggest that the economy is not properly compensating workers, and is one of the key reasons many believe it necessary to raise the minimum wage, expand overtime pay coverage, mandate paid family leave, and increase union membership.

AAF examined this assertion.

Unfortunately, the figure is based on faulty statistical analyses. In particular, these claims rest on an analysis that (a) compares labor productivity for the entire economy with compensation for private sector production and nonsupervisory workers only – not an apples-to-apples comparison – and (b) employs two different price indexes (one for productivity and another for compensation) to adjust for inflation. With an apples-to-apples comparison and a consistent inflation adjustment, real compensation has grown closely with labor productivity over the past fifty years. See Figure 2.
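The deflator problem can be illustrated with a toy calculation. The sketch below uses invented growth factors and price indexes purely to show the mechanism: deflating two identical nominal series with two different price indexes manufactures a “gap” that vanishes under a consistent adjustment.

```python
# All numbers are hypothetical, chosen only to illustrate the mechanism.
nominal_productivity_growth = 1.60   # cumulative nominal growth factor
nominal_compensation_growth = 1.60   # identical nominal growth

output_deflator = 1.20    # price index applied to productivity
consumer_deflator = 1.35  # different price index applied to compensation

real_productivity = nominal_productivity_growth / output_deflator
real_comp_mixed = nominal_compensation_growth / consumer_deflator
real_comp_consistent = nominal_compensation_growth / output_deflator

# Mismatched deflators create an apparent gap despite identical nominal growth;
# a consistent deflator makes it disappear.
print(real_productivity - real_comp_mixed)       # positive "gap"
print(real_productivity - real_comp_consistent)  # 0.0
```

The actual magnitudes in the productivity-compensation debate differ, of course; the point is only that the choice of deflators can generate a divergence on its own.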

The progressive playbook is simple. “Prove” that private labor markets do not work (“the economy is rigged”) and then turn to intrusive government interference to “solve” the problem.

Federal Communications Commission – Special Access and Title II Internet

Internet-based businesses have been a dynamic source of innovation, employment, and growth. In recent years, the Federal Communications Commission (FCC) chose to regulate both the business and the residential Internet using an antiquated apparatus designed for a monopoly, copper-wire telephone system. On the business side, special access services, as they are known in telecommunications, encompass a range of data and voice services provided by incumbent telephone companies to businesses and competitors. The FCC is tightening the antiquated regulatory noose on this sector, thereby ignoring and endangering the potential advantages of innovative approaches from cable and Ethernet providers. Meanwhile, the controversial, so-called Title II regulation of the residential Internet is unprecedented. Both are dangerous steps in the wrong direction. How did this happen?

The Internet ecosystem has been a source of innovation, employment, and growth over the past decade. From 2004 to 2009, nearly 15 percent of all economic growth came from innovations in the Internet space. Estimates place this sector at about 5 percent of Gross Domestic Product (GDP), making it larger than agriculture, transportation, and housing rental combined. In part, this was a tribute to the deliberate “light-touch” regulatory approach followed by the Federal Communications Commission beginning in the 1970s.

Now the FCC has put special access regulation at the top of its agenda. Over the past decade, special access services from incumbent telephone companies have faced competition from both cable companies and other network infrastructure players to provide high-capacity data and voice lines. These new entrants have leapfrogged the older technology in the local telephone networks by utilizing fiber. In this way, the special access market is taking on the dynamism characteristic of the consumer Internet market.

In the residential market, however, President Obama sent shockwaves through the sector by supporting so-called “network neutrality” in the form of “Title II regulation,” and the FCC soon followed suit. The move was both startling and disappointing, as the Title II regulatory regime was developed to regulate a monopoly telephone network and is ill-suited for the dynamic Internet setting, though well-suited to ensure that FCC power goes unchecked. It likely will mean higher costs, which will be passed on to families. Economically irrational regulation means less innovation and investment, to the detriment of consumers everywhere. And poor regulation threatens the Internet economy that supports about 2.6 million jobs.

How did the administration accomplish a U-turn on a successful Internet regulatory regime? As in other areas, the first step was to assert that the current system was failing. A useful compendium of the straw men set up by the progressives is contained in President Obama’s statement.

For example, he urges regulators to “keep the internet free and open.” Here one cannot help but pick up on two different uses of “free.” While everyone agrees that the Internet should be a space of permissionless innovation, innovation does come at a cost; the progressive framing conflates the two, suggesting that freedom on the Internet should come at no cost to producers, and presenting anything short of that as evidence the current system is clearly failing. Next is the assertion that there should be “no toll roads on the information highway.” This is related, but sets the standard that there cannot be specific charges for specific services – just as a toll covers the use of a particular road. Charging differentially for superior service is standard business practice, but is disqualified on the progressive Internet.

The next step is to assert that the real problem is not with individuals, but rather with companies; i.e. the assertion that one “can’t let any company pay for priority.” Of course, companies would only do this on behalf of their customers. And, finally, he argues that any combination of these items will impair the ability of “consumers, not cable companies, to get to decide which sites to use.”

This simple notion that all data should be treated equally and free when it passes over the Internet is superficially appealing but actually difficult and dangerous. Networks must be constantly managed to ensure service. At any point in time, the cumulative burden of Netflix streams, LinkedIn invitations, Facebook posts, web surfing, Tweets, email, movie downloads, phone calls, and other content may threaten the ability of an Internet Service Provider to handle the volume. This is why Internet engineers, beginning in the 1980s, built into the very core of the Internet the kind of differentiated treatment of data that the president now decries. Under these new rules, the dynamism and market entry that have ramped up in recent years may sustain a serious blow.

Having used this playbook once successfully, the FCC has shifted focus to the special access market. In the process, it seeks to impose a one-size-fits-all approach inconsistent with basic business practice.

Environment – Greenhouse Gas Regulation

The Environmental Protection Agency (EPA) released an ambitious plan to reduce greenhouse gas (GHG) emissions from existing power facilities by 32 percent by 2030. The rule is one of the most costly in the past decade, as EPA estimates annual costs of $8.4 billion and hundreds of thousands of compliance hours.

How could such a rule be adopted? By establishing a convenient set of “facts.” For example, although EPA estimates approximately 34,000 fewer jobs in the “electricity, coal, and natural gas sectors” in 2020, it proffers a gain of 83,300 jobs for “demand-side energy efficiency employment.” This is at odds with the research literature (and common sense).

Similarly, in EPA’s initial press release, the agency touted that its proposal would “shrink electricity bills roughly 8 percent.” However, buried deep in EPA’s regulatory impact analysis, the agency concedes a hike in retail electricity prices of 3.2 percent in 2020. This is a far cry from shrinking utility bills.

Finally, the EPA misleads on the benefits of the climate rule. A large portion of the benefits come from reduced emissions of fine particles, ozone, sulfur dioxide, and nitrogen oxides – which in turn reduce the incidence of respiratory impacts – not from climate improvements.

Under normal circumstances, such a rule would be disqualified on standard benefit-cost grounds – but not once a set of artificial benefits and costs is presented.

Financial – MetLife Designation

The Financial Stability Oversight Council (FSOC) was created by the Dodd-Frank Wall Street Reform and Consumer Protection Act to serve as the regulator of “systemically important financial institutions,” or SIFIs. Dodd-Frank gave the FSOC the power to identify entities that, by their mere size or interconnectedness, threatened the financial system as a whole, and to impose upon those entities a much more severe regulatory regime.

Recently, the FSOC designated MetLife as a SIFI. But why is the FSOC after an insurance company? After all, the recipe for financial trouble is to combine leverage with short-term funding — i.e., owe a lot of people a lot of money and owe it fast. This trouble is exacerbated by a mismatch in the duration of assets and liabilities – for example, when deposits that might have to be repaid overnight are used to make 30-year home loans.

Insurance companies, in contrast, have longer-term liabilities (e.g., life insurance policies) that they fund by setting aside reserves of longer-term assets. Indeed, looking at MetLife’s 2013 balance sheet, one finds only about $40 billion of plausibly short-term borrowing – a tiny 5.7 percent of all liabilities and a mere 11.8 percent of the funds made available by selling marketable securities on the balance sheet.
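The percentages above can be inverted into rough implied totals; a back-of-the-envelope sketch (my arithmetic from the ratios cited in the text, figures rounded):

```python
short_term = 40e9  # ~$40 billion of plausibly short-term borrowing

# Back out the implied denominators from the two ratios cited above.
total_liabilities = short_term / 0.057            # ~$702 billion
marketable_securities_funding = short_term / 0.118  # ~$339 billion

print(round(total_liabilities / 1e9))             # ~702
print(round(marketable_securities_funding / 1e9)) # ~339
```

Either way the denominator is sliced, short-term borrowing is a small fraction of how MetLife funds itself.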

MetLife just does not look like a financial threat, so how did the FSOC make the case for designation? It argued for a different set of “facts.” First, MetLife is a large financial institution and was conveniently lumped in with other “too big to fail” financial institutions (like commercial banks).

Second, it focused on MetLife’s label (a large insurance company) rather than its actual activities and products. In this way, MetLife could be grouped with American International Group (AIG), a large insurer that had to be bailed out during the financial crisis. Of course, it was AIG’s non-insurance activities – namely the marketing of credit default swaps – that got the company into trouble.

As in other areas of regulation, FSOC’s basic approach is to create an artificial set of standards that the private sector simply cannot attain, and support the decision with an artificial set of facts.


While policymakers on both sides of the aisle can agree that federal regulation is sometimes necessary to serve the public and address market failures, the pace and breadth at which today’s regulators issue rules fails a common sense test. The growth of government and the extended reach of its regulatory power argue for legislative efforts to rein in not only the quantity of rules finalized by federal agencies, but also the scope and cost of these regulations. Though several reasonable attempts to allow for greater oversight and basic accountability have been introduced in Congress, none have yet made it to the president’s desk.