Insight

Lack of Information Hinders Regulatory Review

Executive Summary

  • Given the economic burdens administrative agency rules can impose, it is important to evaluate regulations after they have been in effect for several years to determine if they are working as intended.
  • Since federal efforts at retrospective review have been underwhelming, this analysis examines whether public researchers have the information they would need to fill in the gaps; based on this review, it is apparent that agencies often fail to disclose all of the assumptions and methods used to develop regulatory impact analyses when they publish rules.
  • Combined with incomplete publicly available data, these disclosure gaps make it difficult for public researchers to generate quality retrospective analysis.

Introduction

Administrative agency regulations can impose substantial economic burdens. While agencies perform a regulatory impact analysis (RIA) when they propose and finalize rules, too often no similar analysis takes place after rules have taken effect to ensure they work as intended and have the expected economic impacts, despite presidential administration efforts and even legal requirements to do so.

In the absence of federal agencies performing timely retrospective analyses of their own regulations, the American Action Forum (AAF) analyzed several major rules finalized in fiscal year 2012 to determine if there is enough publicly available information for non-governmental researchers to take up the task.

That effort found that in addition to a lack of timely available data that can be used to evaluate the success of rules, the initial regulatory analyses produced by agencies when they issue rules often fail to disclose important assumptions and methods, making comparable retrospective analyses nearly impossible.

Retrospective Analysis Background

Presidential administrations dating back to President Carter’s have recognized the need to review and evaluate existing regulations to determine if they delivered the benefits promised or were still necessary. Every administration since has directed executive agencies to review rules in some form or fashion.[1]

Congress has also enacted retrospective review requirements for some rules. The Regulatory Flexibility Act requires agencies to review rules with a “significant economic impact on a substantial number of small entities” (SEISNOSE) every 10 years “to determine whether such rules should be continued without change, or should be amended or rescinded, consistent with the stated objectives of applicable statutes, to minimize any significant economic impact of the rules upon a substantial number of such small entities.” This review is commonly referred to as Section 610 review and is the broadest congressionally enacted retrospective review requirement.

The success of these administrative and congressional efforts has been mixed. Recent presidential administrations’ ad hoc directives for regulators to review rules have likely contributed to agencies’ failure to develop a consistent review process. Most often, as with the Obama Administration, the effort results in a short burst of activity, culminating in reports highlighting burden reduction but no long-term strategy.

Section 610 review has also largely been a bust. It is plagued by vague definitions and agency indifference. Agencies face no consequences for failing to review rules. Accordingly, review happens less than it should and, if it occurs at all, is often a check-the-box exercise.

As a result of these deficiencies, AAF reviewed a subset of decade-old rules to determine if non-governmental researchers could reasonably perform their own retrospective analysis to gauge if regulatory outcomes matched agency expectations at the time rules were promulgated.

Regulations Selected for the Analysis

AAF used the Office of Management and Budget’s (OMB) 2013 Report to Congress on the Benefits and Costs of Federal Regulations and Unfunded Mandates on State, Local, and Tribal Entities to identify a subset of rules for analysis. This report was chosen because it contains a list of rules from a variety of agencies with quantified estimates of both costs and benefits, and because all of its rules are at least 10 years old, it also includes rules determined to be SEISNOSE at the time of their publication (which accordingly should have been reviewed under Section 610).

The report covers the period from October 1, 2011, through September 30, 2012. During that period, OMB concluded review of 47 major final rules (out of 278 total final rules).[2] Twenty-two of the major rules were transfer rules, i.e., rules that primarily cause income transfers, usually from taxpayers to program beneficiaries. Of the remaining 25 non-transfer rules, we identified 14 for which OMB monetized the expected costs and benefits. The list of the selected regulations is provided in Table 1 below. Those rules determined to be SEISNOSE at the time of publication are marked with an asterisk.
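
The selection logic described above amounts to a two-step filter over OMB's list of major rules. The following sketch is purely illustrative and assumes a hypothetical record format (the "is_transfer" and monetization fields are not OMB field names); it is not how AAF actually processed the report.

    # Illustrative only: hypothetical records standing in for rows of OMB's 2013 report.
    # Field names (is_transfer, monetized_costs, monetized_benefits) are assumptions.
    major_rules = [
        {"title": "Hazard Communication", "is_transfer": False,
         "monetized_costs": True, "monetized_benefits": True},
        {"title": "Example transfer rule", "is_transfer": True,
         "monetized_costs": False, "monetized_benefits": False},
        # ... remaining major final rules from the FY2012 review period ...
    ]

    # Step 1: drop the transfer rules. Step 2: keep only rules for which OMB
    # monetized both expected costs and expected benefits.
    non_transfer = [r for r in major_rules if not r["is_transfer"]]
    selected = [r for r in non_transfer
                if r["monetized_costs"] and r["monetized_benefits"]]

    # For the full FY2012 list these counts would be 25 and 14, respectively.
    print(len(non_transfer), len(selected))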

Table 1. List of regulations selected for retrospective analysis

Regulation Title | Implementing Agency | Expected Costs | Expected Benefits
Administrative Simplification: Adoption of Standards for Electronic Funds Transfer | Department of Health and Human Services | $0.2–$0.3 | <$0.1
Administrative Simplification: Standard Unique Identifier for Health Plans and ICD-10 Compliance Date Delay | Department of Health and Human Services | $0.7 (range: $0.4–$1.0) | $0.5 (range: $0.2–$0.8)
Administrative Simplification: Adoption of Operating Rules for Electronic Funds Transfer and Remittance Advice* | Department of Health and Human Services | $0.2–$0.3 | $0.1–$0.3
Hazard Communication | Department of Labor | $0.6 (range: $0.5–$1.6) | $0.2 (range: $0.1–$0.2)
Standards for Living Organisms in Ships’ Ballast Water Discharged in U.S. Waters* | Department of Homeland Security | $0.2 (range: $0.1–$0.4) | $0.1 (range: $0.1–$0.2)
Energy Efficiency Standards for Fluorescent Lamp Ballasts* | Department of Energy | $1.0 (range: $0.8–$1.6) | $0.3 (range: $0.2–$0.5)
Energy Conservation Standards for Residential Clothes Washers* | Department of Energy | $1.1 (range: $1.0–$1.8) | $0.2 (range: $0.2–$0.3)
Petroleum Refineries – New Source Performance Standards – Subparts J and Ja | Environmental Protection Agency | $0.4–$0.7 | $0.1
National Emission Standards for Hazardous Air Pollutants from Coal- and Oil-Fired Electric Utility Steam Generating Units and Standards of Performance for Electric Utility Steam Generating Units* | Environmental Protection Agency | $28.1–$76.9 | $8.2
Oil and Natural Gas Sector – New Source Performance Standards and National Emission Standards for Hazardous Air Pollutants | Environmental Protection Agency | $0.2 | $0.1
Joint Rulemaking to Establish 2017 and Later Model Year Light-Duty Vehicle Greenhouse Gas Emissions and CAFE Standards | Environmental Protection Agency and Department of Transportation | $28.8 (range: $21.2–$28.8) | $8.8 (range: $5.3–$8.8)
National Registry of Certified Medical Examiners | Department of Transportation | $0.1 (range: $0.1–$0.2) | <$0.1
Hours of Service* | Department of Transportation | $0.5 (range: $0.2–$1.0) | $0.4
Positive Train Control Systems Amendments | Department of Transportation | <$0.1 (range: $0–$0.1) | <$0.1

Source: Office of Management and Budget, 2013; estimates presented in billions of 2001 dollars

Of the 14 selected regulations, agencies found a potential impact on small entities for six. In terms of a public entity being able to complete a retrospective analysis, the rules that were required by law to be reviewed should be good candidates: Agencies should have processes in place to obtain and assess data, and for these six rules the review process should already be complete or close to it. Only one of the six, however, has been reviewed under Section 610. In February 2022, the Environmental Protection Agency (EPA) reviewed the rule on National Emission Standards for Hazardous Air Pollutants, analyzing whether the provisions with a potential impact on small entities should remain unchanged or be amended. The agency decided that the rule did not require any changes. For the rest of the regulations, no ex-post impact analysis has been published yet.

Information Necessary to Perform Retrospective Analysis

The first step in a retrospective analysis is to identify the main purpose of the rule in order to determine whether the implemented regulatory change has achieved its expected goal. For example, the purpose of the National Registry of Certified Medical Examiners rule is to improve highway safety and driver health, and the goal of the Hazard Communication rule is to improve the quality and consistency of information provided to employees regarding chemical hazards and associated protective measures. It is also important to identify the main variables that are expected to change after the regulation is in place (e.g., the number of certified medical examiners, carbon emissions, or the volume of wastewater discharged into water bodies), as well as the estimated costs and benefits associated with the regulation, to assess whether the regulation is following its expected path. Accordingly, a good place to start in figuring out what information will be necessary is the RIA published with a given rule.
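
As a rough illustration of what this comparison requires, the minimal sketch below pairs a rule's ex-ante estimates with observed outcomes for the same variables. Every name and number in it is a hypothetical placeholder, not a figure drawn from any RIA or dataset.

    # Minimal sketch of an ex-ante vs. ex-post comparison for one rule.
    # All variable names and values are hypothetical placeholders.
    ex_ante = {
        "certified_medical_examiners": 40000,   # expected after 10 years (hypothetical)
        "annual_cost_billions": 0.1,            # hypothetical ex-ante estimate
    }
    observed = {
        "certified_medical_examiners": 52000,   # from public data, if available (hypothetical)
        "annual_cost_billions": None,           # often not recoverable from published sources
    }

    for variable, expected in ex_ante.items():
        actual = observed.get(variable)
        if actual is None:
            print(f"{variable}: no public data available")
        else:
            print(f"{variable}: expected {expected}, observed {actual}")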

After reviewing the RIAs performed for the 14 selected regulations and the availability of data on the relevant topics, we identified two main types of problems that would challenge a retrospective analysis of these regulatory changes. The first is a lack of clarity regarding the assumptions made in the RIA to generate the estimates of associated costs and benefits. The second is the limited availability of current data, or the absence of data entirely. A detailed analysis of these challenges is provided below.

Challenges Associated with the Information Provided in RIAs

The RIAs accompanying the selected regulations did not always provide comprehensive and sufficient information to assess whether a regulation has reached its goals. In many cases the incompleteness of the RIA is due to a lack of available data for performing comprehensive estimations (more details below). Notably, the level of detail varies across RIAs, meaning some allow a more comprehensive assessment 10 years after a regulation is issued than others, although all of the rules reviewed for this analysis had gaps in data availability.

The main challenge associated with most of the RIAs is that there is no precise list of the assumptions made, or of the rationale behind those assumptions, used to estimate the potential costs and benefits of the regulation. This makes it difficult to assess the actual costs and benefits of the regulation with the same methodology and to compare current results to the original estimates. Furthermore, RIA authors do not always precisely identify the data sources behind their estimates, which makes it complicated to follow the trends in the relevant variables affected by the regulatory change. These limitations also run counter to the “Transparency and Reproducibility of Results” criterion that OMB identifies as one of the main requirements of an RIA: Qualified third parties should be able to understand the main components of the analysis and the methodology by which the estimates were developed. This is not always the case, however.

Another issue is that the timeframes used for the estimates in some RIAs differ from the typical 10-year retrospective review period. The analysis of potential costs and benefits of a new rule is sometimes presented over a longer period, such as the 30-year period used for the Energy Conservation Standards for Residential Clothes Washers rule, without disaggregating the estimates on a yearly basis. This makes it problematic to compare the regulation’s yearly or 10-year performance against its expected results.
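
One way a researcher can partially work around this mismatch is to annualize a multi-decade aggregate estimate and compare the annualized figure against observed annual outcomes. The sketch below shows the standard annualization arithmetic under assumed inputs; the present value, horizon, and results are illustrative and are not figures taken from the clothes-washer RIA.

    # Annualize a present-value estimate reported over a 30-year horizon so it
    # can be compared with yearly (or 10-year) observed outcomes.
    # Inputs are illustrative assumptions, not values from any specific RIA.
    def annualize(present_value: float, rate: float, years: int) -> float:
        """Convert a discounted present value into an equivalent annual amount."""
        if rate == 0:
            return present_value / years
        return present_value * rate / (1 - (1 + rate) ** -years)

    pv_benefits = 3.0  # hypothetical: $3.0 billion in present value over 30 years
    for rate in (0.03, 0.07):  # the discount rates specified in OMB Circular A-4
        print(f"{rate:.0%}: ${annualize(pv_benefits, rate, 30):.2f} billion per year")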

One reason some RIAs are incomplete is that their authors lacked the data needed to make a comprehensive assessment. In some cases, the authors report that they either could not find complete data or could not identify suitable models to quantify potential benefits. These limitations, for example, did not allow the authors of the RIA for the Standards for Living Organisms in Ships’ Ballast Water Discharged in U.S. Waters rule to assess the potential benefits associated with reducing the secondary spread of invasive species. Another challenge is that some RIAs do not analyze the full range of stakeholders that would be affected by the regulatory change, especially among the environmental regulations listed in the table above (those covering emission standards, emission-level reductions, wastewater discharges, and the like).

Challenges Associated with Data Availability

As mentioned above, certain challenges to retrospective review are due to the lack of data availability. Even when precise estimate details are provided, it is not always possible to perform an impact assessment 10 years after a rule is issued due to data limitations.

The problem of data availability is twofold. First, some important data are simply absent (especially on environmental issues such as emission levels, water resources, and energy efficiency standards). This presents a challenge not only to the quality of the initial RIA and the level of detail in a regulation’s cost-benefit analysis, but also to the ability to evaluate the impact of a regulation retrospectively. Second, some needed data may be accessible only through a Freedom of Information Act request, which makes the assessment process more time consuming and complicated.

A lack of consistency in data reporting represents another challenge. Some data are available only for certain years, which does not allow one to follow year-by-year developments. The timeliness of data is also an issue: In many cases the most recent data available are from 2019, making it impossible to observe the impacts of regulations over the last few years. For example, one goal of the rule regarding certified medical examiners is to reduce the number of crashes involving commercial trucks and buses; however, the most recent publicly available data on crashes (and crashes by the types of vehicles involved) come from 2019.

Recommendations

For public researchers to fill in the gaps of retrospective regulatory review, improvements need to be made in the development of RIAs and the data collection and dissemination process.

At the initial stage of RIA development, it is important for authors to consider that their findings and estimates will be used for ex-post analysis of the regulation. For this reason, they should clearly present all of their assumptions, the rationale behind their analysis, and all of the information sources they used.

Retrospective analysis also requires verifiable and consistent data. To solve the problem of limited data availability, it is important first to identify the data gaps. Where data do not exist, agencies should begin collecting them and make them publicly available; where data do exist, they should be reported in a consistent manner. Moreover, the agencies responsible for a rule should ensure timely publication of the collected data so that interested parties can examine the most recent developments.

Conclusion

Retrospectively reviewing the impacts of regulations is particularly important to understand how effectively a rule has worked, whether it achieved expected benefits, and what can be done to improve its outcomes.

This analysis revealed that it can be challenging for the public to conduct ex-post assessments due to a lack of clear details in RIAs and limited data availability. Accordingly, it is critically important to clearly provide the details of ex-ante impact analysis (including assumptions, rationale, and data sources) and to invest more in data collection and publishing so that agencies, and the public, can better perform ex-post impact evaluations of major rules.

[1] Aldy, Joseph E. Learning from Experience: An Assessment of the Retrospective Reviews of Agency Rules and the Evidence for Improving the Design and Implementation of Regulatory Policy. Prepared for the Administrative Conference of the United States. November 17, 2014. Pp. 27-36.

[2] For the purposes of this analysis, OMB defines a rule as major if it meets one of the following criteria: (i) rules designated as major under 5 U.S.C. § 804(2); (ii) rules designated as meeting the analysis threshold under the Unfunded Mandates Reform Act of 1995; (iii) rules designated as “economically significant” under section 3(f)(1) of Executive Order 12866.
