
Implementation of the Evidence Act: A Discussion at the NAPA EOM Panel

February 05, 2019

Background:

In one of its last actions before adjourning last month, the 115th Congress passed the “Foundations for Evidence-Based Policymaking Act of 2018.” This new law implements about half of the recommendations made by the Commission on Evidence-Based Policymaking in September 2017. Among other things, the new law requires agencies to designate Evaluation Officers and Chief Data Officers, and to publish their data sets in machine-readable form.

This panel session discussed some of the implementation challenges that will face the Office of Management and Budget and the agencies, and potential approaches that could be taken to maximize success.

Discussants:

  • Robert Shea, Moderator; Grant Thornton (and former commissioner on the Commission)
  • Nick Hart, Bipartisan Policy Center (and former research director for the Commission)
  • Nancy Potok, Chief Statistician of the U.S. (and former commissioner on the Commission)
  • Kathy Stack, former OMB lead of its evidence and innovation unit
  • Christian Hoehner, Senior Policy Director, Data Coalition

Presentation Highlights:

The Evidence Act has three distinct sections:

  • Evidence-Building Activities. The new law creates a capacity for, and governance of, evidence building and the use of data. Key provisions include:
    • Creates a core capacity for conducting evaluation in agencies via the designation of an agency Evaluation Officer and the creation of a career path and/or a professional occupational job series for evaluators.
    • Requires agencies to develop evidence-building plans/learning agendas that would align research and evaluation efforts with key policy questions that policymakers want answered.
    • Creates a temporary advisory committee to provide insights on effectively promoting the use of federal data for evidence building.
  • Increased Access to Data. As a first step, agencies must designate Chief Data Officers, and the Office of Management and Budget (OMB) will stand up a cross-agency Chief Data Officers Council to share best practices. Agencies are then required to:
    • develop open data plans;
    • make their data generally “open by default;”
    • make their data available under an open format and open license, and as machine-readable data; and
    • develop and publicly release a comprehensive inventory of their data assets.

In addition, the General Services Administration will maintain a one-stop website and a governmentwide data catalog.
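To make the open-data requirements above concrete, the following is a minimal sketch of what one machine-readable data-inventory entry might look like. It follows the Project Open Data metadata schema (v1.1) already used for agency data.json catalogs harvested by Data.gov; the agency name, identifiers, and URLs here are hypothetical placeholders, not real datasets.

```json
{
  "@type": "dcat:Dataset",
  "title": "Grants Awarded, FY 2018 (illustrative)",
  "description": "Hypothetical dataset entry illustrating a machine-readable inventory record.",
  "identifier": "https://example-agency.gov/data/grants-fy2018",
  "accessLevel": "public",
  "license": "https://creativecommons.org/publicdomain/zero/1.0/",
  "publisher": {
    "@type": "org:Organization",
    "name": "Example Agency"
  },
  "distribution": [
    {
      "@type": "dcat:Distribution",
      "mediaType": "text/csv",
      "downloadURL": "https://example-agency.gov/data/grants-fy2018.csv"
    }
  ]
}
```

An agency's full inventory would be a catalog file listing many such entries; the "accessLevel" and "license" fields are what operationalize "open by default" and "open license" at the record level.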

  • Stronger Privacy and Confidentiality of Data. The law codifies several existing administrative practices and reauthorizes a 2002 privacy law, the Confidential Information Protection and Statistical Efficiency Act. Provisions include:
    • Ensuring that government administrative data used by researchers is protected from inappropriate disclosure of personal information.
    • Shifting the default assumption from “no access without explicit authorization” to a “presumption of access unless prohibited.”

Key Implementation Challenges.

Of course, the primary locus of implementation will be at the agency and program levels, but having the top institutions aligned will be important to create a sustainable implementation over a period of years. Presenters highlighted several key challenges in the initial implementation of the Act:

  • The first is identifying the big policy questions that agencies need to ask. Agencies have to define what they want to know and where the data that can answer those questions is located. The annual agency strategic objectives reviews could be one locus for identifying questions, but some questions will reach across agency boundaries and will need a cross-agency advocate to raise and answer them.
  • A second challenge is coordinating the various CXOs and program-level executives, both in OMB and within the agencies, since each has a different portfolio and different views on what the priorities for a common research agenda should be. There is currently too much infighting over data and turf to craft such agendas, and a new governance structure will be needed to provide coordination.

Initial Implementation Strategies.

To align the top-level institutions of government, the panel discussed the roles of three sets of players who will be important to ensuring the focus of the new law is on culture change and not merely on compliance with statutory requirements:

  • Office of Management and Budget. A key role for OMB will be to connect people within and across agencies and ensure the agencies frame research and evaluation questions that matter. OMB is in a position to play a catalytic role in creating a new learning culture and the necessary supporting systems required to develop and bring evidence to bear at all stages of policy development and implementation.

OMB’s intent is to use the requirements of this new law to influence agency cultures to become more evidence-based in decision making. To do this, OMB will have to put in place the right governance structure and coordinate across its own internal stovepipes; otherwise it will fall back on its traditional implementation approach, which would result in agencies doing a good job at compliance by filling out OMB-issued templates.

How can OMB create an internal data governance council that doesn’t just focus on developing guidance and statutory implementation for the Act? One approach might be to use the ongoing development of the Federal Data Strategy to help meaningfully bridge institutional silos. OMB has done this successfully in the past; the Bush Administration’s Program Assessment Rating Tool was a good example of OMB staff bridging its internal administrative silos.

In addition, top agency leaders will have to make implementation a priority. One way to engage them would be for OMB budget examiners to work with agency leaders to create meaningful internal governance structures. OMB examiners could serve as conveners to ask the right questions of the right people to develop an evidence agenda, since many complex issues reach across agency boundaries. And if agencies follow through, the OMB budget teams could offer more resources.

  • Congress. It would be particularly helpful if Congress were to offer a reinforcing message, such as via appropriations committee report language. Authorizing committees could also ask their agencies to identify the most important questions they need answered. Congress needs to develop a process for engagement, and there is precedent: a similar coordination effort was undertaken in the mid-1990s in the House via a special cross-committee task force that assessed agencies’ first strategic plans developed under the Government Performance and Results Act.
  • Outside Stakeholders. State and local governments could formulate mission-oriented research questions; the academic and research communities could focus on statistics and evaluation techniques; and private industry could focus on business-related interests.

Advocacy groups have a role as well, providing outside pressure to help sustain momentum (this happened with the implementation of the DATA Act). For example, the Data Coalition serves as an advocate for the principles of the broader Open Data movement, such as timeliness, accessibility, machine-readable data, non-discriminatory access to data, open format, etc. With respect to the Evidence Act, it supports the standardization of data elements, the creation of an inventory of data, and the sharing of data.

Private foundations have evolved in recent years to become “impact investors,” and they want to know whether their dollars are making a difference. They will likely be advocates for implementation as well, since this Act creates means of measuring and assessing impact.

Starting Places for Implementation.

OMB thinks it already has a good jumpstart on the top-level implementation of this legislation because many of the Evidence Act’s requirements tie into activities already underway at OMB. For example, one of the Cross-Agency Priority Goals is already underway to develop a Federal Data Strategy. Many of the Evidence Commission’s recommendations are reflected in OMB’s agenda.

Several top-level implementation priorities will likely include:

  • Designating agency Chief Data Officers and Evaluation Officers;
  • Determining which questions will be the focus of agency evidence-building plans/learning agendas;
  • Focusing on short-term actions, such as developing standardized data definitions for agencies to use in their data and evidence inventories, and agency-level evaluation policies; and
  • Integrating the use of open data into government operations at the program level, e.g., for reducing waste, fraud, and abuse, performance measurement, etc.

Anchoring the Changes Within Agencies.

The agencies are where the work must be done to bring the legislation to life: treating it not as just another set of requirements but as an opportunity to shape the administrative culture at all levels toward the proper use of evidence. If OMB and Congress truly want to institutionalize an evidence-driven culture, efforts will be needed to connect evidence and data to the management of actual programs, including at the state and local levels.

Several agency-level implementation priorities could include:

  • Developing evidence-building plans/learning agendas that encompass the major questions agencies need to answer about how to deliver on their missions;
  • Applying that learning agenda to identify and commission the research needed to fill evidence gaps and deepen the knowledge base for its efforts; and
  • Applying the growing body of evidence to continuously improve results.

Agencies could also link the implementation of this Act with that of another recent law, the Program Management Improvement Accountability Act, which is aimed at agency program managers. The Evidence Act would give those managers the data and evaluation capacity needed to meet the Program Management Act’s requirements.

Finally, a public scorecard for each agency could be developed to capture the stages of progress and allow agencies to compare themselves with their peers. The right metrics would capture not procedural changes but the successful application of evidence to policy. A possible example is the scorecard of agency development and use of evidence developed by the nonprofit Results for America. In that scorecard, the Department of Labor is one of the pioneering agencies that has made significant progress in embedding the use of evaluation and evidence into its decision-making process. Scorecards have been used in other program areas to draw attention to an initiative and create a sense of urgency for continued implementation momentum.