Volume 29, Number 4, October 2000

Research and the Government Performance and Results Act

Beverly K. Hartline



The Government Performance and Results Act (GPRA) was passed by Congress and signed into law in 1993. Its purpose is to promote accountability in government. It requires that agencies prepare and follow strategic plans, commit to measurable results in an annual performance plan, and measure and report on performance in an annual performance report. Fiscal Year 1999 was the first year of full implementation, and agencies have now completed one full cycle of planning, promising, measuring, and reporting.

Both Congress and the Administration supported GPRA because they felt it would promote communication between the agencies and Congress, strengthen accountability for the use of Federal funds, reduce waste and inefficiency, and help guide the priorities and actions of agency managers and staff. With respect to its impact on Federally sponsored research, however, there were many concerns about its potential to damage the research enterprise. Since research results cannot be predicted in advance, how can agencies commit to specific outcomes? How can one even measure the performance of fundamental research? Will GPRA stifle innovation, long-term research, and ‘risky’ science? Will productive program overlaps be eliminated under the mistaken impression that they constitute wasteful duplication? Finally, how much will GPRA’s reporting requirements further overload and distract the R&D system?

Because of these concerns, Congress, the Administration, and the scientific community put special attention on the application of GPRA to research, and several studies were conducted. In this talk, I discuss the conclusions of some of the studies and provide web references to government sites providing GPRA-related plans and reports for the National Science Foundation (NSF), the Department of Energy (DOE), and the Office of Management and Budget (OMB). GPRA remains very much a work-in-progress, and continuing participation by the scientific community, including physicists, can help ensure that it benefits, rather than harms, American science.

Studies Guiding GPRA Application to Research

From 1994 to 1996, the National Science and Technology Council (NSTC) convened an interagency working group to consider how agencies could assess fundamental science. The group published its conclusions in the report, Assessing Fundamental Science. It found that the Federal agencies’ primary role is stewardship and portfolio management, rather than the direct conduct of research; that leadership across the frontiers of science is the overarching national goal for Federal research programs; and that one size won’t fit all for goal setting and performance measurement. The report presents nine principles to guide agencies in assessing fundamental research:

  • Begin with a clearly defined statement of program goals.
  • Develop criteria intended to sustain and advance the excellence and responsiveness of the research system.
  • Establish performance indicators that are useful to managers and encourage risk taking.
  • Avoid assessments that would be inordinately burdensome or costly or that would create incentives that are counterproductive.
  • Incorporate merit review and peer evaluation of program performance.
  • Use multiple sources and types of evidence; for example, a mix of quantitative and qualitative indicators and narrative text.
  • Experiment in order to develop an effective set of assessment tools.
  • Produce assessment reports that will inform future policy development and subsequent refinement of program plans.
  • Communicate results to the public and elected representatives.

In the summer of 1997, while agencies were in the final phases of preparing their strategic plans and starting work on their performance plans for Fiscal Year 1999, Jack Gibbons, then Assistant to the President for Science and Technology, issued a guidance memorandum on GPRA to the heads of the 19 Federal agencies that sponsor research. Gibbons made seven major points in his memo. He told the agencies to treat R&D visibly and in a manner that promotes leadership through quality and innovation. He suggested that they choose performance goals and measures that would be useful to guide the priorities and actions of agency staff and lead to clear and fair performance reports. The plans should communicate in a way that non-experts can appreciate and describe how the agency coordinated its research with other agencies and stakeholders. Finally, he urged each agency to use the full flexibility of GPRA to tailor its implementation to its specific R&D mission. The act gives the OMB Director the authority to approve goals stated in an "alternative form" and to waive certain administrative requirements and controls. The alternative GPRA form is not quantitative but is based instead on descriptive statements of a minimally effective and a successful program, and the NSF chose to use this approach for many of its measures.

In 1997 and 1998, both the General Accounting Office (GAO) and the Congressional Research Service (CRS) conducted several studies and assessments of GPRA implementation in the agencies, focusing on the draft and final strategic and performance plans and the processes followed by the agencies to develop them. Various congressional committees held hearings, notably the House Science Committee.

Between 1997 and 1999 the National Research Council Committee on Science, Engineering, and Public Policy (COSEPUP) conducted a study that resulted in the report, Evaluating Federal Research Programs. The purpose of the COSEPUP study was to (1) identify and analyze the most effective ways to assess the results of research, and (2) help the Federal government determine how agencies can better incorporate research activities into their GPRA plans. Some participants believed that research, including basic research, could be measured in a way that provides quantitative information on outcomes. Others disagreed, stating that there is no sensible way to respond to GPRA for basic research, given its long-range nature. COSEPUP concluded that useful outcomes of basic research cannot be measured directly on an annual basis because they are inherently too unpredictable. However, measures of the quality, relevance, and leadership position of research are possible, could be reported regularly via the judgments of appropriately selected peers, and could be the basis of meaningful application of GPRA to research. COSEPUP recommended, further, that agencies include in their strategic and performance plans the goal of maintaining adequate human resources in fields critical to their missions. With respect to research coordination among Federal agencies, COSEPUP proposed that the government establish a formal process to identify and coordinate areas of research supported by multiple agencies.

COSEPUP is starting its second GPRA study, which will involve case studies of the responses of about five Federal agency research programs to GPRA. The Panel will select case studies from a pool of 10 agencies at a meeting in June 2000.

GPRA Implementation

Fiscal Year 1999 was the first full year of GPRA implementation. Strategic plans were produced in 1997, and most agencies are now involved in the required three-year revision. The performance plans for FY 1999 were submitted as part of the FY 1999 budget request, then revised after appropriations were finalized. By March 31, 2000, all agencies were to have submitted their reports on FY 1999 performance. The government-wide budget request for FY 2001, in fact, included a government-wide report on agency and program performance, and each agency has made its more detailed performance report available through its web page. The reference list includes some of these websites. If you are interested in another agency’s strategic plan, performance plan, or performance report, and cannot find the right link through the agency’s home page, check the web pages for the agency’s budget office or Chief Financial Officer.



1. Evaluating Federal Research Programs, COSEPUP, National Research Council, 1999, Website: www2.nas.edu/cosepup

2. Assessing Fundamental Science, National Science and Technology Council, 1996, website: www.nsf.gov/sbe/srs/ostp/assess/start.cfm

3. Government-wide budget request for FY 2001, performance plan for 2001 and performance report for FY1999. Website: w3.access.gpo.gov/usbudget/fy2001/pdf/budget.pdf

4. National Science Foundation performance plan and report. Websites: www.nsf.gov/cgi-bin/getpub?nsf0064, www.nsf.gov/cgi-bin/getpub?nsf0055

5. Department of Energy performance plan and report. Website: www.cfo.doe.gov/stratmgt/DOE1999AR.pdf

6. General Accounting Office reports on GPRA implementation. Website: http://www.gao.gov/new.items/gpra/gpra.cfm

Beverly K. Hartline

Los Alamos National Laboratory

SSR Directorate, Mailstop A127

Los Alamos, NM 87545