Date: March 3, 1998
Contacts: Dan Quinn, Media Relations Officer
Sean McLaughlin, Media Relations Assistant
(202) 334-2138; e-mail <news@nas.edu>

Publication Announcement

Better Testing Approach Could Save Money
And Improve Effectiveness of Military Systems

Even at a time of shrinking defense budgets, the cost of developing a single new military system, such as the Longbow Apache helicopter, can be as high as $5 billion. These complex systems -- for weapons, communications, and other important functions -- rely more than ever on advanced technology and software, placing new demands on the military officials who must determine whether the systems are effective and reliable.

The Department of Defense (DOD) could improve the performance of its systems and at the same time save money by more fully adopting a testing and evaluation approach similar to that used in private industry, according to a new report from a panel of the National Research Council. Such an approach, which employs earlier testing under more realistic conditions to spot potential design flaws, is not yet commonplace in the military. Instead, the military tends to test a system's operational performance only when its design is nearly final, making any flaws that surface expensive and difficult to fix.

In describing a new model for testing and evaluating defense systems, the panel said DOD should strive to combine information from a variety of testing sources, especially tests in laboratories and on related systems. The department must fully describe all test activities using standard terminology, and create an archive of what was tested, under what circumstances, and with what results. Combining this information also will require the use of sophisticated statistical methods and models, and ultimately will improve decision-making about which systems are ready for full-rate production.
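
The report does not prescribe a particular algorithm for pooling test results, but a minimal sketch can illustrate the idea. Assuming each archived test source reports simple pass/fail counts -- the source names and figures below are hypothetical -- a standard conjugate Beta-Binomial model combines the evidence into a single reliability estimate whose uncertainty shrinks as results accumulate:

```python
# Illustrative sketch only: the report recommends combining information
# from multiple test sources but does not mandate this particular model.
# Pass/fail results from hypothetical sources are pooled with a
# conjugate Beta-Binomial model to estimate system reliability.

# Hypothetical archived test records: (source, successes, trials).
test_archive = [
    ("laboratory bench tests",    47, 50),
    ("related-system field data", 18, 20),
    ("early operational trials",   8, 10),
]

# Start from a weakly informative Beta(1, 1) (uniform) prior.
alpha, beta = 1.0, 1.0

# With a conjugate prior, each source's results update the posterior
# by simple counting of successes and failures.
for source, successes, trials in test_archive:
    alpha += successes
    beta += trials - successes
    mean = alpha / (alpha + beta)
    print(f"after {source:26s} estimated reliability = {mean:.3f}")

# The posterior variance shrinks as evidence accumulates, quantifying
# how much confidence the combined test record supports.
var = (alpha * beta) / ((alpha + beta) ** 2 * (alpha + beta + 1.0))
print(f"posterior std. dev. = {var ** 0.5:.3f}")
```

In practice the methods the panel envisions are far richer, accounting for differences among test conditions; the sketch shows only why a standard, well-documented archive of results makes such pooling possible at all.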

Although various Defense Department agencies apply some of these principles, there has not been a consistent, department-wide approach, the report says. To aid in implementing these changes, DOD should expand the role of its Director of Operational Test and Evaluation to support all of DOD's testers in applying better statistical and quality management principles. The Director of Operational Test and Evaluation also should be responsible for defining and documenting steps to reduce variability from program to program, and for ensuring that best practices are used consistently.

Copies of Statistics, Testing, and Defense Acquisition are available at www.nap.edu or by calling 202-334-3313 or 1-800-624-6242. Reporters may obtain a pre-publication copy from the Office of News and Public Information at the letterhead address (contacts listed above).


NATIONAL RESEARCH COUNCIL
Commission on Behavioral and Social Sciences and Education
Committee on National Statistics

Panel on Statistical Methods for Testing and Evaluating Defense Systems

John E. Rolph (chair)
Professor of Statistics and Chair,
Department of Information and
Operations Management
Marshall School of Business
University of Southern California
Los Angeles

Marion Bryson
Director of Research and Development
North Tree Management
Monterey, Calif.

Herman Chernoff
Professor of Statistics
Department of Statistics
Harvard University
Cambridge, Mass.

John D. Christie
Senior Fellow and Assistant to the President
Logistics Management Institute
McLean, Va.

Louis Gordon
Consultant
Palo Alto, Calif.

Kathryn B. Laskey
Associate Professor, Department of Systems Engineering
George Mason University
Fairfax, Va.

Robert C. Marshall
Professor and Head, Department of Economics
Pennsylvania State University
State College

Vijayan N. Nair
Professor of Statistics and of Industrial and
Operations Engineering
Department of Statistics
University of Michigan
Ann Arbor

Robert T. O'Neill
Director, Office of Epidemiology and Biostatistics, and
Acting Director, Division of Epidemiology and Surveillance
Center for Drug Evaluation and Research
Food and Drug Administration
U.S. Department of Health and Human Services
Rockville, Md.

Stephen M. Pollock
Professor of Industrial and Operations Engineering
Department of Industrial and Operations Engineering
University of Michigan
Ann Arbor

Jesse H. Poore
Professor of Computer Science
Department of Computer Science
University of Tennessee, and
President
Software Engineering Technology Inc.
Knoxville

Francisco J. Samaniego
Professor, Intercollege Division of Statistics, and
Director of the Teaching Resources Center
University of California
Davis

Dennis E. Smallwood
Roger's Professor
Department of Social Sciences
U.S. Military Academy
West Point, N.Y.

RESEARCH COUNCIL STAFF

Michael L. Cohen
Study Director