Date: March 25, 2003
Contacts: Barbara J. Rice, Deputy Director
Andrea Durham, Media Relations Assistant
Office of News and Public Information
(202) 334-2138; e-mail <news@nas.edu>
FOR IMMEDIATE RELEASE
Ground Rules for Designing Identity-Verification Systems Are Key to Protecting Privacy
WASHINGTON -- To best protect an individual's privacy, specific guidelines should be followed when designing systems to authenticate the individual's identity, says a new report from the National Academies' National Research Council. Supported by an in-depth examination of the related technical, legal, and policy issues, the report offers a comprehensive set of guidelines to ensure that an individual's privacy is not unnecessarily compromised, whether by commercial or government organizations.
"The ability to remain anonymous and have a choice about when and to whom one's identity is disclosed is an essential aspect of a democracy," said Stephen Kent, chair of the committee that wrote the report, and chief scientist for information security, BBN Technologies, Cambridge, Mass. "The technology a system uses, whether scanning a face or using a smart card, is less important in maintaining privacy than the way the system is designed and the scope of the system. Our report offers good design practices to follow when setting up an authentication system. Using them will improve security for the system and privacy for the users, considerations particularly relevant to ongoing policy debates such as about national identity cards and frequent traveler cards."
The report provides a conceptual toolkit for designing an authentication system that is sensitive to privacy concerns. "Authentication" refers to the act of confirming a specific claim, such as "I am Joe Smith." The first design step -- even before settling on a particular security technology such as passwords, smart cards, or facial or voice recognition -- is determining what type of authentication, if any, is necessary. A key question is whether the person's identity is important in a particular circumstance. For example, there is no need to know who downloads a blank tax form from a government Web site. Other situations require that users be authorized, but not that a specific individual be identified, such as when everyone in a family uses the same password to arm and disarm their home security system.
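As a concrete illustration of that distinction, the sketch below (in Python, with hypothetical names and credentials; it is not drawn from the report) contrasts authenticating the specific claim "I am Joe Smith" with authorizing a household alarm action through a shared code that never reveals which family member acted.

```python
import hmac

# Hypothetical credential store: per-person secrets vs. one shared household code.
PERSONAL_SECRETS = {"joe.smith": b"joe-personal-secret"}
HOUSEHOLD_ALARM_CODE = b"shared-alarm-code"

def authenticate_identity(claimed_id: str, secret: bytes) -> bool:
    """Confirm the specific claim 'I am <claimed_id>' against a per-person secret."""
    expected = PERSONAL_SECRETS.get(claimed_id)
    return expected is not None and hmac.compare_digest(expected, secret)

def authorize_alarm_toggle(presented_code: bytes) -> bool:
    """Authorize arming or disarming without learning which family member asked."""
    return hmac.compare_digest(HOUSEHOLD_ALARM_CODE, presented_code)

print(authenticate_identity("joe.smith", b"joe-personal-secret"))  # True: identity confirmed
print(authorize_alarm_toggle(b"shared-alarm-code"))                # True: no identity collected
```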
It is particularly important that federal, state, and local governments follow the suggested guidelines when designing authentication and security systems because of governments' unique relationships with people, the report says. Many transactions with government are mandatory, and some must persist over an individual's lifetime. In addition, people may expect more from these agencies than from commercial entities in protecting the security and confidentiality of personal data.
As a social and economic actor in society, an individual has multiple identities, such as licensed driver, bank account owner, golf club member, registered voter, and parent. As computer and Internet technologies are increasingly used for everything from shopping to gaining access to places of work to paying bills or filing tax returns, people are required to divulge personal information in a wide variety of daily activities. Privacy concerns raised by such interactions include questions about how securely this information is transmitted and stored, whether it is used for purposes other than the one for which it was collected, and whether the information is shared with other agencies and businesses. Linking information across identities represented in different systems, such as using Social Security numbers to reference medical records, can lead to a loss of privacy.
The committee stressed that appropriately designed authentication systems require the minimum information necessary to achieve a specific security goal. For example, ensuring that only golf club members and their guests enter the clubhouse does not require knowing their bank account or Social Security numbers. The information people must reveal to interact with a system -- be it a business, workplace, or government office -- is often more extensive and invasive than is necessary.
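A minimal sketch of that minimum-information principle, using an assumed data model rather than anything specified in the report, might look like this: the clubhouse check consults only membership or guest status, so bank account and Social Security numbers are never requested or stored.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClubCredential:
    member_id: str                         # opaque, club-issued identifier
    is_member: bool
    guest_of_member: Optional[str] = None  # host's member_id, if the holder is a guest

def may_enter_clubhouse(cred: ClubCredential) -> bool:
    """Grant entry on membership or guest status alone; no other data is needed."""
    return cred.is_member or cred.guest_of_member is not None
```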
Conversely, some authentication systems are inherently insecure because they rely upon information that is disclosed often and to many requestors, and therefore not secret, such as a Social Security number. For situations where intrusive methods of authentication are necessary, such as verifying a specific individual's identity during the process of opening a bank account, the report provides a list of issues to consider and actions to take that will enhance security while reducing the risk of invasion of privacy. These include minimizing the breadth and intimacy of the data collected and the length of time they are retained. System designers should also be precise about who will have access to the data and for what purpose, and should explicitly inform users of both. In addition, users should be able to check on and correct the information used for authentication purposes.
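One way to picture those considerations is a record structure that carries its own retention period, access list, and purposes, and that the user can inspect before requesting corrections. The field names and seven-year retention below are illustrative assumptions, not recommendations from the report.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Dict, List

@dataclass
class AuthenticationRecord:
    fields: Dict[str, str]           # only the data the transaction actually requires
    collected_on: date
    retention: timedelta             # delete once this period has elapsed
    authorized_accessors: List[str]  # who may read the data (disclosed to the user)
    access_purposes: List[str]       # why each accessor may read it (disclosed to the user)

    def expired(self, today: date) -> bool:
        """Flag records that have outlived their stated retention period."""
        return today > self.collected_on + self.retention

    def user_view(self) -> Dict[str, str]:
        """Let the data subject inspect, and then request correction of, the stored data."""
        return dict(self.fields)

# Illustrative use for opening a bank account (values are assumptions, not report guidance).
record = AuthenticationRecord(
    fields={"legal_name": "Joe Smith", "government_id_checked": "yes"},
    collected_on=date(2003, 3, 25),
    retention=timedelta(days=7 * 365),
    authorized_accessors=["account-opening staff"],
    access_purposes=["verify the applicant's identity when the account is opened"],
)
print(record.expired(date(2011, 3, 25)))  # True: past the assumed seven-year retention
```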
The study was sponsored by the National Science Foundation, Office of Naval Research, General Services Administration, Federal Chief Information Officers' Council, and the Social Security Administration. The National Research Council is a private, nonprofit institution that provides science policy advice under a congressional charter granted to the National Academy of Sciences. A committee roster follows.
Read the full text of Who Goes There? Authentication Through the Lens of Privacy for free on the Web, as well as 2,500 other publications from the National Academies. Printed copies are available for purchase from the National Academies Press; tel. (202) 334-3313 or 1-800-624-6242 or on the Internet at http://www.nap.edu. Reporters may obtain a pre-publication copy from the Office of News and Public Information (contacts listed above).
NATIONAL RESEARCH COUNCIL
Division on Engineering and Physical Sciences
Computer Science and Telecommunications Board
Committee on Authentication Technologies and Their Privacy Implications
Stephen T. Kent (chair), Chief Scientist, Information Security, BBN Technologies, Cambridge, Mass.

Michael Angelo, Staff Fellow, Compaq Computer Corp., Houston

Steven M. Bellovin*, Fellow, AT&T Labs Research, Florham Park, N.J.

Bob Blakley, Chief Scientist, Enterprise Solutions Unit, IBM Tivoli Software, Austin, Texas

Drew Dean, Computer Scientist, SRI International, Palo Alto, Calif.

Barbara Fox, Senior Software Architect, Microsoft Corp., Redmond, Wash.

Stephen H. Holden, Assistant Professor, Department of Information Systems, University of Maryland, Baltimore County, Baltimore

Deirdre Mulligan, Acting Clinical Professor of Law, University of California, Berkeley

Judith S. Olson, Professor and Richard W. Pew Chair in Human-Computer Interaction, University of Michigan, Ann Arbor

Joe Pato, Chief Technologist, Internet Security Operation, Hewlett-Packard Labs, Cambridge, Mass.
Radia Perlman, Distinguished Engineer, Sun Microsystems, Burlington, Mass.
Priscilla M. Regan, Associate Professor, George Mason University, Fairfax, Va.

Jeffrey Schiller, Network Manager, Massachusetts Institute of Technology, Cambridge

Soumitra Sengupta, Assistant Professor, Columbia University, New York City

James L. Wayman, Director, Biometrics Test Center, San Jose State University, San Jose, Calif.

Daniel J. Weitzner, W3C Technology and Society Domain Leader, World Wide Web Consortium, Cambridge, Mass.
STAFF
Lynette I. Millett, Study Director

* Member, National Academy of Engineering