Dec. 12, 2017

Report Offers Guidance on How to Monitor the Quality of STEM Undergraduate Education

WASHINGTON – Monitoring the quality and impact of undergraduate science, technology, engineering, and mathematics (STEM) education will require the collection of new national data on changing student demographics, instructors’ use of evidence-based teaching approaches, student transfer patterns, and other dimensions of STEM education, says a new report from the National Academies of Sciences, Engineering, and Medicine. The report identifies three overarching goals to improve various components of undergraduate STEM education.

A skilled STEM workforce is a critical driver of U.S. economic growth and international competitiveness, but recent trends have raised concerns about the health of these enterprises. Undergraduate STEM education not only prepares students who major in these fields to enter the STEM workforce, but also equips all students, majors and non-majors alike, with knowledge and skills they can apply across a range of jobs and in civic life. However, many students with an interest in and aptitude for STEM, especially women and underrepresented minorities, are not completing degrees in these fields, partly because of documented weaknesses in STEM teaching, learning, and student support.

A growing body of research is beginning to address these weaknesses, identifying new and more effective strategies to engage, motivate, and retain diverse students in STEM. Many federal, state, and local initiatives are now underway to apply these new approaches, but policymakers and the public do not know whether these initiatives are achieving nationwide improvement in undergraduate STEM education. 

To address this concern, the National Science Foundation asked the National Academies to develop indicators that can be used to monitor the status and quality of undergraduate STEM education over time at the national level.

Currently, one of the most widely used methods for measuring the “value” of a college or university is to assemble and analyze data on employment outcomes such as earnings and the extent to which graduates find jobs related to their chosen field of study. However, research has demonstrated that both graduation rates and post-graduation earnings vary widely, depending on the type and selectivity of the institution and the characteristics of incoming students. In addition, graduates’ earnings and job placement are influenced by labor market demand, which varies by time, place, and field in ways that are characteristic of a market economy. Furthermore, many STEM majors enter occupations that are not traditionally considered part of the STEM workforce, but their STEM knowledge may nonetheless contribute to their earnings; as a result, the flow of students from STEM majors into STEM occupations varies widely. For all of these reasons, some economists and leaders in higher education agree that these methods alone are not suitable measures of institutional quality.

The report lays out a conceptual framework for a national indicator system with three overarching goals – increase students’ mastery of STEM concepts and skills by engaging them in evidence-based practices and programs; strive for equity, diversity, and inclusion of STEM students and instructors by providing opportunities for access and success; and ensure adequate numbers of STEM professionals by increasing completion of STEM credentials as needed in the different STEM disciplines. The committee determined that progress toward these three goals could be assessed using 21 indicators, such as the use of valid measures of teaching effectiveness and the diversity of STEM degree and certificate earners in comparison with the diversity of degree and certificate earners in all fields.  

The committee found that nationally representative data are not currently available from public or proprietary sources for most of the proposed indicators.  For example, the Integrated Postsecondary Education Data System (IPEDS), a major federal data source, focuses primarily on full-time students’ attainment of credentials at the institution at which they began their studies.  This focus is not aligned with student trajectories in undergraduate STEM, which often involve part-time studies and transfer across institutions.  This lack of data for the proposed indicators limits policymakers’ ability to track the progress toward the committee’s proposed goals.

In an effort to reduce the complexity of implementing the indicator system, the report also offers three options for obtaining the data required for all of the indicators: creating a national student unit record data system, expanding National Center for Education Statistics (NCES) data collections, and combining existing data from nonfederal sources. The first option would provide the most accurate, complete, and useful data to implement the proposed indicators of a student’s progress through STEM education; the second option would take advantage of a well-developed system that NCES uses to obtain IPEDS data annually from two- and four-year institutions; and the third option could be carried out by the federal government or another entity, such as a higher education association.

The committee noted that some of the indicators require research as the first step to develop clear definitions and identify measurement methods prior to beginning data collection. In addition, ongoing research may identify important factors related to the quality of undergraduate STEM education that would require new indicators beyond those proposed in the report. These and other developments in undergraduate education imply that in the coming years, it will be important to review and revise the committee’s proposed STEM indicators and the data and methods used for measuring them. 

The study was sponsored by the National Science Foundation.  The National Academies of Sciences, Engineering, and Medicine are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. The National Academies operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln. For more information, visit 

Kacey Templin, Media Relations Officer
Andrew Robinson, Media Relations Assistant
Office of News and Public Information
202-334-2138; e-mail
Follow us on Twitter @theNASEM


Division of Behavioral and Social Sciences and Education
Board on Science Education

Committee on Developing Indicators for Undergraduate STEM Education

Mark B. Rosenberg (chair)
Florida International University

Heather Belmont
School of Science
Miami-Dade College

Charles F. Blaich
Center of Inquiry and the Higher Education Data Sharing Consortium
Wabash College
Crawfordsville, Ind.

Mark Connolly
Associate Research Scientist
Wisconsin Center for Education Research
University of Wisconsin

Stephen W. Director1
Provost and University Distinguished Professor Emeritus
Northeastern University
Boston

Kevin Eagan
Assistant Professor in Residence
Department of Education, and
Managing Director
Higher Education Research Institute
University of California
Los Angeles

Susan Elrod
Provost and Executive Vice Chancellor for Academic Affairs
University of Wisconsin

Kaye Husbands Fealing
School of Public Policy
Georgia Institute of Technology

Stuart I. Feldman
Schmidt Sciences
Schmidt Philanthropies
Palo Alto, Calif.

Charles Henderson
Professor of Physics, and
Mallinson Institute for Science Education, and
Center for Research on Instructional Change in Postsecondary Education
Western Michigan University

Lindsey Malcom-Piqueux
Associate Director for Research and Policy
Center for Urban Education, and
Research Associate Professor
Rossier School of Education
University of Southern California
Los Angeles

Marco Molinaro
Assistant Vice Provost for Educational Effectiveness
University of California

Rosa Rivera-Hainaj
Assistant Vice President of Academic Affairs
Our Lady of the Lake University
San Antonio

Gabriela C. Weaver
Vice Provost for Faculty Development, and
Institute for Teaching Excellence and Faculty Development
University of Massachusetts

Yu Xie2
Bert G. Kerstetter ’66 University Professor of Sociology and the Princeton Institute for International and Regional Studies
Department of Sociology
Princeton University
Princeton, N.J.


Margaret Hilton
Study Director

1Member, National Academy of Engineering
2Member, National Academy of Sciences