Targeted News Service
December 15, 2017
Monitoring the quality and impact of undergraduate science, technology, engineering, and mathematics (STEM) education will require the collection of new national data on changing student demographics, instructors’ use of evidence-based teaching approaches, student transfer patterns, and other dimensions of STEM education, says a new report from the National Academies of Sciences, Engineering, and Medicine. The report identifies three overarching goals to improve various components of undergraduate STEM education.
A skilled STEM workforce is a critical driver for U.S. economic growth and international competitiveness, but recent trends have raised concerns for the health of these enterprises. Undergraduate STEM education not only prepares students who major in these fields to enter the STEM workforce, but also prepares all students—both majors and non-majors—with knowledge and skills they can apply across a range of jobs and in civic life. However, many students with an interest in and aptitude for STEM, especially females and underrepresented minorities, are not completing degrees in these fields, partly because of documented weaknesses in STEM teaching, learning, and student support.
A growing body of research is beginning to address these weaknesses, identifying new and more effective strategies to engage, motivate, and retain diverse students in STEM. Many federal, state, and local initiatives are now underway to apply these new approaches, but policymakers and the public do not know whether these initiatives are achieving nationwide improvement in undergraduate STEM education.
To address this concern, the National Science Foundation asked the National Academies to develop indicators that can be used to monitor the status and quality of undergraduate STEM education over time at the national level.
Currently, one of the most widely used methods for measuring the “value” of a college or university is to assemble and analyze data on employment outcomes such as earnings and the extent to which graduates find jobs related to their chosen field of study. However, research has demonstrated that both graduation rates and post-graduation earnings vary widely, depending on the type and selectivity of the institution and the characteristics of incoming students. In addition, graduates’ earnings and job placement are influenced by labor market demand, varying by time, place, and field in ways that are characteristic of a market economy. Furthermore, many STEM majors enter occupations that are not traditionally considered part of the STEM workforce, but their STEM knowledge may indeed contribute to their earnings. Thus, there is a large variation in the flow of students from STEM majors to STEM occupations. For all of these reasons, some economists and leaders in higher education agree that these methods alone are not suitable measures of institutional quality.
The report lays out a conceptual framework for a national indicator system with three overarching goals—increase students’ mastery of STEM concepts and skills by engaging them in evidence-based practices and programs; strive for equity, diversity, and inclusion of STEM students and instructors by providing opportunities for access and success; and ensure adequate numbers of STEM professionals by increasing completion of STEM credentials as needed in the different STEM disciplines. The committee determined that progress toward these three goals could be assessed using 21 indicators, such as the use of valid measures of teaching effectiveness and the diversity of STEM degree and certificate earners in comparison with the diversity of degree and certificate earners in all fields.
The committee found that nationally representative data are not currently available from public or proprietary sources for most of the proposed indicators. For example, the Integrated Postsecondary Education Data System (IPEDS), a major federal data source, focuses primarily on full-time students’ attainment of credentials at the institution at which they began their studies. This focus is not aligned with student trajectories in undergraduate STEM, which often involve part-time studies and transfer across institutions. This lack of data for the proposed indicators limits policymakers’ ability to track the progress toward the committee’s proposed goals.
In an effort to reduce the complexity of implementing the indicator system, the report also offers three options for obtaining the data required for all of the indicators: creating a national student unit record data system, expanding National Center for Education Statistics (NCES) data collections, and combining existing data from nonfederal sources. The first option would provide the most accurate, complete, and useful data to implement the proposed indicators of a student’s progress through STEM education; the second option would take advantage of a well-developed system that NCES uses to obtain IPEDS data annually from two- and four-year institutions; and the third option could be carried out by the federal government or another entity, such as a higher education association.
The committee noted that some of the indicators require research as the first step to develop clear definitions and identify measurement methods prior to beginning data collection. In addition, ongoing research may identify important factors related to the quality of undergraduate STEM education that would require new indicators beyond those proposed in the report. These and other developments in undergraduate education imply that in the coming years, it will be important to review and revise the committee’s proposed STEM indicators and the data and methods used for measuring them.
The study was sponsored by the National Science Foundation. The National Academies of Sciences, Engineering, and Medicine are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. The National Academies operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln. For more information, visit http://national-academies.org.
The National Academies of Sciences, Engineering, and Medicine, Division of Behavioral and Social Sciences and Education, Board on Science Education
Committee on Developing Indicators for Undergraduate STEM Education
Mark B. Rosenberg (chair), President, Florida International University, Miami.
Heather Belmont, Dean, School of Science, Miami-Dade College, Miami.
Charles F. Blaich, Director, Center of Inquiry and the Higher Education Data Sharing Consortium, Wabash College, Crawfordsville, IN.
Mark Connolly, Associate Research Scientist, Wisconsin Center for Education Research, University of Wisconsin, Madison.
Stephen W. Director (Member, National Academy of Engineering), Provost and University Distinguished Professor Emeritus, Northeastern University, Boston.
Kevin Eagan, Assistant Professor in Residence, Department of Education, and Managing Director, Higher Education Research Institute, University of California, Los Angeles.
Susan Elrod, Provost and Executive Vice Chancellor for Academic Affairs, University of Wisconsin, Whitewater.
Kaye Husbands Fealing, Chair, School of Public Policy, Georgia Institute of Technology, Atlanta.
Stuart I. Feldman, Head, Schmidt Sciences, Schmidt Philanthropies, Palo Alto, CA.
Charles Henderson, Professor of Physics, and Director, Mallinson Institute for Science Education, and Co-Director, Center for Research on Instructional Change in Postsecondary Education, Western Michigan University, Kalamazoo.
Lindsey Malcom-Piqueux, Associate Director for Research and Policy, Center for Urban Education, and Research Associate Professor, Rossier School of Education, University of Southern California, Los Angeles.
Marco Molinaro, Assistant Vice Provost for Educational Effectiveness, University of California, Davis.
Rosa Rivera-Hainaj, Assistant Vice President of Academic Affairs, Our Lady of the Lake University, San Antonio.
Gabriela C. Weaver, Vice Provost for Faculty Development, and Director, Institute for Teaching Excellence and Faculty Development, University of Massachusetts, Amherst.
Yu Xie (Member, National Academy of Sciences), Bert G. Kerstetter ’66 University Professor of Sociology and the Princeton Institute for International and Regional Studies, Department of Sociology, Princeton University, Princeton, NJ.
Margaret Hilton, Study Director.
Contact: Kacey Templin, media relations officer, Office of News and Public Information, 202/334-2138, firstname.lastname@example.org; Andrew Robinson, media relations assistant, Office of News and Public Information, 202/334-2138, email@example.com.
Copyright 2017 Targeted News Service LLC. All Rights Reserved.