September 24, 2010, 10:59 AM ET
A Modest Proposal: Searching for an Academic Bottom Line
Does much learning occur at the University of Michigan, Colorado College, or the University of Texas at San Antonio? Do students at Duke University fare better in the job market than their counterparts at Northwestern or Cornell? There are many important questions like these about higher education for which we have no answers, and colleges have generally resisted providing that information in a uniform manner that would allow consumers, funders, and taxpayers to compare performance across colleges and universities.
I have a modest proposal of three ways we could obtain immensely important information that would make for more informed customers and donors, stimulate healthy competition among schools, and promote greater concern for undergraduate education at the schools themselves, particularly the national research universities. Moreover, these proposals are not inordinately expensive, nor would proceeding with them impede institutional autonomy in any major way.
Require all schools receiving federal funds to require newly entering students to take one of the following tests: the ACT, the SAT, the Collegiate Learning Assessment (CLA), or the National Assessment of Educational Progress test administered at age 17 in English and mathematics. Require those schools to administer the same test to those students shortly before they receive the bachelor's degree. Provide the test results to the federal government or a private not-for-profit agency so that changes in test scores can be disseminated to the public. Did student performance on these academic measures improve during the college years? Finally, require graduating students to take the Graduate Record Examination in their major subject, if such an exam exists, and to report the test scores to the federal government or the private agency. Average "value added" on the tests of basic skills/general education (as well as the raw data) would be reported to the public, along with student performance on the subject exams.
Require every college or university to provide the Internal Revenue Service each year with the Social Security numbers of students receiving bachelor's degrees in the preceding 12 months. Require the IRS to publish, by school of graduation, average and median employment earnings for graduates 1 year, 5 years, and 10 years after graduation, based on federal income tax returns. Since employability in well-paying jobs is a major goal of many college students, this would be very useful information.
When I served on the Spellings Commission on the Future of Higher Education, Boeing executive Rick Stephens told us about a fascinating thing his company did: categorize employees by college attended and assess those schools based on the job performance of their workers. Rick knew that some schools rather consistently provided workers who met or exceeded expectations, while others did not. Why not expand the survey to hundreds of employers and make the results available to the public (average performance, of course, not that of individual workers)? Why not require companies receiving more than $5 million in federal contracts annually and employing 1,000 or more workers to fill out a very simple standardized form assessing the job performance of all employees, classified by colleges attended? And why not have the Department of Labor, the Department of Education, the Census Bureau, or some other agency provide summary data from this information for any school with at least 25 employees in the sample? This would be a good way of gathering solid data on which schools seem to turn out young adults who are useful and productive.
There are all sorts of objections to these ideas, and some of them are quite legitimate. Obviously some laws would have to change for these proposals to be fully operative. Statistical methods exist to reconcile results from schools using, say, the ACT with those from schools using the CLA. Schools that argue, for example, that the SAT is not really a good instrument for measuring learning or understanding in core disciplines can opt for another test, say the CLA, which purportedly measures critical thinking. The gains from disseminating this information are potentially so great that we should try to minimize the problems and move toward doing at least one, if not all three, of the ideas above. This would allow us to rely less on published magazine rankings (full disclosure: I direct the computation of the rankings used by Forbes), force schools to emphasize important outcomes rather than inputs (thereby reducing the academic arms race), and perhaps increase price competition.