Office of Planning and Budgeting

The Wisconsin Education Approval Board, which oversees all for-profit colleges located in the state and any online-learning programs offered to its residents, may require that those institutions meet specific performance standards in order to operate within Wisconsin. Specifically, the board is proposing to require that at least 60 percent of a college’s students complete their studies within a certain time frame and that at least 60 percent of its graduates have jobs. Public universities and private nonprofit colleges are not under the board’s jurisdiction and would therefore be exempt from the requirements.

The board already collects and publishes data on its institutions. According to those reports, average completion rates fell from 82 to 59 percent over the last six years, and the percentage of graduates who were employed during a given year dropped from 44 to 22 percent over the same period.

The Chronicle reports that the board is basing its standards on what it believes “Wisconsin consumers would find ethical, responsible, and acceptable for institutions choosing to enroll them.” However, for-profit colleges have already submitted letters to the board arguing that the proposed standards are “arbitrary and should not be broadly applied to a diverse set of programs, which often enroll underserved populations.”

While the federal government’s “gainful employment” rule is similar to Wisconsin’s proposal, it is unusual to see a state attempt this type of regulatory system. Some states have increased their requirements for online and for-profit institutions, but Wisconsin’s proposal is especially aggressive. For-profits that wish to operate in Washington, for example, must receive authorization from the Washington Student Achievement Council, which considers institutions’ “financial stability, business practices, academic programs, and faculty qualifications” but does not yet hold them to specific graduation or employment standards.

On Wednesday, Wisconsin’s board voted unanimously to postpone a final decision until a team made up of board members, representatives from colleges and universities, and state legislators can review the proposal more thoroughly. The team is scheduled to make recommendations to the board in June 2013.

Massive open online course (MOOC) providers Coursera and Udacity are now offering to sell information about high-performing students to employers with open positions.

Coursera, which is known for working with high-profile colleges, announced its version of the headhunter service on Tuesday. Some similarly high-profile tech companies, such as Facebook and Twitter, have already enrolled in the service. Once an employer signs up, Coursera provides it with a list of students who match the company’s academic, geographic, or skill-based requirements. If an employer sees a profile it likes, Coursera emails the student asking whether he or she would like to be introduced to the company. Each introduction costs the employer a flat fee, most of which goes to Coursera; however, 6 to 15 percent of the revenue goes back to the college(s) that offered the MOOC(s) the student attended.

The Chronicle reports that Coursera’s co-founder, Andrew Ng, feels “this is a relatively uncontroversial business model that most of our university partners are excited about.” Regardless, Coursera is giving each of its partner colleges a chance to opt out of the new headhunter service. If a college declines, students enrolled in its MOOCs will be unable to participate in the job-matchmaking program. If a college accepts, its students will still have the option to personally opt out of the service, either altogether or only for courses they are unable to complete.

Udacity, which partners with individual professors rather than entire institutions to offer MOOCs, has a similar headhunter program. Approximately 350 partner companies, including Google, Amazon, and Facebook, have already signed up.

Neither Coursera nor Udacity was willing to disclose the price of its respective service, but both MOOC providers said they give employers more than just student grades. They also collect and share data on students who frequently participate in discussion forums and help answer questions from their classmates. Sebastian Thrun, Udacity’s co-founder, said employers often find those “softer skills” more valuable than sheer academic performance, as they can be better predictors of placement success.

As a means of both acknowledging and analyzing the recession’s impact on students, this year’s National Survey of Student Engagement (NSSE) included a new set of questions asking how students’ finances affect their stress and academic activities. Approximately 15,000 first-year and senior students from “a diverse group of 43 institutions” responded to the new addendum.  The results, which were released last week, indicate that “finances were a significant concern for the majority of students.” 

As seen in Table 5 from the official report:

  • The majority of students frequently worried about paying for college and regular expenses.
  • Roughly 1 in 3 students said financial concerns interfered with their academic performance.
  • About 30 percent said they frequently chose not to buy required academic materials due to cost.
  • More students looked into working more hours than into borrowing more money as a way to cover costs.
  • Approximately 3 in 4 students still agreed that college is a good investment.

In addition to these findings, the study found that over 55 percent of full-time seniors said their choice of major was influenced by factors such as the ability to find a job and/or the prospect of career advancement. Yet 89 percent of students overall said the most influential factor in choosing a major was still how well it fit with their talents and academic interests.

A landmark study from 1990 classified 212 US institutions as liberal arts colleges, but new research shows a 39 percent decline in that number—only 130 institutions currently meet the original study’s classification criteria. Of the 82 institutions no longer classified as liberal arts colleges, a handful were subsumed by larger institutions, while about half had shifted their mission away from the standard liberal arts definition.

Historically, definitions of liberal arts colleges (including Carnegie Classifications) have highlighted their focus on undergraduate studies; selective admissions; small class sizes; emphasis on nurturing diverse perspectives and personal growth; and de-emphasis on cultivating professional skills. According to the more recent study, however, many liberal arts institutions are now offering more “professional” programs and incorporating more research into their curricula. The authors speculate that liberal arts colleges may be making this shift away from their standard definition in response to economic pressures. For example, schools may be attempting to:

  • Offset dwindling revenue streams by attracting new segments of the market;
  • Remain competitive in a market flooded by online and for-profit institutions; or
  • Accommodate students’ increased focus on vocational preparation.

To expand on that last point, recent federal and state-level preoccupation with graduates’ potential earnings has put liberal arts colleges in a difficult position, as degrees in traditional liberal arts fields (e.g., social sciences and humanities) may be less lucrative for graduates than other degrees (e.g., professional or STEM degrees). For example, in the State of Virginia, the average first-year earnings of a graduate with a four-year degree are about $30,000 if the degree was in sociology but about $46,000 if the degree was in civil engineering.

A widely recognized strength of the US higher education system is its diversity. However, if liberal arts colleges shift their missions to include the research and career-preparatory goals of other schools, the system may become more homogeneous, leaving students with fewer educational options.

For the first time in 15 years, fewer students are enrolling in higher education overall. Enrollments at public four-year and private nonprofit institutions actually increased, but falling for-profit and two-year enrollments pulled down the total. According to preliminary data released this week by the U.S. Department of Education’s National Center for Education Statistics, colleges and universities eligible for federal financial aid experienced a 0.2 percent decrease in total enrollments between Fall 2010 (21,588,124 students) and Fall 2011 (21,554,004 students). Although slight, this drop could indicate that fewer individuals are using some sectors of higher education as a refuge from the recent recession and/or that rising tuition rates are driving students out of some markets. Regardless, the new trend could be problematic for those advocating for higher educational attainment as well as for universities hoping to derive more revenue from increased enrollments.
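The reported 0.2 percent decrease can be checked directly from the two headcounts (a quick back-of-the-envelope calculation):

```python
# NCES-reported Title IV headcounts
fall_2010 = 21_588_124
fall_2011 = 21_554_004

# Relative change between the two falls
change = (fall_2011 - fall_2010) / fall_2010
print(f"{change:+.2%}")  # about -0.16%, which rounds to the reported 0.2 percent decrease
```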

Some specific findings include:

  • For-profit institutions were hit the hardest. Enrollments dropped by 1.9 percent at four-year for-profits and by a whopping 7 percent at two-year for-profits. This is likely due to tougher federal regulations on for-profits, as well as to some for-profit institutions adopting more selective admissions practices.
  • Two-year institutions struggled, while four-year schools continued to thrive. Two-year enrollments (across all sectors) are down by an average of 2.4 percent, while four-year enrollments are up by an average of 1.2 percent. Much of the two-year decline was driven by the aforementioned drop in two-year for-profit enrollments; however, California’s recent limit on community college enrollments also helps explain the decrease.
  • Part-time enrollments grew, but full-time enrollments shrank. About 0.8 percent more students enrolled part time (across all sectors), whereas 0.8 percent fewer students enrolled full time. Two possible explanations are that the job market has recovered enough to keep more students employed, or that more students now need income to support themselves during school.

UW enrollments reflect those of four-year public institutions across the country. Total enrollments at four-year public institutions increased by an average of 1.5 percent from Fall 2010 to Fall 2011. UW’s total enrollments (undergraduate and graduate students combined) increased by 1.6 percent from Fall 2010 (49,940 students) to Fall 2011 (50,745 students) and by another 1.6 percent from Fall 2011 to Fall 2012 (51,576 students). For more UW enrollment statistics, see the OPB Factbook.

Last Friday, the U.S. Department of Education released its annual update on federal student loan cohort default rates (CDRs) and, although national CDRs are discouragingly high, UW’s rates are impressively low. Because the Department is in the process of switching to a more accurate three-year CDR measure, this year’s report includes both the FY 2010 two-year and the FY 2009 three-year CDRs. These rates represent the percentage of student borrowers who failed to make loan payments for at least 270 days within two or three years, respectively, of leaving school.
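As a toy illustration of how the two measurement windows differ (the function and the numbers here are hypothetical; the Department’s actual cohort rules are more involved):

```python
def cohort_default_rate(cohort_size, years_to_default, window_years):
    """Share of a borrower cohort that defaulted within the window.

    cohort_size: number of borrowers in the cohort.
    years_to_default: for each defaulting borrower, years after leaving
        school at which they hit 270 days of missed payments.
    """
    defaulted = sum(1 for t in years_to_default if t <= window_years)
    return defaulted / cohort_size

# A cohort of 1,000 borrowers with four defaults at various points:
defaults = [0.5, 1.5, 2.5, 4.0]
two_year = cohort_default_rate(1000, defaults, 2)    # 0.002, i.e. 0.2%
three_year = cohort_default_rate(1000, defaults, 3)  # 0.003, i.e. 0.3%
```

The longer window always captures at least as many defaults, which is why three-year rates run higher than two-year rates across every sector.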

The Department provides breakdowns of its data by institution type, state and school. Here are some key findings:


  • The FY 2010 two-year CDR increased from 8.8 to 9.1 percent overall. The rate at public institutions rose from 7.2 to 8.3 percent and at private nonprofits from 4.6 to 5.2 percent, while the rate at for-profits fell from 15.0 to 12.9 percent (though it remains the highest).
  • The FY 2009 three-year CDR is 13.4 percent overall (this is the Department’s first year reporting three-year data) with public institutions at 11 percent, private nonprofits at 7.5 percent, and for-profits at 22.7 percent.


  • UW’s three-year CDR is a remarkable 3.1 percent—more than 10 percentage points below the national average.
  • UW’s two-year CDR increased slightly from 1.4 to 2.1 percent, but is still well below the national average.
  • The State of Washington’s three-year CDR is 11.3 percent, below the national average but still higher than the rates in roughly half of the states.

Unfortunately, the Department does not release loan default rates disaggregated by student demographics (even though it collects this information), which prevents schools from identifying and tailoring assistance to the students most in need. While third parties have conducted studies indicating that Pell Grant recipients and Latino students are more likely to default on loans, schools and legislators need better data from the federal government in order to fully identify at-risk groups and mitigate rising default rates.

As a recent post discussed, if you attend college, you are likely to earn more money. But, as you might imagine, the financial value of higher education depends on what program you choose and where.

Information on the annual earnings of students from different programs and institutions is exactly what Sen. Ron Wyden, a Democrat of Oregon, and Sen. Marco Rubio, a Republican of Florida, hope to provide. Their recently-introduced “Student Right to Know Before You Go Act” proposes creating a state-based, individual-level data system linking the average costs and graduation rates of specific programs and institutions to their graduates’ accrued debt and annual earnings.

Senator Wyden acknowledged that such information, although useful, is limited, and that focusing on financial indicators alone could undermine the importance of the liberal arts, whose graduates may not earn large salaries right after college. He stated that the bill’s intention is “to empower people to make choices.” However, “people” includes not just students but also policy makers, such as Florida’s Governor Rick Scott, who sparked controversy last October when he asserted that state money should go to job-oriented fields rather than fields like anthropology which, he said, do not serve the state’s vital interest.

Regardless of the bill’s success, about half of the states already have the ability to link postsecondary academic records with labor data. And some, such as Tennessee, have already done so. Here in Washington, the Education Research and Data Center is in the process of connecting certain employment and enrollment data for schools, such as the UW, to analyze in the coming months.

All this raises the question: Is college chiefly for personal economic gain?

A recent report by the College Board highlights both the financial and nonfinancial payoffs of college. Additionally, David A. Reidy, head of the philosophy department at the University of Tennessee at Knoxville, stated in a recent Chronicle article that four-year degrees, particularly in the liberal arts, are not solely for job training. “The success of the American democratic experiment depends significantly on a broadly educated citizenry, capable of critical thinking, cultural understanding, moral analysis and argument,” he wrote. Philosophy and other core disciplines help nurture such a citizenry, he continued, “And the value there is incalculable.”

Today, the National Center for Education Statistics (NCES) published a report summarizing enrollment, price of attendance, and completions data submitted by all Title IV institutions to the Integrated Postsecondary Education Data System (IPEDS) in fall 2011.

Here are some of the findings:

  • Between 2009-10 and 2011-12, the average undergraduate tuition and required fees at 4-year public institutions nationwide (after adjusting for inflation) increased more for in-state students (9 percent increase) than for out-of-state students (6 percent increase). This is consistent with the UW’s experience, where the tuition increase (after adjusting for inflation) was 31.1% ($2,509.05) for residents and 9.8% ($2,509.55) for non-residents over that period.
  • In 2010-11, of the 25,645,985 undergraduate students enrolled in Title IV institutions in the nation, 50.9% attended 4-year institutions – of these, 59.7% attended public institutions. The public share of the 3,876,611 graduate students enrolled in Title IV institutions was 47.6%.
  • Women constitute 57.0% of undergraduate and 60.2% of graduate students in the nation. They also account for 58% of the degrees granted by all 4-year institutions.

The US Department of Education released its second annual ranking of universities by cost. Users can rank institutions by tuition rate (sticker price) or by a net cost of attendance measure. Institutions are also ranked by annual percentage increases in these measures. The Department presents these data as a tool to help students and families find good educational and financial fits when selecting an institution; it also aims to publicly identify, and shame, the institutions that increase tuition the most.

While any attempt to centralize and simplify higher education data to facilitate easier consumer evaluation and comparison is an important effort, there are many potential unintended consequences relating both to the measures used and to the aggregation of this type of information across such a large, varied set of institutions. Economists Robert Archibald and David Feldman address some of these problems in an Inside Higher Ed piece published today.

A few weeks ago, the National Research Council’s Panel on Measuring Higher Education Productivity published its 192-page report on Improving Measurement of Productivity in Higher Education, marking the culmination of a three-year, $900,000 effort funded by the Lumina Foundation and involving 15 higher education policy experts nationwide.

In explaining the need for a new productivity measure, the Panel made several key observations:

  • It’s all about incentives: Institutional behavior is dynamic and directly related to the incentives embedded within measurement systems. As such, policymakers must ensure that the incentives in the measurement system genuinely support the behaviors that society wants from higher education institutions and are structured so that measured performance is the result of authentic success rather than manipulative behaviors.
  • Costs and productivity are two different issues: Focusing on reducing the cost of credit hours or credentials invites the obvious solutions: substitute cheap teachers for expensive ones, increase class sizes, and eliminate departments that serve small numbers of students unless they somehow offset their costs. In contrast, focusing on productivity assesses whether changes in strategy are producing more quality-adjusted output (credit hours or credentials) per quality-adjusted unit of input (faculty, equipment, laboratory space, etc.).
  • Using accountability measures without context is akin to reading a legend without looking at the map: Different types of institutions have different objectives, so the productivity of a research university cannot be compared to that of a liberal arts or community college, not least because they serve very different student populations who have different abilities, goals, and aspirations. The panel notes that, among the most important contextual variables that must be controlled for when comparing productivity measures are institutional selectivity, program mix, size, and student demographics.

The Panel also provides thorough documentation of the difficulties involved in defining productivity in higher education. From time to time, it is helpful to remind ourselves that, while it may be “possible to count and assign value to goods such as cars and carrots because they are tangible and sold in markets, it is harder to tabulate abstractions like knowledge and health because they are neither tangible nor sold in markets.” The diversity of outputs institutions produce, the myriad inputs used in their activities, quality change over time, and quality variation across institutions and systems all contribute to the complexity of the task.

Despite these difficulties, the Panel concluded that the higher education policy arena would be better served if it used a measure of productivity whose limitations were clearly documented than if it used no measure of productivity at all. It proposed a basic productivity metric measuring the instructional activities of a college or university: a simple ratio of outputs over inputs for a given period. Its preferred measure of output was the sum of credit hours produced, adjusted to reflect the added value that credit hours gain when they form a completed degree. Its measure of input was a combination of labor (faculty, staff) and non-labor (buildings and grounds, materials, and supplies) factors of production used for instruction, adjusted to allow for comparability. The Panel was careful to link all components of its formula to readily available data published in the Integrated Postsecondary Education Data System (IPEDS) so that its suggested measure may easily be calculated and used. It also specified how improvements to the IPEDS data structure might help produce more complete productivity measures.
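A minimal sketch of that kind of ratio is below; the degree bonus and cost figures are illustrative assumptions, not the Panel’s actual weights, and real inputs would be drawn from IPEDS finance and staffing data.

```python
def productivity(credit_hours, degrees, labor_fte, nonlabor_cost,
                 degree_bonus=90, cost_per_fte=100_000):
    """Output/input ratio of the kind the Panel describes.

    Output: credit hours plus a hypothetical 'sheepskin' bonus of
    degree_bonus credit-hour equivalents per completed degree,
    reflecting the added value of credit hours that form a degree.
    Input: instructional labor (FTE at an assumed average cost) plus
    non-labor instructional spending, expressed in dollars.
    """
    output = credit_hours + degree_bonus * degrees
    inputs = labor_fte * cost_per_fte + nonlabor_cost
    return output / inputs

# A hypothetical institution producing 300,000 credit hours and 2,000
# degrees with 500 instructional FTE and $10M of non-labor cost:
ratio = productivity(300_000, 2_000, 500, 10_000_000)
# 480,000 adjusted credit hours / $60M of inputs = 0.008 per dollar
```

The degree bonus is what makes the measure reward completions rather than raw credit-hour volume: two institutions with identical credit hours and inputs will score differently if one graduates more of its students.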

The key limitation in the Panel’s proposal – fully acknowledged in the report – is that it does not account for the quality of inputs or outputs. As the Panel notes, when attention is overwhelmingly focused on quantitative metrics, there is a high risk that a numeric goal will be pursued at the expense of quality. There is also a risk that quantitative metrics will be compared across institutions without paying heed to differences in the quality of input or output. The report summarizes some of the work that has been done to help track quality, but concludes that the state of research is not advanced enough to allow any quality weighting factors to be included in its productivity formula.

While readers may lament the Panel’s relegation of measures of quality to further research, especially given the time and resources invested in its effort, the report remains a very useful tool in understanding the issues involved in assessing productivity in higher education and provides valuable food for thought for policymakers and administrators alike.
