Recent college graduates may not realize that their faltering careers could stem from being “hamstrung by their lack of learning” in school. But deciding how to assess what they learned in college is not straightforward.
A follow-up study from the authors of “Academically Adrift,” a book that showed how “many students experience ‘limited or no learning’” in college, tracked the same students into their lives after graduation. As part of the original study, students had taken the Collegiate Learning Assessment (C.L.A.), “a test of critical thinking, analytic reasoning and communications skills.”
Even after statistically controlling for students’ sociodemographic characteristics, college majors and college selectivity, those who finished school with high C.L.A. scores were significantly less likely to be unemployed than those who had low C.L.A. scores. The difference was even larger when it came to success in the workplace. Low-C.L.A. graduates were twice as likely as high-C.L.A. graduates to lose their jobs between 2010 and 2011, suggesting that employers can tell who got a good college education and who didn’t. Low-C.L.A. graduates were also 50 percent more likely to end up in an unskilled occupation, and were less likely to be satisfied with their jobs.

The original study had found that students’ gains during college were modest: on average, they improved less than half of one standard deviation. For many, the results were much worse. One-third improved by less than a single point on a 100-point scale during four years of college.
The C.L.A. has gained the support of employers who “say grades can be misleading and that they have grown skeptical of college credentials.”
Even as students spend more on tuition—and take on increasing debt to pay for it—they are earning diplomas whose value is harder to calculate. Studies show that grade-point averages, or GPAs, have been rising steadily for decades, but employers feel many new graduates aren’t prepared for the workforce.
Over a hundred colleges participate in CLA+, a test-based program that enables graduates to prove their skills to potential employers. Some schools, like California Polytechnic State University, promote the test for its benefits to individual students, while others treat the CLA+ as an assessment that demonstrates the overall return on value they provide.
Two years into the job, Purdue President Mitch Daniels has arrived at a major impasse with the university’s faculty: how to prove that students are actually learning something while at the university. Backed by Purdue’s Board of Trustees and inspired by the work of Richard Arum and Josipa Roksa (the authors of Academically Adrift: Limited Learning on College Campuses) and others who argue that undergraduates aren’t learning crucial critical thinking skills, Daniels says the university must be accountable to students, parents, taxpayers and policy makers. He’s tasked a faculty body with choosing just how Purdue will assess gains in critical thinking and other skills after four years there, and he wants to start the assessment process soon — by the fall.
Purdue wants the student growth assessment “for the same reason that hundreds of other universities are already doing this — that research has shown that in some cases little to no intellectual growth occurs during the college years,” … “And the marketplace is saying emphatically that they find far too many college graduates lacking in critical thinking and communication skills and problem solving, etcetera.”
The CLA+ is not free of controversy.
… A 2013 study, for example, found that student performance on such tests varies widely based on motivation for taking the test. In other words, a student who has no reason to do well on the test might not take it seriously, and therefore can skew the results negatively for the institution. Others have questioned the appropriateness of basing assessment on small groups of students and whether the gains are likely to be notable at a university like Purdue that admits well-prepared students.
The most popular comment on the Purdue article made a good, if sarcastic, point:
Yes. It is time that universities and colleges follow the NCLB model on testing because it has worked so well….