NCAN members and other regular readers of this blog are routinely inundated with new research, white papers, policy briefs, and data points. It is easy to get buried under the relentless advance of academia and policy research wanting to convince you to pay attention to this, then this, then this, and then, finally, that.
But within any field some research stands out and withstands the test of time, and the college access and success space is no different. I started thinking about what some of the top pieces of research are for our field.
“The top 10 pieces of college access and success research” was decidedly clickbait-y from the beginning, but the list was also harder to compile than I anticipated. I asked on Twitter for suggestions and received a number of them. I scoured literature reviews, the Common Measures Handbook, Google Scholar, and the Impact Genome Project for inspiration (and confirmation). Ultimately, are these the top 10 pieces of college access and success research? No; for one thing, there are more than 10 pieces of research listed here. But the list has 10 entries, and that counts for something.
These are some of the seminal citations I return to time and again and think you should too. (Please enjoy my slight homage to "Seinfeld" with each listing’s title.)
For nearly 30 years, the Consortium on Chicago School Research has been producing valuable research on student outcomes from early childhood through college and career. High school graduation (or some kind of equivalency) is a good leading indicator of, and often a prerequisite for, matriculating to postsecondary education, and the “on-track” indicator is a good predictor of high school graduation. The research considers a student “on-track” for high school graduation at the end of freshman year if they have accumulated five course credits (the number needed to be promoted to 10th grade) and have no more than one semester F in a core course. “On-track” is an important early warning indicator that can help schools identify and intervene with students who may be at risk of not graduating.
The “Potholes” report is both an important companion to the “On-Track” research and a critical standalone piece that helps to fill in where students stumble on the way to postsecondary education. "Potholes" makes clear that going to college doesn’t happen by accident; schools and systems have to be purposeful about helping students get there. Some of the report’s key findings include:
“[Chicago Public Schools] students who aspire to complete a four-year degree do not effectively participate in the college application process.”
“Attending a high school with a strong college-going culture shapes students’ participation in the college application process.”
“Only about one-third of CPS students who aspire to complete a four-year degree enroll in a college that matches their qualifications."
These findings highlight challenges the field continues working to change.
Cliff Adelman was a researcher at the U.S. Department of Education and eventually a senior associate at the Institute for Higher Education Policy. His contributions to higher education research were numerous, but two that stand out are his 1999 and 2006 reports examining students’ steps toward, and setbacks from, obtaining a college degree. Both tomes draw on nationally representative longitudinal data from the National Center for Education Statistics and rely on both descriptive statistics and logistic regression to examine the impact of a huge swath of variables on academic performance and completion.
Because of their massive size, the reports defy any reasonable summary, but 1999’s “Answers in the Tool Box” report identifies “the intensity and quality of secondary school curriculum” and “continuous enrollment once a true start has been made in higher education” as the variables most explanatory of degree completion. The 2006 follow-up is, among other things, known for its conclusion that “formal transfer from a community college to a four-year college and formal transfer from one four-year college to another were positively associated with degree completion, but wandering from one school to another was not.” This wandering, which Adelman calls “swirling,” had “a significant and negative relationship to degree completion.”
Unfortunately, Adelman passed away in 2018, but his legacy lives on through his research.
The Advisory Committee on Student Financial Assistance was an independent panel that, for nearly 30 years, advised both the U.S. Department of Education and Congress on issues related to financial aid. Funding for the group lapsed in 2015, but the group was well-known for the counsel it provided in helping to shape federal financial aid programs; for example, the committee’s advice helped to create, and then reform, the FAFSA.
Although the committee published prolifically, I include two reports here, from 2002 and 2005, because they are early, persuasive, and comprehensive artifacts that lay out a number of the policy battles in which NCAN has engaged over the years (e.g., simplifying student aid, reforming FAFSA verification). They also propose reforms that eventually came to fruition (e.g., FAFSA on the web, early FAFSA). The “Student Aid Gauntlet” in particular highlights four national imperatives: empower students, make it easy, lose the paper, and work together. Surely this sounds familiar.
Readers who want a better understanding of both the history of college access and the issues that have persisted for decades will find much to value in the work of the now-defunct but still-appreciated advisory committee.
It’s a fair critique that including collections of research in a blog post that is itself a collection of research is a little cheeky. Despite that, the three articles above, written by some of the titans of postsecondary education research, gather the knowledge available at the time to make policy recommendations for improving college access and success. Keep in mind that we are a decade removed from Long’s 2008 effort and Deming and Dynarski’s 2009 addition to the field, but these deserve a spot on this list for the landscape they paint of core topics.
Drs. Ben Castleman and Lindsay Page are household names in college access for their work on summer melt and, specifically, how to combat it through text message nudging. “A Trickle or a Torrent” starts a long chain of research from the duo (with other authors) that examines the frequency with which college-intending high school graduates fail to matriculate to a postsecondary institution the following fall. This study pairs data from a nationally representative survey with data from NCAN member uAspire to investigate the question and finds summer melt rates of between 8% and 40%. Text message “nudging” has since become a common strategy among college access and success programs and, with increasing frequency, high schools to ensure students don’t melt during the summer.
“The Hoxby and Turner Study” has been a seminal piece of college access literature since it advanced the concept of “undermatching.” Undermatching occurs when high-achieving, low-income students matriculate to less-selective institutions than those to which their academic achievement would likely gain them admission; the concept has become a major topic of discussion in the field.
Hoxby and Turner’s research found that a low-cost intervention (~$6 per student) “causes high-achieving, low-income students to apply and be admitted to more colleges, especially those with high graduation rates and generous instructional resources.” The students who matriculated to these institutions did as well as students in a control group who did not receive the intervention. The study set off a trend of postsecondary institutions recruiting these students more purposefully than they had before.
The research has implications beyond high-achieving students; indeed, the idea of considering and improving “match” for students at every level of academic achievement has been gaining steam in the field as marginal improvements to institutional selectivity are likely to have positive impacts on completion rates and other postgraduate measures.
Over the past decade, NCAN members have become increasingly focused on documenting and measuring their programs’ impact and communicating it to stakeholders. The two studies and one book chapter above look at the impacts of three NCAN member programs: the College Advising Corps, College Possible, and Bottom Line. (Admittedly, there are a number of member program evaluations in progress and others may be published, but these are the three most prominent I am aware of. Feel free to reach out if you think your program’s evaluation should be represented here!)
External evaluation of program processes and outcomes is a valuable (but also often expensive and time-consuming) endeavor. I anticipate over the next decade we will see even more evaluations of college access and success programs. Kudos to these three for paving the way.
Let’s get this out of the way: this is a pretty competitive category. Financial aid is a much-studied topic in higher education. That said, if I were asked to provide evidence that financial aid impacts students’ postsecondary outcomes, these are the three studies I would hand over.
Goldrick-Rab et al. (2012) find in an experimental study that privately funded, need-based $3,500 scholarships, randomly assigned to students at 13 public universities in Wisconsin, raised four-year completion rates by 5 percentage points.
Denning (2018), using a quasi-experimental design, finds that eligibility to be declared financially independent (which increased combined grants and loans by about $1,400) increased the probability of graduating a year earlier by 1.8 percentage points. The additional financial aid helped some sophomores and juniors to persist, induced students to take on more credits without a corresponding decline in GPA, and may have caused students to work fewer hours.
Denning, Marx, and Turner find in a quasi-experimental design in Texas that first-time students at four-year public institutions who were just eligible to receive a maximum Pell Grant were 1.5 percentage points (p.p.) more likely to receive a bachelor’s degree within four years of matriculating and 3.3 p.p. more likely to graduate within five and six years of matriculating. Additionally, each additional $1,000 of grant aid received at college entry was associated with a greater than $1,000 increase to earnings starting four years after enrollment.
Given the significant public and policy investment in financial aid, a robust evidence base supporting the return on investment for students and society is important.
About one-third of undergraduate students attend public two-year colleges, which makes these institutions’ level of service and outcomes for students critical to the nation’s postsecondary landscape. Although these institutions’ completion rates tend to be much lower, on average, than those of their four-year counterparts, CUNY’s Accelerated Study in Associate Programs (ASAP) is a model for how to reverse that trend, showing that with the proper supports, students can be successful. Components of the ASAP program include:
“required full-time study (at least 12 credits per semester)
a consolidated schedule, in which students took their courses in morning, afternoon, evening or weekend blocks to free up time for family, work, and other responsibilities
cohorts organized by major whereby students took classes with fellow ASAP students
full-time ASAP staff devoted to comprehensive and personalized advisement and career development services
twice-monthly required advising sessions for all students; and
special programs for ASAP students, including tutoring, weekly seminars, employment services, leadership opportunities, and transfer advising.”
Results from the study show that, compared to a control group, ASAP-participating students were 12% more likely to be retained after the first year, gained credits more quickly, were nearly 2.5 times as likely to have earned an associate degree after two years (30% vs. 12%), and were more than twice as likely to have earned one after three years (55% vs. 26%). This study is the first analysis of the ASAP program (others have subsequently found significant positive effects on outcomes). Additionally, an ASAP-style program was introduced in Ohio, and that implementation also doubled graduation rates. Overall, ASAP is a promising, evidence-based roadmap for improving student outcomes at community colleges.
These two studies are a good reminder that not everything in the field has to be a top-to-bottom reform to make a difference for students. Sometimes, small process tweaks or marginal increases to the benefits stakeholders provide can make a big difference.
Such is the case with Smith (2013), which found that increasing the number of four-year colleges to which students apply from one to two increased their probability of enrollment by 40%, and moving from two to three applications increased it by an additional 10%. Similarly, Pallais (2015) found that when ACT increased the number of free score reports it allowed students to send and kept a $6 fee for each additional report, students sent more score reports and sent them to a wider range of colleges in terms of selectivity. Additionally, low-income ACT takers “attended more selective colleges,” and “back-of-the-envelope calculations suggest that the policy substantially increased low-income students’ expected earnings.” As Pallais’ abstract notes: “Small policy perturbations can have large effects on welfare.” That’s a good thing for all of us to keep in mind.
Fully anticipating that there is research readers feel should have been included here, I welcome your feedback on this article at email@example.com. A future blog post may expand on this list.